Career Area:
Technology, Digital and Data
Job Description:
Your Work Shapes the World at Caterpillar Inc.
When you join Caterpillar, you're joining a global team that cares not just about the work we do, but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here; we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.
Job Summary
We are looking for a highly motivated and experienced Data Engineer with 5+ years of industry experience to join our data engineering team. The ideal candidate will have a strong background in building scalable data pipelines using the AWS cloud stack and extensive hands-on experience with Snowflake. Proficiency in Python and SQL is essential. This role requires excellent problem-solving skills and a proactive mindset to deliver robust and efficient data solutions.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines on AWS using services such as S3, Glue, Lambda, Redshift, and EMR.
- Build and optimise data warehousing solutions using Snowflake, including performance tuning and data modelling.
- Write efficient and reusable code in Python and SQL for data transformation and processing.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements.
- Monitor, troubleshoot, and improve pipeline performance and reliability.
- Ensure data quality, integrity, and security across all stages of the pipeline.
- Participate in code reviews, architecture discussions, and continuous improvement initiatives.
Required Qualifications:
- 5+ years of experience in data engineering or related roles.
- Strong hands-on experience with AWS cloud services.
- Deep understanding of Snowflake architecture and best practices.
- Advanced proficiency in Python and SQL.
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Preferred Qualifications:
- Experience with orchestration tools such as Apache Airflow or AWS Step Functions.
- Familiarity with data governance and compliance practices.
- Exposure to real-time data processing frameworks (e.g., Kafka, Spark Streaming).
Posting Dates:
November 27, 2025 - December 3, 2025
Caterpillar is an Equal Opportunity Employer. Qualified applicants of any age are encouraged to apply.
Not ready to apply? Join our Talent Community.