Career Area:
Technology, Digital and Data
Job Description:
Your Work Shapes the World at Caterpillar Inc.
When you join Caterpillar, you're joining a global team who cares not just about the work we do – but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here – we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.
Basic Qualifications
- Bachelor's degree in computer science or a related field
- At least 6 years of experience in software engineering
- Strong skills in SQL (minimum 2 years of recent experience)
- At least 2 years of recent experience in Python development
- Experience with relational databases such as Snowflake, MySQL or PostgreSQL
- Experience with designing and developing software applications for cloud computing platforms such as AWS.
- Strong communication skills
- Experience with Python libraries and frameworks such as Pandas, NumPy, and Pyramid.
- A solid understanding of fundamental computer science principles, including data structures and algorithms, is necessary.
- Capable of thriving in high-pressure situations and delivering results within tight time constraints.
- Demonstrated passion for technology coupled with an eagerness to contribute to a collaborative team environment.
Roles & Responsibilities
- Responsible for developing and maintaining data engineering processes, Python applications, and services.
- Responsible for data analysis and validation during data and process migrations.
- Troubleshooting and debugging code.
- Deploying applications to production.
- Working with DevOps teams to ensure that applications are running smoothly.
Skill/Tool level detail:
Top candidates will have proven experience in the following:
- Designing, creating, deploying, and sustaining software solutions on a large scale.
- Implementing application architectural patterns, including but not limited to MVC, Microservices, and Event-driven architectures.
- Utilizing CI/CD tools such as Azure DevOps, Jenkins, GoCD, etc., for seamless software deployment.
- Deploying and managing software on public cloud platforms like AWS.
- Collaborating within an Agile framework, preferably following Scrum methodologies.
- Proficiency in managing and deploying applications using container orchestration tools, specifically ECS (Elastic Container Service). Good understanding of Docker fundamentals.
- Other AWS services such as Lambda, CloudFormation, CloudWatch, and S3.
- Good working knowledge of Python modules such as Pandas, NumPy, multithreading, and requests.
- Possess a solid understanding of version control systems, with a particular focus on Git.
- Proficiency with SQL.
Good to have
- Proficient in managing diverse data stores, including Snowflake, Elasticsearch, MySQL, and Oracle.
- Well-versed in developing Snowflake procedures, tasks, and other Snowflake components.
- Experienced in working with message brokers such as AWS SQS and AWS SNS.
- Proficient in utilizing batch or stream processing systems, including Apache Spark and AWS Glue.
- Familiarity with scheduling tools like Apache Airflow and Prefect.
- Skilled in developing and working with RESTful APIs.
- Hands-on experience with API tools like Swagger, Postman, and Assertible.
- Advocate of Test-Driven Development (TDD) and Behaviour-Driven Development (BDD).
- Extensive hands-on experience with testing tools like Selenium and Cucumber, with expertise in seamlessly integrating them into CI/CD pipelines.
Caterpillar is an Equal Opportunity Employer (EEO).