Software Engineer, Python & PySpark, VP

Posted:
3/8/2026, 7:43:41 PM

Location(s):
Karnataka, India ⋅ Bengaluru, Karnataka, India ⋅ Gurgaon, Haryana, India ⋅ Haryana, India

Experience Level(s):
Expert or higher ⋅ Senior

Field(s):
AI & Machine Learning ⋅ Software Engineering

Join us as a Software Engineer

  • This is an opportunity for a technically minded individual to join us as a Software Engineer
  • You’ll be designing, producing, testing and implementing working software across the full lifecycle of the system
  • Hone your existing software engineering skills and advance your career in this critical role
  • We're offering this role at vice president level

What you'll do

Working in a permanent feature team, you’ll be developing knowledge of the associated platform across the disciplines of business, applications, data and infrastructure. You’ll also be liaising with principal engineers, domain architects and other key stakeholders to understand how the platform works and how it supports business objectives.

You’ll also be:

  • Applying Agile methods to the development of software on the backlog
  • Producing resilient and long-lived software and acting flexibly to cope with future needs
  • Delivering intentional architecture and formulating emergent design through innovative ideas, experimentation and prototyping
  • Designing and developing software with a focus on the automation of build, test and deployment activities, using executable patterns

The skills you'll need

We’re looking for someone with strong full stack experience in software design and implementation, including the ability to exploit programming languages to solve complex problems. You’ll also need to be capable of complex requirements analysis, capture and validation against business and systems requirements.

Additionally, you’ll demonstrate:

  • A minimum of 10 years of hands-on experience in Python, with deep expertise in building scalable and high‑performance applications
  • At least 6 years of experience with PySpark, including distributed data processing, optimisation, and integration within AI/ML workflows
  • Strong experience with AWS AI and cloud services, covering model training, deployment, orchestration, and monitoring in production environments
  • Extensive knowledge of AI/ML frameworks and libraries, such as TensorFlow, PyTorch, Scikit-learn, or similar
  • Solid understanding of machine learning principles, model lifecycle management (MLOps), and integration of models into production applications
  • Proven ability to architect scalable, secure, and maintainable cloud‑based applications, with a focus on best practices and performance optimisation

Hours

45

Job Posting Closing Date:

11/03/2026