Senior Data Engineer (8+ Years in Python, ETL, and Snowflake Development + 2 Years in ADF)

Posted:
1/18/2026, 12:44:34 PM

Location(s):
Bengaluru, Karnataka, India ⋅ Hyderabad, Telangana, India

Experience Level(s):
Senior

Field(s):
Data & Analytics

Workplace Type:
Hybrid

Senior Data Engineer, Assurant, GCC-India

Reports To: Director of Product Engineering & Integrations

Position Summary

We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize data pipelines and cloud-based data solutions. The ideal candidate will have strong expertise in Azure Data Factory, Snowflake, and modern ETL/ELT practices, enabling scalable, secure, and high-performance data workflows. You will collaborate closely with analytics and BI teams to deliver reliable data infrastructure that supports enterprise reporting and advanced analytics.

This position will be based at our India locations in Bangalore, Chennai, or Hyderabad.

Work Hours: 3:30 PM to 12:30 AM IST

What will be my duties and responsibilities in this job?

Data Engineering & ETL Development

  • Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory, Snowflake Tasks, and Snowpipe for real-time and batch ingestion (a minimal sketch follows this list).
  • Implement best practices for data modeling, transformation, and performance tuning within Snowflake.
  • Build and manage pipelines connecting multiple structured and unstructured data sources across cloud and on-prem environments.
  • Automate data quality checks, data lineage tracking, and error handling within ETL workflows.
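
For candidates gauging fit, here is a minimal sketch of this kind of scaffolding built with the Snowflake Python connector; the credentials and all object names (raw_stage, raw_events, curated_events, events_pipe, merge_events_task, etl_wh) are hypothetical placeholders, not actual Assurant systems.

```python
# Minimal sketch: continuous ingestion via Snowpipe plus a scheduled Task,
# created through the Snowflake Python connector. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",  # in practice, sourced from a secrets store
    user="<user>",
    password="<password>",
)
cur = conn.cursor()

# Snowpipe: auto-load files as they land in an external stage
# (assumes the stage has cloud event notifications configured).
cur.execute("""
    CREATE PIPE IF NOT EXISTS events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_events
    FROM @raw_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")

# Task: a scheduled batch transformation downstream of the raw load.
cur.execute("""
    CREATE TASK IF NOT EXISTS merge_events_task
    WAREHOUSE = etl_wh
    SCHEDULE = '15 MINUTE'
    AS
    INSERT INTO curated_events
    SELECT * FROM raw_events
    WHERE loaded_at > DATEADD('minute', -15, CURRENT_TIMESTAMP())
""")
cur.execute("ALTER TASK merge_events_task RESUME")  # tasks are created suspended

cur.close()
conn.close()
```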

Snowflake Development & Optimization

  • Develop and maintain Snowflake schemas, views, stored procedures, and materialized views.
  • Configure and optimize Snowpipe for continuous data loading.
  • Use Snowsight to monitor query performance, optimize costs, and analyze workloads.
  • Implement role-based access control (RBAC) and ensure data security in Snowflake (a sketch follows this list).
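
Again purely for illustration, a sketch pairing a read-only RBAC grant set with the query-history inspection that Snowsight surfaces; the role, database, and schema names are invented for the example.

```python
# Minimal sketch: read-only grants for an analyst role, then a look at the
# slowest recent queries via ACCOUNT_USAGE (the data behind Snowsight charts).
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>"
)
cur = conn.cursor()

# Role-based access control: least-privilege, read-only access to a mart.
for stmt in [
    "CREATE ROLE IF NOT EXISTS analyst_role",
    "GRANT USAGE ON DATABASE analytics TO ROLE analyst_role",
    "GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_role",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_role",
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.marts TO ROLE analyst_role",
]:
    cur.execute(stmt)

# Ten longest-running queries of the past day (elapsed time is in ms).
cur.execute("""
    SELECT query_id, warehouse_name, total_elapsed_time / 1000 AS seconds
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)

conn.close()
```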

Azure & Cloud Integration

  • Integrate Azure Data Factory with other Azure services (Blob Storage, Synapse, Key Vault).
  • Design scalable cloud architectures and orchestrate pipelines across hybrid environments.
  • Implement CI/CD pipelines for data workflows using GitHub Actions (see the sketch below).
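
As one illustration, here is a sketch of the kind of script a GitHub Actions job might run after deployment to trigger an ADF pipeline, using the azure-identity and azure-mgmt-datafactory SDKs; the subscription, resource group, factory, pipeline name, and parameter are all hypothetical.

```python
# Minimal sketch: trigger an Azure Data Factory pipeline run from CI,
# e.g. as a post-deployment smoke test in a GitHub Actions step.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),  # resolves OIDC/secret credentials in CI
    subscription_id="<subscription_id>",
)

run = client.pipelines.create_run(
    resource_group_name="<resource_group>",
    factory_name="<data_factory>",
    pipeline_name="ingest_daily",           # hypothetical pipeline
    parameters={"run_date": "2026-01-18"},  # hypothetical parameter
)
print(f"Started pipeline run: {run.run_id}")
```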

Analytics & Reporting Enablement

  • Collaborate with business analysts and BI teams to enable Power BI dashboards backed by optimized Snowflake data models.
  • Create semantic models and data marts to support self-service analytics and reporting.

Scripting & Automation

  • Develop Python scripts for automation, data processing, and custom integrations.
  • Leverage Python-based libraries and frameworks (Pandas, PySpark, Airflow) to enhance pipeline capabilities (see the sketch below).
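
As a final taste of the scripting side, a small Pandas-based data-quality gate of the sort a pipeline stage might run before publishing data; the file path and column names are invented for the example.

```python
# Minimal sketch: fail fast if a landed extract violates basic expectations.
import pandas as pd

df = pd.read_csv("landing/orders.csv")  # hypothetical extract

issues = []
if df["order_id"].isna().any():
    issues.append("null order_id values")
if df["order_id"].duplicated().any():
    issues.append("duplicate order_id values")
if (df["amount"] < 0).any():
    issues.append("negative order amounts")

if issues:
    raise ValueError("Data quality check failed: " + "; ".join(issues))
print(f"{len(df)} rows passed all quality checks")
```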

What are the requirements needed for this position?

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 8+ years of experience in data engineering, ETL development, and cloud data platforms.
  • Strong proficiency in Snowflake, Azure Data Factory, and Python.
  • Experience with CI/CD, data security, and performance optimization.
  • Familiarity with BI tools (Power BI, Looker, etc.) and data modeling best practices.
  • Excellent problem-solving skills and ability to work in a fast-paced environment.

What are the preferred requirements for this position?

  • Knowledge of Airflow, PySpark, and data orchestration frameworks.
  • Experience with real-time data ingestion and streaming architectures.
  • Understanding of cost optimization in cloud environments.