Principal Engineer - Data Platform (Noida, India)

Posted:
11/11/2024, 2:45:29 AM

Location(s):
Noida, Uttar Pradesh, India

Experience Level(s):
Expert or higher ⋅ Senior

Field(s):
Data & Analytics ⋅ Software Engineering

Workplace Type:
Hybrid

Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.

Position Overview: We seek an experienced Principal Software Engineer to lead the design and development of our data warehouse and analytics platform, and to help raise the engineering bar for the entire technology stack at Level AI, including applications, platform, and infrastructure.
They will actively collaborate with team members and the wider Level AI engineering community to develop highly scalable and performant systems. As a technical thought leader, they will help solve complex problems, both current and future, by designing and building simple, elegant technical solutions. They will coach and mentor junior engineers, drive engineering best practices, and actively collaborate with product managers and other stakeholders inside and outside the team.

Competencies:

Data Modeling: Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalization and denormalization techniques (a minimal star-schema sketch, including a sample analytical query, follows this list).
Data Warehousing & Storage Solutions: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.
ETL/ELT Processes: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, dbt) to move data from source systems into the data warehouse (see the Airflow DAG sketch after this list).
SQL Proficiency: Advanced SQL skills for complex queries, indexing, and performance tuning.
Programming Skills: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.
Data Integration: Experience with real-time data integration tools such as Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch (see the Kafka consumer sketch after this list).
Data Pipeline Management: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) to orchestrate and monitor data pipelines.
APIs and Data Feeds: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources (see the API-aggregation sketch after this list).
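
To make these competencies concrete, a few minimal sketches follow. First, data modeling: a star schema with one fact table and two dimension tables, plus a sample analytical query of the kind the SQL bullet describes. The table and column names (fact_interactions, dim_agent, dim_date) are hypothetical, and SQLite stands in for a production warehouse engine.

```python
# Minimal star-schema sketch: one fact table joined to two dimension tables.
# All table and column names are hypothetical, not Level AI's actual model.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_agent (
        agent_key  INTEGER PRIMARY KEY,
        agent_name TEXT,
        team       TEXT
    );
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,   -- e.g. 20241111
        full_date TEXT,
        month     TEXT
    );
    CREATE TABLE fact_interactions (
        interaction_id  INTEGER PRIMARY KEY,
        agent_key       INTEGER REFERENCES dim_agent(agent_key),
        date_key        INTEGER REFERENCES dim_date(date_key),
        handle_time_sec INTEGER,
        csat_score      REAL
    );
    -- Index the foreign key most often used in filters.
    CREATE INDEX idx_fact_date ON fact_interactions(date_key);
""")

# A typical analytical query: average handle time per team per month.
rows = conn.execute("""
    SELECT a.team, d.month, AVG(f.handle_time_sec) AS avg_handle_time
    FROM fact_interactions f
    JOIN dim_agent a ON a.agent_key = f.agent_key
    JOIN dim_date  d ON d.date_key  = f.date_key
    GROUP BY a.team, d.month
""").fetchall()
```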
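Next, orchestration: a sketch of a daily extract-transform-load DAG. This assumes Apache Airflow 2.4 or later; the DAG id, task names, and callables are placeholders, not an actual Level AI pipeline.

```python
# Minimal Airflow 2.4+ DAG sketch: extract -> transform -> load, run daily.
# The dag_id, task callables, and data shapes are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw records from a source system (stubbed out here).
    return [{"id": 1, "handle_time_sec": 230}]

def transform(ti):
    # Read the upstream task's output from XCom and reshape it.
    raw = ti.xcom_pull(task_ids="extract")
    return [{**r, "handle_time_min": r["handle_time_sec"] / 60} for r in raw]

def load(ti):
    # A real pipeline would write this batch to the warehouse.
    print(ti.xcom_pull(task_ids="transform"))

with DAG(
    dag_id="example_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```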
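For real-time integration, a consumer sketch using the kafka-python client; the topic name, broker address, and message shape are assumptions.

```python
# Minimal streaming-ingest sketch using the kafka-python client.
# Topic name, broker address, and message fields are assumptions.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "interactions",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="warehouse-loader",
)

for message in consumer:
    event = message.value
    # A real pipeline would buffer events and micro-batch them into the warehouse.
    print(event.get("interaction_id"), event.get("csat_score"))
```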
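Finally, API-based aggregation: paging through REST endpoints and combining the results. The endpoint URLs, pagination parameters, and response shape are hypothetical.

```python
# Minimal API-aggregation sketch: page through hypothetical REST endpoints
# and combine their records. URLs and pagination parameters are made up.
import requests

SOURCES = [
    "https://api.example-crm.com/v1/contacts",       # hypothetical endpoints
    "https://api.example-helpdesk.com/v1/tickets",
]

def fetch_all(url, page_size=100):
    """Collect every page from one endpoint, stopping on an empty batch."""
    records, page = [], 1
    while True:
        resp = requests.get(
            url, params={"page": page, "per_page": page_size}, timeout=30
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

combined = [rec for src in SOURCES for rec in fetch_all(src)]
```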
To learn more, visit: https://thelevel.ai/