Responsibilities:
- Collaborate with product/business owners throughout the product/project lifecycle and develop data engineering solutions aligned with Nissan's data strategy.
- Lead a team of data engineers and guide them throughout the development lifecycle.
- Work with solution and data architects to resolve technical and architectural challenges during implementation.
- Design and develop components of the data pipeline, including data ingestion, data processing, and analysis of business data.
- Create high-level and low-level designs for each module.
- Lead scoping efforts and provide input for project effort and time estimates.
- Develop medium- to high-complexity modules following coding standards and guidelines; build reusable components and frameworks.
- Perform code reviews and maintain code review standards.
- Follow agile and DevOps best practices while implementing solutions.
Skills and Qualifications:
The ideal candidate will have worked on end-to-end data warehouse and data lake solutions on cloud platforms (AWS).
The candidate should have the following skill set:
- Strong data engineering (ETL) experience in the cloud, preferably on AWS. AWS certification (Developer/DevOps/Solutions Architect) preferred.
- Excellent understanding of distributed computing paradigms.
- Extensive experience implementing data warehouses and data lakes.
- Extensive experience with relational databases, ETL design patterns, and ETL development.
- Extensive experience with CI/CD frameworks and container-based deployments.
- Excellent programming and SQL skills.
- Good exposure to NoSQL and big data technologies.
- Strong implementation experience across all of the technology areas below (breadth), with deep technical expertise in some of them:
- Data integration/engineering – ETL tools such as Talend and AWS Glue. Experience with Talend Cloud ETL is a plus.
- Data warehousing – Snowflake and/or AWS Redshift. Experience with the Snowflake cloud data warehouse is an advantage.
- Data modelling – dimensional and transactional modelling using RDBMS, NoSQL, and big data technologies.
- Programming – Java/Python/Scala and SQL.
- Data visualization – tools such as Tableau and Amazon QuickSight.
- Master data management (MDM) – concepts and experience with tools such as Informatica and Talend MDM.
- Big data – exposure to the Hadoop ecosystem and AWS EMR.
- Big data processing frameworks – exposure to Kinesis, Spark, and Spark Streaming.
- Demonstrated strong analytical and problem-solving capability.
- Good understanding of the data ecosystem, including current and future data trends.
- Should be a go-to person for the above technologies.
Drive your career forward and join the company leading the technology and business evolution in the automotive industry.
Trivandrum, Kerala, India