Euronext is seeking a talented Senior Data Engineer with expertise in AWS and a strong background in solution architecture, database management, and data engineering. The ideal candidate will have extensive experience with AWS services such as Lambda, Glue, Step Functions, and CloudFormation, along with proficiency in Python, SQL, and database technologies. Experience with Iceberg tables for managing large datasets is highly desirable.
As a Senior Data Engineer, you will be responsible for designing, implementing, and maintaining scalable data solutions on the AWS cloud platform. You will collaborate with cross-functional teams to develop robust data pipelines, ETL processes, and orchestration workflows, leveraging tools like Apache Airflow and AWS Step Functions. Strong production awareness and troubleshooting skills are essential, along with the ability to provide technical leadership, mentorship, and guidance to junior team members.
Key Accountabilities
The Senior Data Engineer will be assigned to one or more projects and may change projects when higher priorities are identified. Once they have acquired sufficient experience, they may be involved in supporting production systems, including being on call for support.
Design and Implement AWS Solutions:
- Utilize expert knowledge of AWS services such as Lambda, Glue, and Step Functions to design, implement, and maintain scalable and efficient data solutions on the cloud platform.
Solution Architecture and Cloud Infrastructure:
- Develop robust solution architectures and cloud infrastructure designs, considering factors such as scalability, performance, security, and cost optimization.
- Demonstrate proficiency in cloud networking, including VPCs, subnets, security groups, and routing tables, to ensure secure and reliable data transmission.
Data Engineering and Database Management:
- Data Modeling: Designing efficient data models for optimal query performance.
- SQL Proficiency: Writing and optimizing SQL queries.
- Performance Tuning: Identifying and optimizing performance bottlenecks.
- ETL and Data Integration: Extracting, transforming, and loading data into Redshift, MySQL, and PostgreSQL.
- Cluster Management: Provisioning, scaling, and monitoring Redshift clusters.
- Security and Compliance: Implementing security measures and ensuring compliance.
- AWS Integration: Integrating Redshift with other AWS services.
- Monitoring and Troubleshooting: Monitoring cluster performance and resolving issues.
- Documentation and Training: Creating documentation and providing training to team members.
- Logging and Tracing: Proficiency in setting up and managing logging and tracing mechanisms in AWS, including services such as AWS CloudTrail for auditing API calls and AWS X-Ray for distributed tracing and performance analysis. Understanding of best practices for logging configuration, log aggregation, and analysis to ensure visibility into system activity and to support troubleshooting (an illustrative sketch follows this list).
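As a minimal, purely illustrative sketch of the logging and auditing proficiency described above (not part of the role itself; the handler name, event name, and time window are assumptions), structured Lambda logging and a CloudTrail audit query in Python might look like this:

```python
import json
import logging
from datetime import datetime, timedelta, timezone

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def lambda_handler(event, context):
    """Hypothetical Lambda handler emitting a structured JSON log line to CloudWatch Logs."""
    logger.info(json.dumps({
        "message": "pipeline step started",
        "records": len(event.get("Records", [])),
    }))
    return {"status": "ok"}


def recent_glue_job_starts(hours: int = 24):
    """Audit recent Glue StartJobRun API calls via CloudTrail's lookup_events API."""
    cloudtrail = boto3.client("cloudtrail")
    end = datetime.now(timezone.utc)
    start = end - timedelta(hours=hours)
    response = cloudtrail.lookup_events(
        LookupAttributes=[{"AttributeKey": "EventName", "AttributeValue": "StartJobRun"}],
        StartTime=start,
        EndTime=end,
        MaxResults=50,
    )
    return [event["EventName"] for event in response.get("Events", [])]
```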
Orchestration and Workflow Management:
- Implement orchestration solutions using tools like Apache Airflow and AWS Step Functions to automate and manage data workflows effectively (a minimal sketch follows this list).
- Utilize Athena for interactive query analysis and exploration of large datasets stored in Amazon S3.
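As a hedged illustration only (the DAG name, schedule, and task callables are hypothetical and not prescribed by this role), an Apache Airflow 2.x pipeline of the kind referenced above might be sketched as:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data, e.g. from S3 or an upstream system.
    print("extracting")


def transform():
    # Placeholder: clean and reshape the extracted data.
    print("transforming")


def load():
    # Placeholder: write results to Redshift, PostgreSQL, or an Iceberg table.
    print("loading")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Define the dependency chain: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```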
Technical Leadership and Solution Documentation:
- Provide technical leadership and guidance to the team, acting as a subject matter expert in AWS and data engineering technologies. Write comprehensive solution documents and technical documentation to communicate architecture designs, data workflows, and best practices effectively.
Proactive Production Awareness and Troubleshooting:
- Demonstrate production awareness by proactively monitoring system health, automating checks, and anticipating potential issues to ensure smooth operation of data solutions.
- Utilize strong troubleshooting skills to identify and resolve issues promptly, minimizing downtime and impact on business operations.
Continuous Improvement and Innovation:
- Stay updated on emerging technologies and industry trends, continuously evaluating and incorporating new tools and techniques to enhance data engineering processes and infrastructure.
- Take a proactive approach to challenge business requirements and propose innovative solutions to improve efficiency, scalability, and performance.
Profile and Skills
Education and Knowledge
- BS/MS degree in Computer Science or Engineering, or equivalent working experience
- English (B2 or higher level)
Euronext is looking for the following profile:
- Expertise in AWS: Extensive experience with AWS services, particularly Lambda, Glue, Step Functions, CloudFormation, CloudWatch, and others.
- Strong Solution Architecture Knowledge: Ability to design scalable and efficient data solutions on AWS, considering best practices for cloud architecture and infrastructure.
- Proficiency in Python and Databases: Strong programming skills in Python and experience with relational databases (MySQL, PostgreSQL, Redshift) and NoSQL databases.
- Orchestration and Workflow Management: Experience with orchestration tools such as Apache Airflow and AWS Step Functions for automating and managing data workflows.
- ETL Tools and Big Data Experience: Knowledge of ETL tools and experience working with large volumes of data, with a preference for Kafka experience.
- Experience with Iceberg Tables: Familiarity with Iceberg tables for managing large datasets efficiently, ensuring data consistency, and supporting ACID transactions.
- Production Awareness and Troubleshooting: Proactive approach to production monitoring and troubleshooting, with the ability to anticipate and mitigate potential issues.
- Technical Leadership and Communication: Capable of evolving into a technical lead role, with excellent communication and teamwork skills to collaborate effectively with cross-functional teams.
- Strong Analytical and Problem-Solving Skills: Ability to analyze requirements, define technical approaches, and propose innovative solutions to complex problems.
- Documentation and Requirements Analysis: Experience in writing solution documents, technical documentation, and the ability to challenge and refine business requirements.
Nice to Have:
- Knowledge in Apache Flink, Kafka, and other big data technologies.
- Experience with cloud-native architectures and serverless computing.
- Certification in AWS or relevant technologies.
Euronext Values
Unity
• We respect and value the people we work with
• We are unified through a common purpose
• We embrace diversity and strive for inclusion
Integrity
• We value transparency, communicate honestly and share information openly
• We act with integrity in everything we do
• We don’t hide our mistakes, and we learn from them
Agility
• We act with a sense of urgency and decisiveness
• We are adaptable, responsive and embrace change
• We take smart risks
Energy
• We are positively driven to make a difference and challenge the status quo
• We focus on and encourage personal leadership
• We motivate each other with our ambition
Accountability
• We deliver maximum value to our customers and stakeholders
• We take ownership and are accountable for the outcome
• We reward and celebrate performance
We are proud to be an equal opportunity employer. We do not discriminate against individuals on the basis of race, gender, age, citizenship, religion, sexual orientation, gender identity or expression, disability, or any other legally protected factor. We value the unique talents of all our people, who come from diverse backgrounds with different personal experiences and points of view and we are committed to providing an environment of mutual respect.
Additional Information
This job description only describes the main activities within the role and is not exhaustive. Additional tasks and projects may be assigned.