Senior Data Engineer

Posted:
3/10/2026, 2:25:30 PM

Location(s):
Durham, North Carolina, United States

Experience Level(s):
Senior

Field(s):
Data & Analytics

Workplace Type:
On-site

Job Description:

Position Description:

Modernizes and tests software applications that support client experiences within UNIX environments, using shell scripting. Writes SQL queries and debugs stored procedures using PL/SQL. Improves software capabilities within Cloud environments (Amazon Web Services (AWS)) and related services (Amazon Elastic Compute Cloud (EC2), Virtual Private Cloud (VPC), CloudWatch, Amazon Simple Storage Service (S3), and Lambda). Enhances data integration platforms using Extract, Transform, Load (ETL) technologies (Informatica). Automates the execution of routine computing jobs using job scheduling tools (Autosys and Control-M).

Primary Responsibilities:

  • Builds and modernizes innovative and Cloud-native internal software experiences, capabilities, and platforms.
  • Assists in the identification, isolation, resolution, and communication of technical problems within production and nonproduction environments.
  • Analyzes information to determine, recommend, and plan computer software specifications on major projects, and proposes modifications and improvements based on user needs.
  • Develops software system testing and validation procedures, programming, and documentation.
  • Designs, builds, and maintains scalable and reliable ETL/ELT pipelines using Python and orchestration tools.
  • Develops reusable data platform components and frameworks to accelerate ingestion, transformation, and data quality enforcement.
  • Troubleshoots and resolves data pipeline failures and performance bottlenecks in production environments.
  • Monitors application performance and reliability, proactively identifying areas for improvement and optimization.
  • Collaborates with junior engineers and contributes to internal knowledge sharing, code reviews, and best practices.

Education and Experience:

Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Senior Data Engineer (or closely related occupation) performing data engineering, advanced data analytics, and database performance tuning and automation, within a financial services environment.

Or, alternatively, Master’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and one (1) year of experience as a Senior Data Engineer (or closely related occupation) performing data engineering, advanced data analytics, and database performance tuning and automation, within a financial services environment.

Skills and Knowledge:

Candidate must also possess:

  • Demonstrated Expertise (“DE”) developing Extract, Load, Transform (ELT)/ETL pipelines to move data to and from Snowflake data stores, using ETL technologies (Informatica, Python, and Snowflake SnowSQL); and performing automated performance testing using LoadRunner and JMeter to ensure efficient data movement and system scalability, and to identify and resolve performance bottlenecks.
  • DE performing data integration and end-to-end validation, or end-to-end functional and regression testing, for data loads, using iCEDQ, SQL, PL/SQL, and Groovy; and performing automation and validation of REST APIs and Web services, using FitNesse, SQL, Swagger, Postman, and Post Execution Review (PER), to ensure data accuracy and system reliability across platforms, reduce defects, and enable seamless data integration.
  • DE implementing business rules for assessing and holding data based on trades or market changes, using Tableau; and delivering visual data analytics to support timely, informed decision-making by transforming complex financial data and market trends into actionable insights, using Tableau and Pivot charts.
  • DE automating the build and release of software applications developed in C++, using Python and shell scripting to accelerate deployment cycles and reduce manual errors through streamlined and repeatable release processes; and designing, building, and automating AWS Cloud DevSecOps services, using Lambda, RDS, S3, EC2, CloudWatch, IAM, Jenkins, and Concourse to enhance infrastructure security, availability, and operational efficiency through automated Cloud service management.

Category:

Information Technology

Most roles at Fidelity are Hybrid, requiring associates to work onsite every other week (all business days, M-F) in a Fidelity office. This does not apply to Remote or fully Onsite roles. Some roles may have unique onsite requirements. Please consult with your recruiter for the specific expectations for this position.

Please be advised that Fidelity’s business is governed by the provisions of the Securities Exchange Act of 1934, the Investment Advisers Act of 1940, the Investment Company Act of 1940, ERISA, numerous state laws governing securities, investment and retirement-related financial activities and the rules and regulations of numerous self-regulatory organizations, including FINRA, among others. Those laws and regulations may restrict Fidelity from hiring and/or associating with individuals with certain Criminal Histories.