Senior Data Engineer

Posted:
1/13/2026, 2:46:03 PM

Location(s):
Westlake, Texas, United States

Experience Level(s):
Senior

Field(s):
Data & Analytics

Workplace Type:
On-site

Job Description:

Position Description: 

 

Develops and implements data management and reporting services tools, platforms, and enhancements. Participates in designing and crafting modern Cloud platforms to support company products for global markets. Creates, develops, and maintains various metrics, reports, dashboards, and Business Intelligence (BI) solutions. Builds scalable patterns for data consumption from Cloud-based data lakes by leveraging Cloud technologies and DevOps concepts, including Continuous Integration/Continuous Delivery (CI/CD) pipelines. Develops and maintains comprehensive reporting solutions to provide actionable insights and support data-driven decision-making for the team and stakeholders. 

 

Primary Responsibilities: 

 

  • Identifies opportunities for new development within a scalable public Cloud environment. 

  • Collaborates and partners with product owners and development teams to translate business requirements into actionable tasks, ensuring cross-functional teams are informed throughout the project lifecycle.  

  • Works closely with business partners and other system partners and serves as a developer on new tools and implementation projects.  

  • Analyzes and manipulates datasets in alignment with business requirements to ensure data integrity, accessibility, and security throughout the entire data lifecycle.  

  • Applies data engineering, data warehousing, and analytics technologies to data application development, data integration, and data pipeline design patterns on a distributed platform.  

  • Collaborates with development teams to integrate automated testing into the sprint cycle. 

  • Performs continuous testing and validation of new features and functionalities.  

  • Analyzes complex requirements, collaborates with developers to design efficient solutions, and creates prototypes to validate proposed solutions and mitigate technical risks. 

 

Education and Experience: 

 

Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Senior Data Engineer (or closely related occupation) building financial and Medicare data warehouse applications using data modeling and Extract, Transform, and Load (ETL) processing. 

 

Or, alternatively, Master’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and one (1) year of experience as a Senior Data Engineer (or closely related occupation) building financial and Medicare data warehouse applications using data modeling and Extract, Transform, and Load (ETL) processing. 

 

Skills and Knowledge: 

 

Candidate must also possess: 

 

  • Demonstrated Expertise (“DE”) implementing ETL and Data Integration (DataStage with UNIX); designing and optimizing ETL Workflows, using IBM DataStage, Mainframe files, databases (DB2 and Snowflake), and Flat files; automating and integrating tasks, using UNIX Shell scripting with DataStage jobs; implementing effective data transformation and performance tuning, using DataStage and UNIX; and orchestrating multistage data pipelines to ensure data accuracy and automated data profiling, using DataStage and UNIX. 

  • DE designing and implementing data workflows, using Apache Airflow (migrating TWS and Control-M to improve scaling and automation); coordinating the execution of DataStage and Snowflake jobs, using UNIX scripts; and automating Snowflake features (Snowpipe and stored procedures) and integrating them into workflows with task parallelization and dependency tracking, using Directed Acyclic Graphs (DAGs) to ensure audit logging.  

  • DE implementing Cloud Data Warehousing to store, process, and analyze data, using Snowflake; designing complex SQL and PL/SQL, and optimizing complex data transformations, analytics, and aggregations, using Common Table Expressions (CTEs), pivoting, and window functions (to handle data processing); developing data ingestion pipelines, using Snowflake with Airflow and ETL scripts; and implementing Change Data Capture (CDC) strategies for incremental loads and optimizing bulk data ingestion on Cloud services (Azure and AWS).  

  • DE developing and implementing business analytics and development strategies on FinTech and Medicare data to visualize trends and identify insights, using Power BI and Tableau; conducting Cognos and MSTR report validations, using Cognos Framework Models and MicroStrategy; verifying report data accuracy with SQL on database systems (Netezza, Snowflake, Oracle, and DB2); and executing ETL testing for data accuracy, completeness, and transformation logic, using SQL. 

#PE1M2 

#LI-DNI 

Certifications:

Category:

Information Technology

Most roles at Fidelity are Hybrid, requiring associates to work onsite every other week (all business days, M-F) in a Fidelity office. This does not apply to Remote or fully Onsite roles.

Please be advised that Fidelity’s business is governed by the provisions of the Securities Exchange Act of 1934, the Investment Advisers Act of 1940, the Investment Company Act of 1940, ERISA, numerous state laws governing securities, investment and retirement-related financial activities and the rules and regulations of numerous self-regulatory organizations, including FINRA, among others. Those laws and regulations may restrict Fidelity from hiring and/or associating with individuals with certain Criminal Histories.

Fidelity Investments

Website: https://www.fidelity.com/

Headquarter Location: Boston, Massachusetts, United States

Employee Count: 10001+

Year Founded: 1946

IPO Status: Private

Last Funding Type: Secondary Market

Industries: Asset Management ⋅ Finance ⋅ Financial Services ⋅ Retirement ⋅ Wealth Management