Important Information:
- Years of Experience: 3+ years in data engineering or data analysis.
- Job Mode: Full-time.
- Work Mode: Remote.
Job Summary:
The Technical Data Steward will be responsible for translating business needs into technical data requirements, managing logical data models, and collaborating with ingestion and curation teams. This role requires expertise in logical data modeling, metadata management, data curation, and data governance, ensuring that data products meet business requirements while adhering to governance and cataloging standards.
Responsibilities and Duties:
- Collaborate with business leaders to define requirements and prioritize metric updates.
- Develop technical requirements such as aggregation levels and data granularity based on business metrics.
- Design logical data models and orchestration requirements for data processes.
- Manage and maintain the logical data model for the Enterprise Analytics team in coordination with ingestion and curation leads.
- Oversee data catalog entries and ensure compliance with data cataloging rules.
- Manage access controls for end data products and handle change requests related to data structure, definitions, and security.
- Communicate with Business Data System Owners and Business Data Stewards regarding metric updates and data logic.
- Act as an escalation point for Data Quality Analysts when issues arise that are beyond established processes.
Qualifications and Skills:
- Bachelor’s degree in Computer Science, Engineering, Statistics, or Economics, or an MBA.
- 3+ years of experience in data engineering or data analysis.
- Strong understanding of logical data modeling and data curation practices.
- Experience managing complex metadata and aligning technical data requirements with business needs.
Role-specific Requirements:
- Proficiency in SQL and Python (or other programming languages).
- Hands-on experience with Databricks or other cloud data management platforms.
- Familiarity with technical and business data cataloging tools.
- Experience with ETL/ELT orchestration tools such as Apache Airflow.
- Experience designing logical data models and developing data dictionaries.
- Proven experience in data management, requirements management, and metadata management.
Technologies:
- SQL, Python.
- Databricks or other cloud data management platforms.
- Apache Airflow.
- Data cataloging and metadata management tools.
Skillset Competencies:
- Data modeling and curation.
- Requirements and metadata management.
- Change management for data structures and definitions.
- Communication with business and technical stakeholders.
About Encora:
Encora is the preferred digital engineering and modernization partner of some of the world’s leading enterprises and digital-native companies. With over 9,000 experts in 47+ offices and innovation labs worldwide, Encora’s technology practices include Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering.
At Encora, we hire professionals based solely on their skills and qualifications, and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.