Senior Data Engineer

Posted:
3/18/2025, 6:21:36 AM

Experience Level(s):
Senior

Field(s):
Data & Analytics

Reports to:
Data Platform Senior Manager

At the Aztec Group, we credit our technology as one of the core ingredients of our award-winning outsourced solutions. As part of its Five-Year Plan, Aztec has the ambition to be a market-leading alternative fund administrator that provides compelling client experiences, products, and services.

These are exciting times across the group. Significant growth, change, and investment make this a truly world-class opportunity to help shape our organisation for the next stage of its journey.

To drive towards this ambition, we are seeking a motivated individual to join our Data Platform team and support Aztec’s new technology strategy using Azure Databricks. You will lead our Data Engineering capability and collaborate with others passionate about solving business problems.

Key responsibilities:

Data Platform Design and Architecture

  • Design, develop, and maintain a high-performing, secure, and scalable data platform, leveraging Databricks Corporate Lakehouse and Medallion Architectures.
  • Utilise our metadata-driven data platform framework combined with advanced cluster management techniques to create and optimise scalable, robust, and efficient data solutions.
  • Implement comprehensive logging, monitoring, and alerting tools to manage the platform, ensuring resilience and optimal performance are maintained.

Data Integration and Transformation

  • Integrate and transform data from multiple organisational SQL databases and SaaS applications using end-to-end, dependency-based data pipelines to establish an enterprise source of truth.
  • Create ETL and ELT processes using Azure Databricks, ensuring audit-ready financial data pipelines and secure data exchange with Databricks Delta Sharing and SQL Warehouse endpoints.

Governance and Compliance

  • Ensure compliance with information security standards in our highly regulated financial landscape by implementing Databricks Unity Catalog for governance, data quality monitoring, and ADLS Gen2 encryption for audit compliance.

Development and Process Improvement

  • Evaluate requirements, create technical design documentation, and work within Agile methodologies to deploy and optimise data workflows, adhering to data platform policies and standards.

Collaboration and Knowledge Sharing

  • Collaborate with stakeholders to develop data solutions, maintain professional knowledge through continual development, and advocate best practices within a Centre of Excellence.

Skills, knowledge and expertise:

  • Deep expertise in the Databricks platform, including Jobs and Workflows, Cluster Management, Catalog Design and Maintenance, Apps, Hive Metastore Management, Network Management, Delta Sharing, Dashboards, and Alerts.
  • Proven experience working with big data technologies such as Databricks and Apache Spark.
  • Proven experience working with Azure data platform services, including Storage, ADLS Gen2, Azure Functions, and Kubernetes.
  • Background in cloud platforms and data architectures, such as corporate data lakes, Medallion Architecture, metadata-driven platforms, and event-driven architecture.
  • Proven experience with ETL/ELT, including Lakehouse architecture, pipeline design, and batch/stream processing.
  • Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, and Spark SQL.
  • Good working knowledge of data warehouse and data mart architectures.
  • Good experience in Data Governance, including Unity Catalog, Metadata Management, Data Lineage, Quality Checks, and Master Data Management.
  • Experience using Azure DevOps to manage tasks and CI/CD deployments within an Agile framework, including utilising Azure Pipelines (YAML), Terraform, and implementing effective release and branching strategies.
  • Knowledge of security practices, covering RBAC, Azure Key Vault, Private Endpoints, and Identity Management.
  • Experience working with relational and non-relational databases and unstructured data.
  • Exposure to Azure Purview, Power BI, and Profisee is an advantage.
  • Ability to compile accurate and concise technical documentation.
  • Strong analytical and problem-solving skills.
  • Good interpersonal and communication skills.

We will provide the training, both in-house for relevant technical knowledge and for professional qualifications. You will need to be quick to learn new systems and great with people, as close working relationships between our colleagues and clients are at the heart of what we do.

Beyond that, we will be with you every step of the way, enabling you to get the most out of your role, grow your skills your way, and see your career develop in the way you want. Be part of our talented Technology team and unbox your passion at a multi-award-winning leader in the alternative fund management industry.

“For all accepted offers of employment with Aztec Financial Services (Luxembourg) S.A, candidates will be required to complete pre-screening requirements, including providing a criminal record certificate (extrait de casier judiciaire).”