Posted: 2/25/2026, 8:26:20 PM
Location(s): Haryana, India ⋅ Gurgaon, Haryana, India
Experience Level(s): Senior
Field(s): Data & Analytics
Job Title: Sr. Data Engineer (FNZ)
About FNZ:
FNZ is a global fintech firm transforming the way financial institutions serve their clients. By combining cutting-edge technology, infrastructure, and investment operations, FNZ enables wealth management firms to deliver personalized investment solutions at scale. Operating across multiple regions and supporting over $1.5 trillion in assets under administration, FNZ partners with leading banks, insurers, and asset managers to create seamless and innovative wealth platforms that empower millions of investors worldwide.
Job Summary:
We are seeking an experienced and hands-on Kafka Engineer to join our data engineering team. The ideal candidate will design, implement, and manage real-time data pipelines using Apache Kafka, supporting Change Data Capture (CDC) from multiple transactional systems into a high-throughput, resilient, and cloud-agnostic data platform. This role plays a critical part in enabling near real-time analytics and operational decision-making across the organization.
Key Responsibilities:
• Kafka Architecture & Management: Design, configure, monitor, and maintain Kafka clusters across AWS and Azure. Ensure high availability, fault tolerance, and secure operation of Kafka environments.
• CDC Integration: Implement and manage Change Data Capture pipelines using tools such as Debezium, with a focus on PostgreSQL as a source system. Handle logical replication slots, publications, and plugins effectively.
• Streaming Solutions: Build and operate real-time streaming pipelines using Kafka Streams or Flink (where applicable), supporting schema evolution, ordered event processing, and deduplication (a minimal illustrative sketch follows this list).
• AI-Assisted Development: Leverage AI coding agents (e.g., Claude Code, GitHub Copilot) to accelerate development, improve code quality, and enhance productivity. Contribute to prompt engineering and agent configuration to optimize team workflows.
• Data Ingestion Pipelines: Build connectors and microservices (Java preferred) for ingesting and transforming streaming data. Enable downstream consumers including lakehouses, analytics layers, and operational dashboards.
• Cross-Cloud Enablement: Develop cloud-agnostic Kafka deployment strategies, managing services across AWS and Azure. Optimize networking, security, and cost profiles for multi-cloud streaming.
• Containerization & Orchestration: Deploy and manage Kafka and supporting services (such as Schema Registry and Kafka Connect) on Kubernetes. Ensure observability and horizontal scalability.
• Infrastructure as Code: Use Terraform (or equivalent frameworks) to provision and manage Kafka infrastructure and CI/CD pipelines for repeatable and testable deployments.
• Operational Readiness: Implement robust monitoring, alerting, and logging (e.g., Prometheus, Grafana, ELK) for Kafka pipelines. Support troubleshooting and root cause analysis.
• Documentation & Best Practices: Maintain clear documentation for infrastructure, data flows, failure handling, and operational playbooks.
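To give a concrete flavor of the CDC and streaming responsibilities above, the following is a minimal, illustrative Kafka Streams sketch in Java. It assumes a Debezium PostgreSQL connector is already publishing change events as JSON with the default envelope; the class name, topic names (fnzdb.public.accounts, accounts-current-state), and broker address are hypothetical, and the sketch is a pattern illustration rather than a reference implementation.

```java
import java.util.Properties;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

// Illustrative only: forwards the post-change row image ("after" block) of
// Debezium change events for a hypothetical "accounts" table to a downstream topic.
public class AccountsCdcForwarder {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "accounts-cdc-forwarder");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Debezium topic naming (<topic.prefix>.<schema>.<table>) is assumed here.
        KStream<String, String> changes = builder.stream("fnzdb.public.accounts");

        changes
            .filter((key, value) -> value != null)             // drop tombstone records emitted after deletes
            .mapValues(AccountsCdcForwarder::extractAfter)     // keep only the post-change row image
            .filter((key, value) -> value != null)             // skip events without an "after" block (deletes)
            .to("accounts-current-state");                     // assumed downstream topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }

    // Pulls the "after" field out of the Debezium JSON envelope. Assumes the
    // JSON converter with schemas enabled (i.e., a "payload" wrapper is present).
    private static String extractAfter(String value) {
        try {
            JsonNode after = MAPPER.readTree(value).path("payload").path("after");
            return (after.isMissingNode() || after.isNull()) ? null : after.toString();
        } catch (Exception e) {
            return null; // in production, route malformed events to a dead-letter topic instead
        }
    }
}
```

In practice a pipeline like this would typically use Avro or Protobuf with Schema Registry rather than raw JSON strings, which is where the schema evolution, ordering, and deduplication concerns called out above come into play.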
Qualifications:
• Education: Bachelor’s degree in Computer Science, Engineering, or a related technical field.
• Experience: 5+ years of hands-on experience with Kafka in production environments.
• CDC Expertise: Strong working knowledge of CDC tools (preferably Debezium), logical replication in PostgreSQL, and real-time event sourcing patterns.
• Programming Skills: Proficiency in Java (preferred), with knowledge of concurrency and stream processing. Familiarity with Python or Go is a plus.
• AI Coding Tools: Demonstrated experience using AI coding agents (Claude Code, GitHub Copilot, Cursor, or similar) as part of a daily development workflow. Comfortable with prompt engineering and iterating with AI to solve complex problems.
• Cloud & Kubernetes: Demonstrated experience deploying and managing Kafka in Kubernetes environments across AWS and Azure.
• DevOps: Strong experience with Terraform, Helm, and CI/CD pipelines (e.g., GitHub Actions, Jenkins).
• Security & Compliance: Understanding of data encryption, Transparent Data Encryption (TDE), IAM, network policies, and secure streaming practices.
Preferred Qualifications:
• Experience working in the Wealth Management or Financial Services industry, with a strong emphasis on data governance and compliance.
• Experience with schema management, Kafka Connect, and event versioning strategies.
• Familiarity with Microsoft Fabric, Delta Lake, or similar real-time lakehouse architectures.
• Exposure to stream processing frameworks such as Flink, Spark Streaming, or Azure Stream Analytics.
• Kafka certification or cloud certifications (AWS/Azure) are a plus.
• Track record of contributing to or maintaining AI agent configurations, custom instructions, or team-wide AI tooling standards.
About FNZ
FNZ is committed to opening up wealth so that everyone, everywhere can invest in their future on their terms. We know the foundation to do that already exists in the wealth management industry, but complexity holds firms back.
We created wealth’s growth platform to help. We provide a global, end-to-end wealth management platform that integrates modern technology with business and investment operations, all within a regulated financial institution.
We partner with the world’s leading financial institutions, with over US$2.2 trillion in assets on platform (AoP).
Together with our clients, we empower nearly 30 million people across all wealth segments to invest in their future.
Website: https://fnz.com/
Headquarters Location: London, England, United Kingdom
Employee Count: 5001-10000
Year Founded: 2004
IPO Status: Private
Last Funding Type: Private Equity
Industries: Finance ⋅ Financial Services ⋅ FinTech ⋅ InsurTech ⋅ Wealth Management