Posted:
11/25/2024, 8:52:56 PM
Location(s):
Pune, Maharashtra, India ⋅ Maharashtra, India
Experience Level(s):
Junior ⋅ Mid Level ⋅ Senior
Field(s):
Software Engineering
Team/Function Overview
The Markets Data team is building the next-generation data fabric to address business, analytics and growing regulatory needs. Vast amounts of data assets have been accumulated over the years. A data fabric built on emerging technologies will allow this data to be inspected, cleansed and transformed to support decision-making.
This job involves being part of a dynamic team for Markets Data Risk Reporting on Cancel & Corrects and Open/Unconfirmed trades, and contributing to the development of core components using ETL technologies and a cloud database platform. The ideal candidate will have an eye for building and optimizing data systems and will work closely with our Systems Architects, Data Scientists and Analysts to help direct the flow of data within the pipeline and ensure consistency of data delivery and utilization across multiple projects.
Role / Position Overview
Olympus (a re-platforming of Ocean), the regulatory reporting infrastructure, is being strategically rebuilt, starting with Equities. As part of the Olympus build-out, the developer will work on rebuilding the Markets Data Risk Enterprise Program covering Cancel & Corrects along with Open and Unconfirmed trades data.
We need a strong database developer with a thorough understanding of advanced database concepts who can understand the existing application and migrate it to Olympus. The specific skills required are exposure to any RDBMS, Python, and PySpark or JavaSpark. Experience with any ETL tool is a nice-to-have.
Key Responsibilities:
The role will include but not be limited to the following:
• Design and implement data objects using data warehousing methodologies with Oracle or similar relational database tools, SQL and PL/SQL; implement DWH solutions using Spark SQL and Python on Big Data (see the sketch after this list).
• Identify re-usable database components and develop recommendations for ALPS target
architecture
• Align to database programming standards and best practices.
• Develop ETL jobs using Talend 8.x for processing data in a data warehouse.
• Collaborate with project teams to refine functional requirements and translate them into technical architecture/design
• Continuously monitor and tune database performance, identify potential issues and opportunities for improvement, and outline recommendations to improve performance
• Work with development teams to ensure adherence to database standards
• Oversee change management process for database objects across multiple projects
• Accountable for delivery of the database objects through SIT, UAT and Production.
• Liaise with clients to determine requirements and translate them into solutions
• Mentor and train junior team members
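For illustration only, below is a minimal PySpark sketch of the kind of DWH processing step described in the responsibilities above: reading trade events, flagging Cancel & Correct and unconfirmed records with Spark SQL expressions, and writing an aggregated result to a warehouse table. All table, column and application names here are hypothetical and not taken from this posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example: table and column names are illustrative only.
spark = (
    SparkSession.builder
    .appName("cancel_correct_etl")
    .enableHiveSupport()
    .getOrCreate()
)

# Read raw trade events from a Hive-visible source table.
trades = spark.table("markets_raw.trade_events")

# Flag Cancel & Correct events and unconfirmed trades.
enriched = (
    trades
    .withColumn("is_cancel_correct", F.col("event_type").isin("CANCEL", "CORRECT"))
    .withColumn("is_unconfirmed", F.col("confirmation_status") == F.lit("UNCONFIRMED"))
)

# Aggregate per business date and desk for downstream risk reporting.
summary = (
    enriched
    .groupBy("business_date", "desk")
    .agg(
        F.sum(F.col("is_cancel_correct").cast("int")).alias("cancel_correct_count"),
        F.sum(F.col("is_unconfirmed").cast("int")).alias("unconfirmed_count"),
    )
)

# Persist the result to a warehouse table, partitioned by business date.
(summary.write
    .mode("overwrite")
    .partitionBy("business_date")
    .saveAsTable("markets_dwh.cancel_correct_summary"))
```

In practice a job like this would typically be scheduled and orchestrated through an ETL tool such as Talend and tuned against the target warehouse.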
Development Value:
The candidate has the opportunity to be a major contributor to the Citi Markets Data Strategy and to the goal of increasing revenue by using key metrics for decision-making. The candidate will work with bright and innovative individuals on both the business and technology sides, and the successful candidate can make a significant difference to business performance.
Knowledge/Experience:
• 8+ years of experience within the technology or banking industry
• Strong experience in developing ETL solutions using PySpark and a thorough understanding of advanced DWH concepts.
• Strong hands-on experience in developing API modules using Python.
• Strong experience/advanced knowledge of designing conceptual, logical and physical data models and generating initial Data Definition Language (DDL)
• Very strong database design/development experience using Oracle 12c/19c
• Working experience with Hadoop, Hive and Impala
• Expert in SQL & PL/SQL modules such as packages, procedures, functions and other
database objects
• Expert in Database Performance Tuning
• Strong knowledge of DBA skills using Oracle 12c/19c
• Experience with Java will be an added advantage.
• Expert in Big Data querying tools, e.g. Hive and Impala
• Experience writing Python modules and APIs for various data abstraction layers (see the sketch after this list)
• Experience working with any ETL tool such as Talend 7.x or higher will be an added advantage
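For illustration only, a small Python sketch of the sort of data-abstraction-layer API module mentioned above, exposing trade records from an Oracle database behind plain functions. The driver usage follows python-oracledb; the schema, table and function names are assumptions rather than details from this posting.

```python
"""Hypothetical data-abstraction-layer module; all names and schema are illustrative."""
import oracledb  # python-oracledb driver, works with Oracle 12c/19c


def get_connection(user: str, password: str, dsn: str) -> oracledb.Connection:
    """Open a connection to the reporting database (thin mode, no Oracle client install)."""
    return oracledb.connect(user=user, password=password, dsn=dsn)


def fetch_cancel_corrects(conn: oracledb.Connection, business_date: str) -> list[dict]:
    """Return Cancel & Correct trade records for a business date as plain dictionaries,
    so callers never deal with SQL or cursors directly."""
    sql = """
        SELECT trade_id, desk, event_type, amended_ts
          FROM trade_events
         WHERE business_date = :business_date
           AND event_type IN ('CANCEL', 'CORRECT')
    """
    with conn.cursor() as cur:
        cur.execute(sql, business_date=business_date)
        columns = [d[0].lower() for d in cur.description]
        return [dict(zip(columns, row)) for row in cur.fetchall()]
```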
------------------------------------------------------
Job Family Group:
Technology
------------------------------------------------------
Job Family:
Applications Development
------------------------------------------------------
Time Type:
Full time
------------------------------------------------------
Citi is an equal opportunity and affirmative action employer.
Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi.
View the "EEO is the Law" poster. View the EEO is the Law Supplement.
View the EEO Policy Statement.
View the Pay Transparency Posting
Website: https://www.citigroup.com/
Headquarter Location: New York, New York, United States
Employee Count: 10001+
Year Founded: 1812
Last Funding Type: Post-IPO Equity
Industries: Banking ⋅ Credit Cards ⋅ Financial Services ⋅ Wealth Management