Why Ryan?
Global Award-Winning Culture
Flexible Work Environment
Generous Paid Time Off
World-Class Benefits and Compensation
Rapid Growth Opportunities
Company Sponsored Two-Way Transportation
Exponential Career Growth
The Senior Data Architect is responsible for overseeing the implementation of technical solutions for the ingestion, transformation, processing, and storage of data in large-volume/low-latency pipelines. This role requires an enterprise mindset to build out robust, high-performance technology solutions.
The Senior Architect position combines strong hands-on technical leadership with the management of matrixed resources to achieve business outcomes. This person should be as excited about developing code as about developing the skills and competencies of the team at large.
People
- Modeling and designing the data platform architecture, defining data domains, data flows, and platform topology within a lakehouse environment (e.g., Databricks), while leveraging a variety of tools and languages to build and maintain data pipelines within the Platform Reference Architecture
- Working directly with management, product teams and practice personnel to understand their platform data requirements, translating them into well-structured platform data models and analytical frameworks that reflect the intended data platform design
- Maintaining a positive work atmosphere by behaving and communicating in a manner that encourages productive interactions with customers, co-workers and supervisors
- Developing and engaging team members by creating a motivating work environment that recognizes and rewards strong performance and holds team members accountable
- Fostering an innovative, inclusive and diverse team environment, promoting positive team culture, encouraging collaboration and self-organization while delivering high quality solutions
Client
- Collaborating on an Agile team to design, develop, test, implement and support highly scalable data solutions on lakehouse platforms such as Databricks, with particular emphasis on platform data modeling, data flow analysis, and ensuring solutions reflect sound data architecture principles
- Collaborating with product teams and clients to deliver robust cloud-based data solutions that drive tax decisions and provide powerful experiences, leveraging data analysis and platform modeling to ensure data is accurate, well-governed, and business-ready
- Analyzing user feedback and activity and iterating to improve the services and user experience
Value
- Securing data in alignment with internal information and data security policies, best practices and client requirements
- Creating and implementing robust cloud-based data solutions that scale effectively, underpinned by well-designed platform data models and analytical frameworks, including medallion architecture (Bronze/Silver/Gold) and Delta Lake patterns, that provide powerful experiences for both internal teams and clients (a brief illustrative sketch follows this list)
- Performing unit tests and conducting reviews with other team members to make sure solutions are rigorously designed, elegantly coded, and effectively tuned for performance, including formal platform architecture reviews to validate data flows and platform design against business requirements and enterprise standards
- Staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities and mentoring other members of the engineering community, with a focus on advancing data platform modeling and analytics capabilities across the practice
- Performing other duties as assigned
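For context on the medallion and Delta Lake patterns referenced above, the sketch below shows how data is typically layered as Bronze (raw), Silver (cleansed and conformed), and Gold (business-ready) in PySpark. It is a minimal illustration only, assuming a Databricks runtime (or a local Spark session with delta-spark configured) and made-up paths, table names, and columns; it does not describe Ryan's actual pipelines.

```python
# Minimal medallion (Bronze/Silver/Gold) sketch on Delta Lake.
# Assumes Databricks or a Spark session with delta-spark configured;
# all paths and column names below are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

# Bronze: land raw source data as-is, tagged with ingestion metadata.
raw = (spark.read.format("json").load("/mnt/raw/transactions/")
       .withColumn("_ingested_at", F.current_timestamp()))
raw.write.format("delta").mode("append").save("/mnt/bronze/transactions")

# Silver: cleanse and conform (deduplicate, drop bad rows, enforce types).
bronze = spark.read.format("delta").load("/mnt/bronze/transactions")
silver = (bronze
          .dropDuplicates(["transaction_id"])
          .filter(F.col("amount").isNotNull())
          .withColumn("amount", F.col("amount").cast("decimal(18,2)")))
silver.write.format("delta").mode("overwrite").save("/mnt/silver/transactions")

# Gold: aggregate into a business-ready model for reporting and BI tools.
gold = (silver.groupBy("client_id", F.to_date("transaction_ts").alias("txn_date"))
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count")))
gold.write.format("delta").mode("overwrite").save("/mnt/gold/daily_client_totals")
```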
Education and Experience:
- Bachelor’s and/or master’s degree in a related field.
- 10+ years of experience developing data technologies, with significant depth in data platform modeling: defining data domains, data products, information flows, and platform reference architectures on lakehouse platforms such as Databricks, covering Delta Lake, Unity Catalog, and medallion architecture design.
- 10+ years of experience deploying ETL/ELT solutions in production environments, preferably using Spark-based platforms such as Databricks, with strong proficiency in data analysis and profiling to inform platform design decisions and support reporting needs.
- 10+ years of experience developing cloud-based data services, preferably in AWS or Azure, with emphasis on data governance, metadata management, and data catalog practices within lakehouse platforms such as Databricks Unity Catalog.
- 10+ years of experience developing, and overseeing the development of, solutions in Python, Scala, Java, .NET, or similar languages, including PySpark and Databricks notebooks, combined with strong SQL skills and experience with BI/reporting tools (e.g., Power BI, Tableau) to support analytical deliverables
- 10+ years of database/query tuning, including experience in data quality assessment, data profiling, and establishing data standards to ensure accuracy and consistency across data assets
- 10+ years of experience in mixed Windows/Linux environments, with demonstrated ability to partner with business stakeholders to define data requirements, validate data definitions, and deliver analytical insights.
Additional Required Skills and Experience:
- Proven track record of exceeding goals and the ability to consistently make sound decisions through a combination of analysis, experience, and judgment
- Fluency in one or more databases, preferably relational; NoSQL experience is a plus.
- Fluency with distributed data platforms, with hands-on experience on Databricks (Delta Lake, Unity Catalog, Databricks Workflows, and the Databricks lakehouse architecture) and proficiency with data architecture and platform design tools such as LeanIX, Ardoq, or similar enterprise architecture tooling.
- Knowledge of at least one AI/ML pipeline technology or platform, along with familiarity with statistical analysis and data visualization to translate findings into clear business recommendations.
- Experience deploying, monitoring, and maintaining data pipelines in production environments, including Databricks Workflows and Delta Live Tables, along with familiarity with data governance frameworks, data lineage documentation, and enterprise data catalog tools
- Ability to design models of the data platform that implement the intended business architecture, including data domains, data products, and information flows within a lakehouse paradigm (e.g., medallion architecture on Databricks)
- Ability to develop diagrams representing key platform components, data flows, and system interactions across the data platform landscape
- Ability to generate a list of components needed to build the designed platform, including the data definitions, integration points, and analytical requirements that support the platform architecture
- Ability to communicate clearly, simply, and effectively
Computer Skills:
To perform this job successfully, an individual must have intermediate knowledge of Microsoft Project, Word, Excel, Access, PowerPoint, Outlook, and Internet navigation and research.
Supervisory Responsibilities:
Requires supervisory responsibilities, including training employees, assigning work, and ensuring the quality of deliverables
Work Environment:
- Standard indoor working environment.
- Occasional long periods of sitting while working at a computer.
- Position requires regular interaction with employees at all levels of the Firm and interface with external vendors as necessary.
- Independent travel requirement: As Needed
Equal Opportunity Employer: disability/veteran