Technical Architect - Cloud - Snowflake, FinOps, SnowPro Advanced & Implementation

Posted:
3/29/2026, 10:17:04 AM

Location(s):
Bengaluru, Karnataka, India

Experience Level(s):
Expert or higher ⋅ Senior

Field(s):
DevOps & Infrastructure ⋅ Software Engineering

Scope:

  • The role requires both hands-on advanced engineering and architectural leadership, ensuring performant, secure, and future-ready data ecosystems across major business units.
  • The individual will own end-to-end data architecture, multi-cluster compute scaling, and technical governance across Snowflake features (Iceberg, Snowpark, native governance).
  • The position involves enabling FinOps optimization, architectural innovation, and platform automation while guiding cross-functional stakeholders and engineering teams.

Technical Environment:

  • Snowflake Platform: Multi-Cluster Warehouses, Snowpipe, Tasks, Streams, Zero-Copy Cloning, Time Travel, Apache Iceberg tables.
  • Data Engineering & Scripting: Advanced SQL, Python (Snowpark), Java/Scala (UDFs/UDTFs), dbt (Data Build Tool).
  • Integrations & Orchestration: Apache Airflow, Fivetran, Kafka, Spark, Trino, external catalogs (AWS Glue, Polaris).
  • Governance & Security: Hierarchical RBAC, Dynamic Data Masking, Row Access Policies, Object Tagging, Secure Data Sharing.
  • Platform Enhancements: Snowpark Container Services, Snowflake Cortex (AI/ML), Search Optimization Service, Materialized Views.
  • DataOps/Agile: CI/CD pipelines, Git, GitHub Actions/GitLab, Terraform (Infrastructure-as-Code), Agile delivery.
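As a rough illustration of the governance features named above (Dynamic Data Masking and Row Access Policies), a minimal Snowflake SQL sketch follows; all table, column, and role names are hypothetical:

```sql
-- Hypothetical example: mask email addresses for all but a privileged role.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Hypothetical row access policy: only return rows for regions mapped
-- to the current role in an entitlements table.
CREATE ROW ACCESS POLICY region_rap AS (region STRING) RETURNS BOOLEAN ->
  EXISTS (
    SELECT 1 FROM region_entitlements e
    WHERE e.role_name = CURRENT_ROLE() AND e.region = region
  );

ALTER TABLE orders ADD ROW ACCESS POLICY region_rap ON (region);
```

Policies like these are defined once and attached to many objects, which is what makes them practical at the enterprise RBAC scale this role describes.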

What you’ll do:

  • Design & Architect: Define scalable, petabyte-scale data lakehouse architectures. Establish enterprise coding standards, hierarchical RBAC models, and data security best practices.
  • Develop & Deliver: Architect multi-cluster compute topologies for high-concurrency workloads. Build low-latency integrations with external compute engines and catalogs.
  • Guide & Govern: Provide technical leadership to data engineers. Lead technical whiteboarding sessions and data strategy workshops with business stakeholders.
  • Operate & Optimize: Implement robust FinOps monitoring, compute quotas, and warehouse right-sizing routines. Perform deep root-cause analysis on chronic platform contention issues.
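As a sketch of the FinOps controls mentioned in the last bullet (resource monitors, compute quotas, and warehouse right-sizing), the Snowflake SQL below shows the general shape; the monitor name, warehouse name, quota, and thresholds are illustrative assumptions, not prescribed values:

```sql
-- Hypothetical monthly credit quota with notify and suspend thresholds.
CREATE RESOURCE MONITOR analytics_monitor
  WITH CREDIT_QUOTA = 500
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 75 PERCENT DO NOTIFY
    ON 95 PERCENT DO SUSPEND
    ON 100 PERCENT DO SUSPEND_IMMEDIATE;

ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = analytics_monitor;

-- Right-size a multi-cluster warehouse for a high-concurrency workload:
-- scale out (more clusters) for concurrency, keep size modest for cost,
-- and suspend quickly when idle.
ALTER WAREHOUSE analytics_wh SET
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 60;
```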

What we are looking for:

  • Bachelor’s degree in Computer Science, Data Engineering, IT, or a related technical field.
  • 9–12 years of IT/Data experience, with 5–7+ years specifically in deep Snowflake implementation, architecture, and system design.
  • Proven experience designing modern data stacks utilizing open table formats (Apache Iceberg) and Data Mesh principles.
  • Experience in FinOps (resource monitors, chargeback models) and enterprise security (Dynamic Data Masking, Row Access Policies).
  • Strong communication and stakeholder management skills to influence technical roadmaps.
  • Snowflake certifications: SnowPro Advanced (Architect or Data Engineer) strongly preferred.

Our Values
If you want to know the heart of a company, take a look at their values. Ours unite us. They are what drive our success – and the success of our customers. Does your heart beat like ours? Find out here: Core Values

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Blue Yonder

Website: https://blueyonder.com/

Headquarter Location: Scottsdale, Arizona, United States

Employee Count: 5001-10000

Year Founded: 1985

IPO Status: Private

Last Funding Type: Secondary Market

Industries: CRM ⋅ Data Management ⋅ SaaS ⋅ Software ⋅ Supply Chain Management ⋅ Warehouse Automation