Expert Support Engineer - Snowflake

Posted:
3/26/2026, 7:42:50 PM

Location(s):
Bengaluru, Karnataka, India

Experience Level(s):
Expert or higher ⋅ Senior

Field(s):
DevOps & Infrastructure ⋅ IT & Security ⋅ Software Engineering

Scope:

  • This individual defines the 3-to-5-year technical vision for the enterprise data platform, functioning as the ultimate technical authority for Snowflake across the organization.

  • The role involves driving enterprise-wide topology design, multi-region disaster recovery, and cross-organizational data monetization strategies.

  • The individual will partner directly with VPs, C-level executives, and external vendors to solve business challenges of unprecedented scale.

Technical Environment:

  • Snowflake Platform: Multi-Cluster Warehouses, Snowpipe, Tasks, Streams, Zero-Copy Cloning, Time Travel, Apache Iceberg tables.

  • Data Engineering & Scripting: Advanced SQL, Python (Snowpark), Java/Scala (UDFs/UDTFs), dbt (Data Build Tool).

  • Integrations & Orchestration: Apache Airflow, Fivetran, Kafka, Spark, Trino, external catalogs (AWS Glue, Polaris).

  • Governance & Security: Hierarchical RBAC, Dynamic Data Masking, Row Access Policies, Object Tagging, Secure Data Sharing.

  • Platform Enhancements: Snowpark Container Services, Snowflake Cortex (AI/ML), Search Optimization Service, Materialized Views.

  • DataOps/Agile: CI/CD pipelines, Git, GitHub Actions/GitLab, Terraform (Infrastructure-as-Code), Agile delivery.

What you’ll do:

  • Design & Architect: Design multi-account, multi-region Snowflake topologies for global data sovereignty and disaster recovery. Architect decentralized Data Mesh and Data Clean Room solutions.

  • Develop & Deliver: Pioneer the adoption of cutting-edge platform capabilities (Snowpark Container Services, Cortex AI) to unlock new enterprise use cases.

  • Guide & Govern: Chair the Data Architecture Review Board. Establish the engineering baselines, CI/CD frameworks, and security postures adopted by hundreds of engineers globally.

  • Operate & Optimize: Define enterprise operational KPIs. Ensure high confidence from audit, security, and executive stakeholders by designing overarching compliance and governance frameworks.

What we are looking for:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related technical field.

  • 12+ years of IT/Data experience, with 7+ years of Snowflake experience (operating at a petabyte-scale enterprise level).

  • Recognized industry expert with a track record of designing multi-cloud or multi-region data topologies.

  • Deep expertise in organizational FinOps strategies, legal data compliance boundaries, and executive technical communication.

  • Demonstrated ability to lead organizational transformations, moving legacy enterprise systems to modern, decoupled data lakehouse architectures.

  • Snowflake certifications: multiple SnowPro Advanced certifications expected.

Our Values

If you want to know the heart of a company, take a look at its values. Ours unite us. They are what drive our success, and the success of our customers. Does your heart beat like ours? Find out here: Core Values

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Blue Yonder

Website: https://blueyonder.com/

Headquarter Location: Scottsdale, Arizona, United States

Employee Count: 5001-10000

Year Founded: 1985

IPO Status: Private

Last Funding Type: Secondary Market

Industries: CRM ⋅ Data Management ⋅ SaaS ⋅ Software ⋅ Supply Chain Management ⋅ Warehouse Automation