About Slickdeals:
We believe shopping should feel like winning. That’s why 10 million people come to Slickdeals to swap tips, upvote the best finds, and share the thrill of a great deal. Together, our community has saved more than $10 billion over the past 25 years.
We’re profitable, passionate, and in the middle of an exciting evolution—transforming from the internet’s most trusted deal forum into the go-to daily shopping destination. If you thrive in a fast-moving, creative environment where ideas quickly turn into impact, you’ll fit right in.
The Purpose:
Slickdeals is seeking a Staff Software Engineer with deep expertise in big data platforms and systems to lead and evolve our data engineering ecosystem. This role goes beyond pipeline maintenance; it’s about architecting scalable, resilient platforms that power analytics, experimentation, and machine learning across the business. You’ll inherit a mature platform built over 3+ years, spanning Databricks, dbt, Airflow, AWS, Tableau, and AtScale, and drive its next phase of growth. As a technical leader, you’ll shape architecture, mentor engineers, and ensure our data infrastructure supports the business at scale.
What You'll Do:
- Architect, evolve, and maintain core ETL/ELT pipelines using dbt, Airflow, and Databricks
- Design and optimize semantic models in AtScale to support BI tools like Tableau
- Lead cross-functional collaboration with Analytics, Product, and Engineering to deliver reliable, timely data
- Own observability, performance, and reliability of data workflows across environments
- Guide infrastructure decisions in AWS (S3, Kafka, EC2, Lambda, IAM), balancing scalability and cost
- Drive cost optimization and platform hygiene across data storage, compute, and tooling
- Champion CI/CD practices and automated testing for data pipelines and infrastructure-as-code
- Uphold engineering rigor through SDLC best practices, including version control, peer reviews, and reproducible builds
- Lead documentation efforts, promote reproducibility, and support onboarding for long-term team health
- Facilitate code reviews, lead architecture discussions, champion engineering excellence, and contribute to overall technical strategy
- Mentor engineers and foster a culture of knowledge sharing and continuous learning
What We're Looking For:
Skills & Qualifications:
- BS/BA/BE degree in a quantitative field such as mathematics, statistics, economics, computer science, or engineering, or equivalent experience
- 12+ years of experience in software engineering
- 7+ years of experience building large-scale data solutions
- Strong proficiency in SQL, Python, and dbt
- Hands-on experience with Databricks, Airflow, and AWS infrastructure
- Strong understanding of semantic modeling
- Experience building dashboards and supporting BI teams using Tableau
- Familiarity with CI/CD pipelines, automated testing, and infrastructure-as-code workflows
- Experience with data governance, security, and compliance best practices
- Deep understanding of SDLC principles and how they apply to data engineering
- Excellent communication and documentation skills
- Comfortable working in a fast-paced, collaborative environment
- Strategic mindset with a bias for action; curious and committed to continuous learning
Nice to Have:
- Experience with cost monitoring tools or FinOps practices
- Experience with Machine Learning and Data Science
- Familiarity with vendor integrations and API-based data sharing
- Exposure to AtScale, Tableau, or other modern data platforms
- Passion for mentoring and knowledge sharing
With your application, kindly attach a cover letter that outlines your greatest achievement. Please share what you built, how you measured success, and your role in the result.
Please note: We are unable to sponsor visas at this time. Candidates must be authorized to work in the U.S. without current or future need for visa sponsorship or transfer.
LOCATION: Las Vegas, NV
Hybrid schedule: three days a week (Tues-Thurs) in our Las Vegas office.