Sr. Data Engineer

Posted:
10/25/2024, 6:56:54 AM

Location(s):
Raleigh, North Carolina, United States

Experience Level(s):
Senior

Field(s):
Data & Analytics

Who We Are:

Bandwidth delivers world-class messaging, voice, and emergency service connectivity for the world’s biggest brands. We are the APIs and global network behind the platforms that the Global 2000 use to power their internal communications, contact center platforms, apps, and software. We transform interactions for top-tier orgs—and we do it on a global scale. We’re the only ones who marry the power of our global network with the control and agility offered by our enterprise-grade APIs. Unmatched reliability meets unparalleled control. That’s the Bandwidth way.

At Bandwidth, your music matters when you are part of the BAND.  We celebrate differences and encourage BANDmates to be their authentic selves.  #jointheband

What We Are Looking For:

This role focuses on designing, building, and optimizing scalable data platforms that enable data ingestion, transformation, and governance for the organization. The Sr. Data Engineer will work closely with cross-functional teams to support both traditional data engineering pipelines and modern data mesh initiatives, balancing infrastructure development, DataOps, and enablement. The role requires a strong understanding of data architecture in Snowflake, along with proficiency in Python, AWS services, and DevOps practices, to drive efficient data solutions for enterprise-wide data consumption.

What You'll Do:

  • Develop and Maintain Scalable Data Platforms: Lead the design and development of scalable, secure, and efficient data platforms using Snowflake and other cloud technologies, ensuring alignment with the organization’s data mesh and platform enablement goals. 
  • Build and Optimize Data Pipelines: Architect, build, and optimize ETL/ELT data pipelines to support various business units, balancing batch and streaming solutions. Leverage tools like AWS DMS, RDS, S3, and Kafka to ensure smooth data flows. 
  • Data Enablement & Ops: Collaborate with cross-functional teams (data engineers, data scientists, analysts, and architects) to create data-as-a-product offerings, ensuring self-service enablement, observability, and data quality. 
  • Automation & CI/CD: Develop, manage, and continuously improve automated data pipelines using Python (Prefect) and cloud-native services (AWS). Ensure CI/CD pipelines are set up for efficient deployment and monitoring. 
  • Snowflake Expertise: Apply deep knowledge of Snowflake SQL to build efficient and performant data models, optimizing storage and query performance. 
  • Cloud Data Integration: Use AWS services and data integration tools (DMS, Prefect) to ingest and manage diverse data sets across the organization. 
  • Data Governance & Compliance: Work closely with solution architects to implement data governance frameworks, ensuring adherence to enterprise data standards and regulatory compliance. 
  • Stakeholder Collaboration: Partner with product owners, business stakeholders, and data teams to translate business requirements into scalable data solutions, aligning roadmaps with organizational objectives. 
  • Agile Practices & Leadership: Participate in Agile ceremonies, mentor junior engineers, and help drive a culture of continuous improvement across the data engineering team. 
  • Metrics and Observability: Implement monitoring and metrics for data pipeline health, performance, and data quality, driving insights to improve product decision-making.

What You Need:

Education: 

  • Bachelor's degree (an engineering or computer science discipline is strongly preferred) or equivalent work experience

Experience: 

  • 8+ years of software architecture experience in the data engineering domain.
  • Demonstrated track record of operating in highly collaborative, flexible, and productive cross-organization teams. 
  • Proven ability to perform in high-visibility, high-growth environments. 
  • Action- and results-oriented, able to communicate strategies and tactics and identify obstacles.
  • Experience providing effective leadership to developers/engineers through the creation of Agile backlog assets and ongoing day-to-day guidance, resulting in development outcomes that are both productive and fun.

Knowledge: 

  • Programming Languages, e.g. Python, Java 
  • SQL Databases, e.g. MySQL, PostgreSQL, SQL Server, MariaDB 
  • NoSQL Databases, e.g. MongoDB, Cassandra 
  • Data Warehousing Solutions, e.g. Snowflake 
  • Data Processing, e.g. Flink 
  • Stream Processing, e.g. Kafka, MSK 
  • Data modeling, ETL/ELT processes, and data quality frameworks. 
  • Other data engineering tools, e.g. Kafka Connect, DMS, Talend, Prefect 
  • Cloud Platforms, e.g. AWS 
  • Containerization/Orchestration, e.g. Kubernetes (k8s) 
  • CI/CD and version control tools, e.g. ArgoCD, GitHub, GitHub Actions 
  • Agile methodologies and frameworks, e.g. Scrum, Kanban 
  • Collaboration tools, e.g. JIRA, Monday 

Skills: 

  • Demonstrated skill in developing products and/or solutions with a focus on agile principles, working with internal delivery teams.
  • Strong written and verbal communication.

Bonus Points:

Education: 

  • SnowPro Advanced Certification or equivalent. 

Experience: 

  • Working in Prefect or building other Python-based ETL pipelines 
  • Building and maintaining DevOps/DataOps workflows 
  • Implementing observability, monitoring, and alerting for data feeds and products 

Knowledge: 

  • Data governance frameworks and compliance standards.
  • Enterprise data architecture or solution architecture.

The Whole Person Promise:

At Bandwidth, we’re pretty proud of our corporate culture, which is rooted in our “Whole Person Promise.” We promise all employees that they can have meaningful work AND a full life, and we provide a work environment geared toward enriching your body, mind, and spirit. How do we do that? Well…

  • 100% company-paid Medical, Vision, & Dental coverage for you and your family with low deductibles and low out-of-pocket expenses.
  • All new hires receive four weeks of PTO.
  • PTO Embargo. When you take time off (of any kind!) you’re embargoed from working. Bandmates and managers are not allowed to interrupt your PTO – not even with email.
  • Additional PTO can be earned throughout the year through volunteer hours and Bandwidth challenges.
  • “Mahalo moments” program grants additional time off for life’s most important moments like graduations, buying a first home, getting married, wedding anniversaries (every five years), and the birth of a grandchild.
  • 90-Minute Workout Lunches and unlimited meetings with our very own nutritionist.

 

Are you excited about the position and its responsibilities, but not sure if you’re 100% qualified? Do you feel you can work to help us crush the mission? If you answered ‘yes’ to both of these questions, we encourage you to apply! You won’t want to miss the opportunity to be a part of the BAND.

Applicant Privacy Notice