The role:
You’ll help build and scale the core data platform powering a rapidly growing global product. This is a hands-on role focused on infrastructure and reliability, and on enabling teams across the company to work with data safely, efficiently, and independently.
What you’ll do:
- Build and maintain core data infrastructure (lake, warehouse, orchestration, observability)
- Design and operate scalable data ingestion and transformation pipelines
- Embed data quality, lineage, governance, and security by default
- Partner with platform, product, and analytics teams on data architecture
- Implement CI/CD for data systems and drive performance and cost optimizations
- Define best practices and enable self-service analytics across the org
What you’ll bring:
- Strong experience with cloud data platforms (Snowflake, BigQuery, Redshift, Databricks)
- Deep knowledge of orchestration tools (Airflow, Dagster, Prefect) and IaC (Terraform, Pulumi)
- Experience with event-driven systems (Kafka, Pub/Sub, Kinesis)
- Strong SQL skills and understanding of distributed systems (Go is a plus)
- Clear communicator who can connect technical work to business impact
#LI-OP1