Own your future:
Our culture isn't something people join; it's something they build and shape. We believe every person deserves to be heard and empowered. If you're on the fence about whether you're a fit, we say go for it. Let's build something great together.
Must Haves:
- Strong experience with Python, Java, or a comparable programming language
- Advanced knowledge of SQL, including complex queries, query modularization, and optimization for performance and readability
- Familiarity with the modern data stack and cloud-native data platforms, such as Snowflake, BigQuery, or Amazon Redshift
- Hands-on experience with dbt (data build tool) for data modeling and transformations
- Experience with data orchestration tools, such as Airflow or Dagster
Nice to Have:
- Experience with GitOps and continuous delivery for data pipelines
- Experience with Infrastructure-as-Code tooling (e.g., Terraform)
Key Responsibilities:
- Design and build a data platform that standardizes data practices across multiple internal teams
- Support the entire data lifecycle
- Build and maintain integrations across data processing layers, including ingestion, orchestration, transformation, and consumption
- Collaborate closely with cross-functional teams to understand data needs and ensure the platform delivers value
- Document architectures, solutions, and integrations to promote best practices, maintainability, and usability