Responsibilities:
- Monitor, optimize, and integrate batch data pipelines into Snowflake using ELT tools (e.g., Fivetran, Airbyte) and custom Python integrations with databases and APIs
- Build and maintain dbt pipelines to transform source data into reporting data models for our visualization team
- Set up easy-to-use data sources in Tableau and build out Tableau dashboards following internal design standards
Requirements:
- 4+ years of professional experience building data pipelines and data warehouses on a cloud data platform (ideally Snowflake)
- Strong proficiency with SQL and Python
- Experience in designing data models to support reporting and business outcomes
Pluses:
- Familiarity with Airflow, Fivetran, Tableau, Terraform, GitHub Actions
- Experience with, or a strong desire to learn, AWS, Snowflake, dbt, and Tableau
DataDrive
DataDrive is a fast-growing managed analytics service provider that delivers modern cloud data platforms to data-driven organizations while supporting the ongoing training, adoption, and growth of our clients’ data cultures. DataDrive offers a team-oriented environment where you can develop your skills and work directly with some of the most talented analytics professionals in the business.