Job Description
Help Orita handle massive amounts of data efficiently and reliably. You will play a crucial role in unifying our data pipeline with an event taxonomy and normalization layer, ensuring our machine learning models have high-quality data to work with. You will work on a team of engineers, reporting to our Director of Data, Will Goldstein, and collaborate closely with teams across the company to strengthen our data infrastructure and expand our data processing capabilities.
Responsibilities:
Design and build scalable and reliable data pipelines to handle large volumes of data from various sources.
Unify our data pipeline by developing an event taxonomy and normalization layer for consistent and accurate data.
Develop and maintain workflows using Airflow and dbt.
Collaborate with data scientists and machine learning engineers to facilitate seamless data integration for model training and deployment.
Optimize data retrieval and develop data models for storage and analysis.
Ensure data quality, integrity, and security throughout the data lifecycle.
Implement ETL/ELT processes and data integration solutions.
Contribute to feature engineering efforts to enhance model performance.
Set up and analyze A/B tests in big-data environments.
Monitor and troubleshoot data pipelines and workflows to maintain optimal performance.
About Orita
Orita helps direct-to-consumer brands market more effectively by using math and machine learning to target the subscribers who want to hear from them.