- Analyze blockchain and payments data at scale.
- Build and maintain automated data models and ETL pipelines using SQL, Python, Airflow, and dbt.
- Design dashboards that reveal trends across user growth, capital movement, and transaction health.
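The pipeline work described above typically follows an extract–transform–load pattern. As a minimal, hypothetical sketch (table names, column names, and sample data are invented for illustration; a production pipeline would use an orchestrator like Airflow and a warehouse rather than in-memory SQLite):

```python
import csv
import io
import sqlite3

# Hypothetical raw export: one row per payment transaction (invented data).
RAW_CSV = """tx_id,amount_usd,status
a1,125.50,confirmed
a2,-3.00,failed
a3,980.00,confirmed
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV export into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: keep confirmed transactions, cast amounts to float."""
    return [
        (r["tx_id"], float(r["amount_usd"]))
        for r in rows
        if r["status"] == "confirmed"
    ]

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Load: write cleaned rows into a modeled table (SQLite as a stand-in)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE confirmed_tx (tx_id TEXT, amount_usd REAL)")
    conn.executemany("INSERT INTO confirmed_tx VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount_usd) FROM confirmed_tx").fetchone()[0]
```

The modeled `confirmed_tx` table is the kind of curated layer a dashboard would then query for transaction-health trends.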
Maintain and optimize ELT pipelines to ensure reliable data flows. Develop and maintain scalable data models following best practices. Support data validation, testing, documentation, and governance across environments.
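The data validation and testing mentioned above usually mean assertion-style checks on modeled tables, in the spirit of dbt's built-in `not_null` and `unique` tests. A minimal sketch, assuming an invented `users` table and SQLite as a stand-in warehouse:

```python
import sqlite3

# Invented sample table standing in for a warehouse model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id TEXT PRIMARY KEY, email TEXT)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?)",
    [("u1", "a@example.com"), ("u2", "b@example.com")],
)

def check_not_null(conn, table, column):
    """Fail count: rows where the column is NULL."""
    sql = f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    return conn.execute(sql).fetchone()[0]

def check_unique(conn, table, column):
    """Fail count: values that appear more than once."""
    sql = (
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    )
    return conn.execute(sql).fetchone()[0]

# A passing run returns zero failures across both checks.
failures = check_not_null(conn, "users", "email") + check_unique(conn, "users", "user_id")
```

Frameworks like dbt generate equivalent SQL from YAML config; the point here is only the shape of the checks.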
You’ll be joining a small, high-impact data engineering team within Federato’s AI/ML organization, where the focus is on building the infrastructure and internal frameworks that empower machine learning engineers. You’ll ensure data and tooling are reliable, scalable, and aligned with the needs of our AI-native platform, enabling machine learning engineers to develop, deploy, and iterate on AI-powered features.
Help power the data pipelines behind The Public Interest Company’s AI-driven claims recovery platform. If you love SQL, building clean data flows, and working with modern tools like Databricks and dbt, we’d love to meet you.
Combine engineering rigor, business context, and storytelling to transform complex data into clear, consistent, and actionable insights as a Senior Analytics Engineer at Axios. You’ll design, maintain, and evolve the curated layers of our data platform, ensuring data is reliable, modeled for clarity, and accessible to everyone. You’ll transform raw and complex source data into well-structured data products, certified dashboards, and compelling visualizations that underpin reporting.
Design, build, and maintain robust data pipelines and warehouses that support analytics, machine learning, and real-time data products. Work closely with engineers, data scientists, and product teams to integrate and curate complex datasets from multiple sources, ensuring data quality, accessibility, and scalability. Contribute to evolving the data infrastructure and implement best practices across the software development lifecycle.