Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
Develop and optimize data models in Snowflake or similar platforms.
Implement ETL/ELT processes using Python and modern data tools.
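The ELT pattern named above — load raw data into the warehouse first, then transform it in SQL — can be sketched in plain Python. This is a minimal illustration using the standard-library sqlite3 module as a stand-in for Snowflake; the table and column names are invented for the example:

```python
import sqlite3

# In-memory database standing in for a cloud warehouse such as Snowflake.
conn = sqlite3.connect(":memory:")

# Load: land the raw records as-is, with no cleaning applied yet.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "19.99", "complete"), (2, "5.00", "refunded"), (3, "bad", "complete")],
)

# Transform: cast and filter inside the warehouse (the "T" after the "L").
conn.execute("""
    CREATE TABLE orders AS
    SELECT id, CAST(amount AS REAL) AS amount
    FROM raw_orders
    WHERE status = 'complete' AND amount GLOB '[0-9]*.[0-9][0-9]'
""")

rows = conn.execute("SELECT id, amount FROM orders ORDER BY id").fetchall()
print(rows)  # only the valid, completed order survives: [(1, 19.99)]
```

Keeping the raw table untouched and doing the cleanup downstream is what distinguishes ELT from classic ETL, where transformation happens before the load.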
Jobgether uses an AI-powered matching process to ensure each application is reviewed quickly, objectively, and fairly against the role's core requirements. It identifies the top-fitting candidates and shares this shortlist directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's own team.
Provide data analysis reports for submissions from various states.
Develop reports in Tableau Desktop across multiple divisions.
Create documentation and presentations for stakeholders.
Design, build, and scale performant data pipelines and infrastructure, primarily using ClickHouse, Python, and dbt.
Build systems that handle large-scale streaming and batch data, with a strong emphasis on correctness and operational stability.
Own the end-to-end lifecycle of data pipelines, from raw ingestion to clean, well-defined datasets consumed by downstream teams.
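Owning a pipeline from raw ingestion to a well-defined dataset, with correctness as a first-class concern, usually means handling bad records and replayed deliveries explicitly. A minimal stdlib-only sketch (the record shape and field names are assumptions for illustration, not Nansen's actual schema):

```python
import json

# Raw events as they might arrive from a stream or batch drop; note the
# duplicate id (a retried delivery) and the malformed line.
raw_lines = [
    '{"id": "a1", "value": 10}',
    '{"id": "a2", "value": 20}',
    '{"id": "a1", "value": 10}',   # duplicate delivery
    'not json at all',             # bad record
]

def ingest(lines):
    """Parse, validate, and dedupe raw lines into a clean dataset."""
    seen, clean, rejected = set(), [], 0
    for line in lines:
        try:
            rec = json.loads(line)
            key = rec["id"]
        except (ValueError, KeyError, TypeError):
            rejected += 1          # quarantine rather than crash the pipeline
            continue
        if key in seen:            # idempotency: replays are no-ops
            continue
        seen.add(key)
        clean.append(rec)
    return clean, rejected

clean, rejected = ingest(raw_lines)
print(len(clean), rejected)  # 2 clean records, 1 rejected
```

Making ingestion idempotent in this way is one common route to the operational stability the role calls for: reprocessing a batch after a failure cannot double-count records.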
Nansen is a leading blockchain analytics platform that empowers investors and professionals with real-time, actionable insights derived from on-chain data. We’re building the world’s best blockchain analytics platform, and data is at the heart of everything we do.
Design, develop, test, and maintain Python applications and SQL queries.
Perform data analysis, validation, and resolve data quality issues.
Build and optimize ETL/data pipelines using modern data tools.
Miratech helps visionaries change the world. They are a global IT services and consulting company that brings together enterprise and start-up innovation. Miratech is a values-driven organization with over 1,000 full-time professionals and has achieved sustained shareholder value growth exceeding 25% annually over many years.
Design and implement cloud-native data pipelines and storage architectures supporting large-scale analytics workloads.
Via Logic LLC is an information technology partner supporting mission-driven work across DevSecOps, cloud engineering, data science, cybersecurity, and national security analytics. Via Logic values flexibility, collaboration, and a people-first culture that supports work-life balance and gives our team the space to do meaningful, high-impact work.
Design, build, and maintain data models in our cloud data warehouse using dbt.
Own the transformation layer from raw and staged data to curated marts.
Promote and exemplify development standards and data quality for analytics engineering work.
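The raw → staged → curated-mart layering that dbt formalizes can be illustrated with the standard-library sqlite3 module; dbt would generate comparable SQL from model files. Model and column names here are illustrative, not Maven Clinic's actual models:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Raw layer: source data loaded untouched, quirks and all.
conn.execute("CREATE TABLE raw_users (ID TEXT, Email TEXT)")
conn.executemany("INSERT INTO raw_users VALUES (?, ?)",
                 [("1", " A@X.COM "), ("2", "b@y.com"), ("2", "b@y.com")])

# Staging layer: one view per source, renamed and lightly cleaned
# (in dbt this would live in a model such as stg_users.sql).
conn.execute("""
    CREATE VIEW stg_users AS
    SELECT DISTINCT CAST(ID AS INTEGER) AS user_id,
           LOWER(TRIM(Email)) AS email
    FROM raw_users
""")

# Mart layer: curated, business-facing model built only on staging.
conn.execute("""
    CREATE VIEW mart_user_count AS
    SELECT COUNT(*) AS n_users FROM stg_users
""")

n_users = conn.execute("SELECT n_users FROM mart_user_count").fetchone()[0]
print(n_users)  # 2: the duplicate raw row collapses in staging
```

The key convention is that marts never read raw tables directly; all cleanup is concentrated in the staging layer, which is what "owning the transformation layer" typically means in practice.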
Maven Clinic is the world's largest virtual clinic for women and families, dedicated to making healthcare accessible for everyone. With over $425 million in funding, they offer clinical, emotional, and financial support through their digital programs.
Drive data engineering work across project phases, including discovery, design, build, test, deploy, and ongoing improvement.
Design and build scalable data pipelines using Microsoft Fabric (lakehouses, warehouses, pipelines, dataflows, notebooks).
Cleanse, model, and transform raw data to support analytics, reporting, semantic modeling, and governance needs.
Stoneridge Software helps clients succeed in implementing business software solutions. As a 2025 Top Workplace Honoree and Microsoft Solutions Partner, they take a meticulous approach to project delivery and empower clients' success with long-term support.
Develop, solve, and review advanced mathematical problems with real-world relevance.
Apply expertise in algebra, calculus, statistics, discrete mathematics, or related areas to design complex problem statements.
Collaborate asynchronously with AI researchers and domain experts to enhance AI model reasoning.
Alignerr partners with leading AI research teams and labs to build and train cutting-edge AI models.
Write and deploy crawling scripts to collect source data from the web.
Write and run data transformers in Scala Spark to standardize bulk data sets.
Write and run modules in Python to parse entity references and relationships from source data.
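A toy version of the last step — parsing entity references and relationships from source data in Python — might look like the following. The record shape is fabricated for illustration; Sayari's actual source schemas differ:

```python
import json

# A fabricated source record resembling a corporate registry entry.
source = json.loads("""
{
  "company": {"name": "Acme Ltd", "reg_no": "C-100"},
  "officers": [
    {"name": "J. Doe", "role": "director"},
    {"name": "K. Lee", "role": "secretary"}
  ]
}
""")

def parse(record):
    """Extract entities and the relationships that link them."""
    company = record["company"]["name"]
    entities = [company] + [o["name"] for o in record["officers"]]
    # Each relationship is a (subject, predicate, object) triple.
    relationships = [(o["name"], o["role"], company) for o in record["officers"]]
    return entities, relationships

entities, relationships = parse(source)
print(relationships)
```

Output of this kind — entities plus typed edges between them — is what downstream graph or risk-scoring systems typically consume.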
Sayari is a risk intelligence provider equipping organizations across sectors with visibility into commercial relationships, delivering corporate and trade data from over 250 jurisdictions. Headquartered in Washington, D.C., its solutions are trusted globally and recognized for growth and workplace culture.
Design, develop, and maintain scalable data pipelines and data warehouses.
Develop ETL/ELT processes using Python and modern data tools.
Ensure data quality, reliability, and performance across systems.
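Data-quality guarantees like those above are commonly enforced as automated checks that fail loudly when an expectation is violated. A minimal stdlib-only sketch; the check names, columns, and threshold are illustrative assumptions:

```python
# Rows as they might land from an upstream system.
rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "c@z.com"},
]

def check_unique(rows, col):
    """True if every value in `col` is distinct (a primary-key check)."""
    vals = [r[col] for r in rows]
    return len(vals) == len(set(vals))

def null_rate(rows, col):
    """Fraction of rows where `col` is missing."""
    return sum(r[col] is None for r in rows) / len(rows)

assert check_unique(rows, "id"), "duplicate primary keys"
rate = null_rate(rows, "email")
assert rate <= 0.5, f"too many null emails: {rate:.0%}"
print(f"checks passed; email null rate = {rate:.0%}")
```

Running such checks on every load, rather than ad hoc, is what turns "ensure data quality" from an aspiration into an operational guarantee.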
3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.