Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Design, develop, and optimize EDP data pipelines using Python, Airflow, dbt, and Snowflake for scalable financial data processing.
Build performant Snowflake data models and dbt transformations following best practices, standards, and documentation guidelines.
Own ingestion, orchestration, monitoring, and SLA-driven workflows; proactively troubleshoot failures and improve reliability.
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They retain nearly 1000 full-time professionals, and their culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed.
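To make the Airflow/dbt/Snowflake stack in the listing above more concrete, here is a minimal, hypothetical sketch of an Airflow DAG that ingests raw files and then runs dbt transformations against Snowflake. The DAG id, schedule, script paths, project path, and dbt target are illustrative assumptions, not details from the role.

```python
# Hypothetical sketch: an Airflow DAG that loads raw files and runs dbt on Snowflake.
# All names (dag_id, schedule, paths, dbt target) are illustrative assumptions.
# Assumes Airflow 2.4+ (the `schedule` argument) and dbt installed on the worker.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="financial_data_pipeline",   # assumed name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # assumed cadence
    catchup=False,
    default_args={"retries": 2},
) as dag:
    # Load raw files into Snowflake staging tables (assumed helper script).
    ingest = BashOperator(
        task_id="ingest_raw_files",
        bash_command="python /opt/pipelines/ingest_raw.py",
    )

    # Build the curated Snowflake layer with dbt models.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --target prod",
    )

    # Run dbt tests so failures surface to SLA-driven monitoring.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test --target prod",
    )

    ingest >> dbt_run >> dbt_test
```

Chaining dbt test after dbt run is one common way to turn model failures into actionable alerts rather than silent data quality drift.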
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Build and Maintain Bronze/Silver Layer Pipelines: ensure core data sources land accurately, on time, and with full lineage.
Lead Data Ingestion, Transformation, and Enrichment: own the end-to-end pipeline from raw file landing through cleansed, conformed staging tables, including deduplication, standardization, code mapping, and entity resolution.
Develop Automated Ingestion Pipelines: use Snowpipe, Matillion, or custom solutions with reliability, observability, and minimal manual intervention in mind.
Precision AQ is building a centralized Data Hub to consolidate fragmented data infrastructure, establish enterprise-wide data governance, and enable AI-ready analytics across our life sciences portfolio. They value innovation and a collaborative environment.
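As a rough illustration of the Snowpipe-based bronze-layer ingestion described in the listing above, the sketch below uses the snowflake-connector-python package to create an auto-ingest pipe that copies newly landed files into a raw table. All object names, the file format, and the connection settings are assumptions, not details from the role.

```python
# Hypothetical sketch: creating a Snowpipe auto-ingest pipe for a bronze-layer table.
# Object names (database, schema, stage, pipe, table) are illustrative assumptions.
# AUTO_INGEST = TRUE assumes an external stage wired to cloud event notifications.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="INGEST_WH",
    database="DATAHUB",
    schema="BRONZE",
)

create_pipe_sql = """
CREATE PIPE IF NOT EXISTS BRONZE.CLAIMS_PIPE
  AUTO_INGEST = TRUE
AS
  COPY INTO BRONZE.CLAIMS_RAW
  FROM @BRONZE.CLAIMS_STAGE
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
"""

try:
    # The pipe copies each newly landed file into the bronze table; COPY history
    # then provides per-file load metadata that can back lineage and monitoring.
    conn.cursor().execute(create_pipe_sql)
finally:
    conn.close()
```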
Designing, building, and maintaining data pipelines and data warehouses.
Developing ETL ecosystem tooling with Python, external APIs, Airbyte, Snowflake, dbt, PostgreSQL, MySQL, and Amazon S3.
Helping define automated solutions to complex problems around better understanding data, users, and the market.
Neighborhoods.com provides real estate software platforms, including 55places.com and neighborhoods.com. They strive to foster an inclusive environment, comfortable for everyone, and will not tolerate harassment or discrimination of any kind.
Develop and maintain custom connectors using Airbyte.
Build and optimize data transformation pipelines using AWS Glue.
Structure data to enable efficient AWS Athena queries.
Coderoad is a software development company that provides end-to-end services. They offer opportunities to work on real-world projects, helping individuals skill up and advance their careers in a supportive environment.
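For the Glue- and Athena-focused responsibilities above, the following hypothetical AWS Glue job sketch shows one common pattern: read a raw table from the Glue Data Catalog, drop rows without a usable key, and write partitioned Parquet to S3 so Athena can prune partitions at query time. The database, table, S3 path, and partition columns are assumptions.

```python
# Hypothetical sketch of an AWS Glue job: catalog read -> filter -> partitioned Parquet.
# Database, table, bucket path, and partition columns are illustrative assumptions.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import Filter
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Raw records as cataloged by an upstream ingestion job (assumed names).
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Drop rows that cannot be keyed or joined downstream.
clean = Filter.apply(frame=raw, f=lambda row: row["order_id"] is not None)

# Partitioned Parquet keeps Athena scans cheap: a query filtering on year/month
# reads only the matching S3 prefixes (assumes those columns exist in the data).
glue_context.write_dynamic_frame.from_options(
    frame=clean,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/orders/",
        "partitionKeys": ["year", "month"],
    },
    format="parquet",
)

job.commit()
```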
Design and develop data integration pipelines across cloud and legacy systems.
Lead and support data engineering implementation efforts.
Apply advanced analytics techniques to support business insights and decision-making.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. Reaching over 59 million people each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Build and evolve our semantic layer; design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Design, develop, and maintain dbt data models that support our healthcare analytics products.
Integrate and transform customer data to conform to our data specifications and pipelines.
Design and execute initiatives that improve data platform and pipeline automation and resilience.
SmarterDx, a Smarter Technologies company, builds clinical AI that is transforming how hospitals translate care into payment. Founded by physicians in 2020, the company's platform connects clinical context with revenue intelligence, helping health systems recover millions in missed revenue, improve quality scores, and appeal every denial.
Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
Model, store, and transform data to support analytics, reporting, and downstream applications.
Build API-based and file-based integrations across enterprise platforms.
Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
Architect and maintain robust, scalable, and secure data infrastructure on AWS leveraging Databricks.
Design, develop, and maintain data pipelines, primarily using tools like Airbyte and custom-built services in Go, to automate data ingestion and ETL processes.
Oversee the creation and maintenance of the data lake, ensuring efficient storage, high data quality, effective partitioning and organization, and robust performance monitoring and alerting.
Trust Wallet is the leading non-custodial cryptocurrency wallet, trusted by over 200 million people worldwide to securely manage and grow their digital assets. They aim to be a trusted personal companion — helping users safely navigate Web3, the on-chain economy, and the emerging AI-powered future.
Design and implement scalable data pipelines and solutions using Snowflake.
Develop and optimize SQL queries, stored procedures, and views for performance and efficiency.
Integrate Snowflake with ETL tools and cloud platforms.
Cayuse Commercial Services, LLC delivers technical solutions that meet customer needs. They foster teamwork and prioritize quality and inclusivity in deliverables.
Serve as a primary advisor to identify technical improvements and automation opportunities.
Build advanced data pipelines using the medallion architecture in Snowflake.
Write advanced ETL/ELT scripts to integrate data into enterprise data stores.
Spring Venture Group is a digital direct-to-consumer sales and marketing company focused on the senior market. They have a dedicated team of licensed insurance agents and leverage technology to help seniors navigate Medicare.
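To illustrate the medallion-architecture responsibilities in the listing above, here is a minimal, hypothetical sketch of one common step: deduplicating bronze records and upserting them into a silver table in Snowflake. Connection details, table names, and columns are assumptions, not details from the role.

```python
# Hypothetical sketch: promoting deduplicated bronze records into a silver table,
# one typical step in a medallion-style Snowflake pipeline.
# Connection settings and table/column names are illustrative assumptions.
import os

import snowflake.connector  # pip install snowflake-connector-python

MERGE_SQL = """
MERGE INTO SILVER.POLICIES AS tgt
USING (
    SELECT *
    FROM BRONZE.POLICIES_RAW
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY policy_id ORDER BY loaded_at DESC
    ) = 1
) AS src
ON tgt.policy_id = src.policy_id
WHEN MATCHED THEN UPDATE SET
    tgt.status = src.status,
    tgt.premium = src.premium,
    tgt.updated_at = src.loaded_at
WHEN NOT MATCHED THEN INSERT (policy_id, status, premium, updated_at)
VALUES (src.policy_id, src.status, src.premium, src.loaded_at)
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="DATAHUB",
)

try:
    # Keep only the latest record per policy_id, then upsert into the silver layer.
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```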
Monitor and support ETL processes that move data from on-premises or hosted servers into Snowflake.
Design and maintain aggregate and reporting tables optimized for Tableau and Power BI dashboards.
Optimize Snowflake performance and cost, including warehouse usage, query tuning, and table design.
Affinitiv is the largest provider of end-to-end, data-driven marketing and software solutions exclusively focused on the automotive customer lifecycle. Backed by 20+ years of automotive and marketing expertise, they work with over 6,500 dealerships and every major manufacturer in the country.
Contribute to the technical integrity and evolution of the Data Platform tech stack.
Design and implement core features and enhancements within the Data Platform.
Build and maintain robust data extraction, loading, and transformation processes.
Dimagi is an award-winning social enterprise and certified B-corp building software solutions and providing technology consulting services to improve the quality of essential services for underserved populations. They are passionate and committed to tackling complex health and social inequities and working towards a brighter future for all.
Design and develop robust ETL/ELT pipelines using Python, Airflow, and dbt.
Build and optimize Snowflake data models for performance, scalability, and cost efficiency.
Implement ingestion pipelines for internal and external financial datasets (Market, Securities, Pricing, ESG, Ratings).
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation. They support digital transformation for some of the world's largest enterprises. Miratech retains nearly 1000 full-time professionals, with an annual growth rate exceeding 25%, and offers a ForeverRemote work culture.
Assist in designing and implementing Snowflake-based analytics solutions.
Build and maintain data pipelines adhering to enterprise architecture principles.
Act as a technical leader within the team, ensuring quality deliverables.
Jobgether is a company that uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Collaborate with cross-functional teams and business units to understand and communicate Ion’s needs and goals.
Develop, improve, and maintain robust data pipelines for extracting and transforming data from log files and event streams.
Design models and algorithms to derive insights and metrics from large datasets.
Intuitive is a global leader in robotic-assisted surgery and minimally invasive care. Their technologies, like the da Vinci surgical system and Ion, have transformed how care is delivered for millions of patients worldwide. They are a team of engineers, clinicians, and innovators united by one purpose: to make surgery smarter, safer, and more human.