Lead the development of ETL pipelines and Data Lake infrastructure.
Perform ETL and sanitization on disparate data flows from multiple internal and external sources.
Maintain data security, quality, and performance with detailed documentation.
Swing Left helps people maximize their impact on competitive elections to help Democrats win. They have one million members, who have raised more than $140 million and engaged more than 50 million voters.
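The ETL-and-sanitization duty above can be sketched in plain Python. This is an illustrative example only: the feed formats, field names, and cleaning rules are assumptions, not anything from the posting.

```python
import json

# Hypothetical raw feeds from two disparate sources: one JSON-style, one CSV-style.
json_rows = ['{"Email": " ALICE@example.org ", "zip": "02139"}']
csv_rows = ["bob@example.org,60614", "ALICE@example.org,02139"]

def clean_email(raw):
    """Normalize case and strip whitespace so duplicates match."""
    return raw.strip().lower()

records = []
for row in json_rows:
    d = json.loads(row)
    records.append({"email": clean_email(d["Email"]), "zip": d["zip"].strip()})
for row in csv_rows:
    email, zip_code = row.split(",")
    records.append({"email": clean_email(email), "zip": zip_code.strip()})

# Deduplicate on the normalized email, keeping the first occurrence.
deduped = {r["email"]: r for r in reversed(records)}
clean = sorted(deduped.values(), key=lambda r: r["email"])
```

The same normalize-then-dedupe shape applies whatever the real sources are; in production each feed would get its own parser feeding one shared schema.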
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Designing, building, and maintaining data pipelines and data warehouses.
Developing ETL ecosystem tooling using Python, external APIs, Airbyte, Snowflake, dbt, PostgreSQL, MySQL, and Amazon S3.
Helping to define automated solutions to complex problems in understanding data, users, and the market.
Neighborhoods.com provides real estate software platforms, including 55places.com and neighborhoods.com. They strive to foster an inclusive environment that is comfortable for everyone, and will not tolerate harassment or discrimination of any kind.
Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Design, build, and maintain robust data pipelines.
Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
Build modular data models that power analytics, product features, and LLM agents.
Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.
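The "modular data models" bullet above is the dbt idea of small, named, composable transformations. A minimal sketch with plain Python functions standing in for dbt models (the data and model names are invented for illustration; in practice these would be SQL models in BigQuery):

```python
# Each function stands in for one dbt model: one layer, one responsibility.
raw_events = [
    {"user": "u1", "event": "click", "ts": 1},
    {"user": "u1", "event": "click", "ts": 2},
    {"user": "u2", "event": "view", "ts": 3},
]

def stg_events(rows):
    """Staging layer: rename and clean fields only, no business logic."""
    return [{"user_id": r["user"], "event_type": r["event"]} for r in rows]

def mart_clicks_per_user(rows):
    """Mart layer: an aggregate that analytics, product, or an LLM agent can consume."""
    counts = {}
    for r in rows:
        if r["event_type"] == "click":
            counts[r["user_id"]] = counts.get(r["user_id"], 0) + 1
    return counts

clicks = mart_clicks_per_user(stg_events(raw_events))
```

Because each layer has a single job, downstream consumers (dashboards, features, agents) can all read the same mart without re-implementing the cleaning logic.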
Develop and optimize real-time data pipelines from existing data sources to support marketing science initiatives.
Structure and organize large-scale datasets to ensure high performance, scalability, and reliability.
Build and maintain production-grade data tables powering 20–50 marketing signals across ~20 million advertisers.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results while rejecting rigid hierarchy. They are a woman- and minority-led firm employing a global community, with excellent benefits such as medical, dental, vision, and life insurance, paid holidays, a 401(k), and networking, learning, and career development programs.
Manage the analytics data warehouse to make it scalable and reliable.
Build and scale data catalogs and a semantic layer for the data infrastructure.
Sibill is the accounting and financial operating system for small and medium businesses (SMBs). They integrate invoicing, payments, treasury, and accounting in one modern platform and are backed by top international investors.
Design and improve data pipelines that process large, multi-modal datasets from internal and external sources into AI model training datasets.
Evolve our data storage layer to support analytics, schema evolution, reproducibility, and efficient data access.
Collaborate with ML engineers to improve the performance and reliability of Python-based data processing workflows.
Iambic Therapeutics is a clinical-stage life-science and technology company developing novel medicines using its AI-driven discovery and development platform. Founded in 2020 and based in San Diego, Iambic has assembled a world-class team that unites pioneering AI experts and experienced drug hunters.
Participate in end-to-end data migration projects.
Configure data migration tools including Syniti ADMM and BODS for ETL processes.
Develop and implement data migration plans, including data cleansing, validation, and reconciliation processes.
Jobgether is a platform helping people find jobs. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Serve as a primary advisor to identify technical improvements and automation opportunities.
Build advanced data pipelines using the medallion architecture in Snowflake.
Write advanced ETL/ELT scripts to integrate data into enterprise data stores.
Spring Venture Group is a digital direct-to-consumer sales and marketing company focused on the senior market. They have a dedicated team of licensed insurance agents and leverage technology to help seniors navigate Medicare.
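The medallion architecture named in the bullets above layers data as bronze (raw), silver (validated), and gold (business-level). In Snowflake this would be SQL; the following is a toy Python sketch of the same idea, with invented records and rules:

```python
# Bronze: raw records exactly as ingested, bad values included (illustrative data).
bronze = [
    {"id": "1", "amount": "10.50", "region": "NE"},
    {"id": "2", "amount": "bad", "region": "NE"},
    {"id": "3", "amount": "4.25", "region": "SW"},
]

def to_silver(rows):
    """Silver: typed and validated records; rows that fail validation are dropped."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"]),
                        "region": r["region"]})
        except ValueError:
            continue
    return out

def to_gold(rows):
    """Gold: a business-level aggregate, e.g. revenue by region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
```

The value of the layering is that bronze stays replayable, silver carries the quality contract, and gold changes freely with business definitions.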
Analyze, validate, and maintain large datasets to ensure accuracy and completeness.
Perform data quality checks and resolve inconsistencies proactively.
Manage mass data activities, including updating and implementing product data in enterprise software solutions.
REPA is Europe's leading distributor of spare parts for foodservice and refrigeration equipment, coffee and vending machines, and a trusted partner to OEMs, delivering the right part at the right time. They have the world's largest inventory of in-stock original and universal spare parts.
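The quality-check duties above can be made concrete with a small rule-based validator. A sketch under assumed rules (duplicate SKUs and non-positive prices are flagged; the fields and thresholds are illustrative, not REPA's actual checks):

```python
# Illustrative product rows and a couple of declarative quality rules.
products = [
    {"sku": "A-1", "price": 19.99, "stock": 5},
    {"sku": "A-1", "price": -3.00, "stock": 2},   # duplicate SKU, bad price
    {"sku": "B-7", "price": 4.50, "stock": 0},
]

def run_checks(rows):
    """Return a list of human-readable quality violations."""
    issues = []
    seen = set()
    for i, r in enumerate(rows):
        if r["sku"] in seen:
            issues.append(f"row {i}: duplicate sku {r['sku']}")
        seen.add(r["sku"])
        if r["price"] <= 0:
            issues.append(f"row {i}: non-positive price")
    return issues

issues = run_checks(products)
```

Surfacing violations as data (rather than failing hard) lets checks run proactively on every load and feed a report or alert.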
Build and own Sardine’s internal data infrastructure integrating CRM, marketing, product, finance, and operational systems.
Design, improve, and own ETL/ELT pipelines to ensure clean, reliable, and scalable data flows across the organization.
Partner with data, engineering, revenue/business operations, and executive stakeholders to define and track KPIs.
Sardine is a leader in fraud prevention and AML compliance. Their platform uses device intelligence, behavior biometrics, machine learning, and AI to stop fraud before it happens. Sardine has over 300 banks, retailers, and fintechs as clients and a remote-first work culture.
Drive decision making in building cutting-edge products.
Transform large datasets from complex systems, generate actionable insights, and share the results with stakeholders at all levels of the company.
Partner with and lead projects across product management, engineering, client engagement, finance, and other enterprise-level teams.
Kraken is a mission-focused company rooted in crypto values. As a Krakenite, you’ll join us on our mission to accelerate the global adoption of crypto, so that everyone can achieve financial freedom and inclusion.
Design and maintain ETL pipelines that ingest, process, and load data into AWS Neptune.
Develop and evolve graph data models representing relationships across users, sessions, devices, and security events.
Integrate diverse data sources including S3, relational databases, streaming services, and APIs into a cohesive graph architecture.
Keeper Security is transforming cybersecurity for organizations around the world with next-generation privileged access management. Keeper’s zero-trust and zero-knowledge cybersecurity solutions are FedRAMP and StateRAMP Authorized, FIPS 140-2 validated, as well as SOC 2 and ISO 27001 certified.
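The graph model in the bullets above relates users, sessions, devices, and security events. Against AWS Neptune this would be queried with Gremlin or openCypher; the following stdlib sketch only illustrates the shape of such a model and a two-hop traversal, with invented identifiers:

```python
from collections import defaultdict

# Edges as (source, relationship, target) triples, the way a property graph holds them.
edges = [
    ("user:alice", "OPENED", "session:s1"),
    ("session:s1", "ON_DEVICE", "device:d9"),
    ("user:bob", "OPENED", "session:s2"),
    ("session:s2", "ON_DEVICE", "device:d9"),
]

adj = defaultdict(list)
for src, rel, dst in edges:
    adj[src].append((rel, dst))

def devices_for(user):
    """Two-hop traversal: user -> sessions -> devices."""
    found = set()
    for rel, session in adj[user]:
        if rel != "OPENED":
            continue
        for rel2, dev in adj[session]:
            if rel2 == "ON_DEVICE":
                found.add(dev)
    return found

# A shared device across two users is a classic security signal.
shared = devices_for("user:alice") & devices_for("user:bob")
```

This is exactly the kind of relationship query (shared device, linked sessions) that motivates putting security data in a graph store rather than joining relational tables per question.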
Build and evolve our semantic layer; design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content that is distributed across a portfolio of film, television, and streaming and brought to life through global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.
Design and implement scalable, performant data models.
Develop and optimize processes to improve the correctness of third-party data.
Implement data quality principles to raise the bar for reliability of data.
SmithRx is a venture-backed Health-Tech company disrupting the Pharmacy Benefit Management (PBM) sector with a next-generation drug acquisition platform. They have a mission-driven and collaborative culture that inspires employees to transform the U.S. healthcare system.
Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.
UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.
Own and deliver the technical execution of Proofs of Value (PoVs) to support the Chattermill sales cycle.
Prepare, ingest, and transform customer data for PoVs, including data acquisition, ETL processes, data modeling, and validation.
Build and maintain PoV outputs such as dashboards, analyses, and reports, aligned to agreed scope, success criteria, and customer use cases.
Chattermill's Customer Experience Intelligence platform continuously analyzes explicit and implicit feedback to enable clients to identify what they should do next, using best-in-class tech in a fast-developing AI space. They value trust, growth, and a team-oriented approach, striving for diversity and inclusion.
Design and develop data integration pipelines across cloud and legacy systems.
Lead and support data engineering implementation efforts.
Apply advanced analytics techniques to support business insights and decision-making.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.