Source Job: Staff Software Engineer, Data Platform
Strengthen the real estate MLS data platform squad. Build robust data pipelines and backend services. Own the end-to-end architecture for MLS and property data.
20 jobs similar to Staff Software Engineer, Data Platform, ranked by similarity.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
- Partner with our customer teams to develop engineering plans for implementations at our health system partners
- Build and support robust batch and streaming pipelines
- Evolve the maturity of our monitoring systems and processes to improve visibility and failure detection in our infrastructure
Paradigm is rebuilding the clinical research ecosystem by enabling equitable access to trials for all patients. Incubated by ARCH Venture Partners and backed by leading healthcare and life sciences investors, Paradigm’s seamless infrastructure, implemented at healthcare provider organizations, will bring potentially life-saving therapies to patients faster.
- Lead the design and implementation of scalable streaming data pipelines.
- Engineer and optimize real-time data solutions using frameworks such as Apache Kafka, Flink, and Spark Streaming.
- Drive adoption of best practices in data governance, observability, and performance tuning for streaming workloads.
PointClickCare is a leading health tech company that’s founder-led and privately held, empowering employees to push boundaries, innovate, and shape the future of healthcare.
- Architect and implement scalable Lakehouse solutions using Delta Tables and Delta Live Tables.
- Design and orchestrate complex data workflows using Databricks Workflows and Jobs.
- Develop production-grade Python and PySpark code, including custom Python libraries.
Coderio designs and delivers scalable digital solutions for global businesses with a strong technical foundation and a product mindset.
- Design, develop, and maintain scalable backend services and APIs, ensuring high performance, security, and reliability.
- Build and optimize data pipelines to ingest, store, and serve large-scale datasets, including video and ML training data.
- Own and evolve the foundational data model that powers the entire Frontera platform.
Frontera is reimagining how children with autism and other behavioral health needs get the care they deserve. They bring together clinicians, technologists, and autism specialists to build cutting-edge AI tools that help care teams, expanding access to high-quality services for families everywhere.
- Design and implement data cataloging infrastructure to make datasets discoverable and connected.
- Develop consistent data access patterns and libraries for engineers and data scientists.
- Collaborate with Data Engineers to ensure platform infrastructure complements data pipelines.
Jobgether helps people find jobs using an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly.
Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.
This position is posted by Jobgether on behalf of a partner company.
Design, build, and maintain a robust, self-service, scalable, and secure data platform. Create and maintain data pipelines, accounting for business logic, levels of aggregation, and data quality. Enable teams to access and use data effectively through self-service tools and well-modeled datasets.
We are Grupo QuintoAndar, the largest real estate ecosystem in Latin America, with a diversified portfolio of brands and solutions across different countries.
- Design, build, and maintain the pipelines that power all data use cases.
- Develop intuitive, performant, and scalable data models that support product features, internal analytics, experimentation, and machine learning workloads.
- Define and enforce standards for accuracy, completeness, lineage, and dependency management.
Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences.
- Architect and maintain robust data pipelines to transform diverse data inputs.
- Integrate data from various sources into a unified platform.
- Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Join the Streaming team to power Attentive's backbone of messaging and personalization. Help shape the future of the streaming architecture as it moves to modern platforms. Influence how quickly Attentive can experiment, personalize, and scale across millions of devices and billions of events.
Attentive is the AI marketing platform for 1:1 personalization, redefining the way brands and people connect.
Design, develop, and manage modern, scalable data solutions across federal and commercial environments. Build robust data pipelines, integrate data across multiple sources, and ensure high-quality, reliable data for analytics and operational use. Collaborate with cross-functional teams including Architects, Data Scientists, and DevOps engineers to deliver secure and efficient data solutions.
This position is posted by Jobgether on behalf of a partner company and uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly.
Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.
YipitData is the leading market research and analytics firm for the disruptive economy, and it most recently raised $475M from The Carlyle Group at a valuation of over $1B.
- Build and scale data services by designing, developing, and maintaining scalable backend systems and APIs.
- Collaborate on data architecture and models, partnering with engineering and analytics teams to optimize storage and processing workflows.
- Contribute to standards, quality, and governance by building reliable, observable data systems with strong testing and validation.
Zapier builds and uses automation every day to make work more efficient, creative, and human.
- Collaborate with engineering, data science, ML, data engineering, and product analytics teams to understand and shape the future needs of our data platform and infrastructure.
- Define, drive, and implement the future live ingestion layer of data into our data platform (e.g. Kafka, Kinesis).
- Define and evolve standards for storage, compute, data management, provenance, and orchestration.
Inspiren offers the most complete and connected ecosystem in senior living.
- Design, develop, and maintain scalable backend services and APIs, ensuring high performance, security, and reliability.
- Build and optimize data pipelines to ingest, store, and serve large-scale datasets, including video and ML training data.
- Balance rapid development of MVPs and POCs for user research with building robust, long-term backend foundations.
Frontera is reimagining how children with autism and other behavioral health needs get the care they deserve. They bring together clinicians, technologists, and autism specialists to build AI tools that help care teams work smarter and spend more time with the children and families who need them most.
- Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
- Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions. Design and oversee the data architecture and infrastructure, ensuring scalability, performance, and security. Create scalable and efficient data processing frameworks, including ETL processes and data pipelines.
Lingaro has been on the market since 2008, with 1,500+ talents currently on board across 7 global sites, and emphasizes career growth and skills development.
- Design robust solutions for major initiatives, define service boundaries, and improve system observability.
- Coordinate backend decisions with other teams, aligning API contracts and shared infrastructure.
- Collaborate with the Product Manager to understand user needs and evaluate technical solutions based on customer value.
TrustYou is a leading AI-driven hospitality platform dedicated to transforming guest experiences and empowering businesses to thrive.
- Design and maintain data models that organize rich content into canonical structures optimized for product features, search, and retrieval.
- Build high-reliability ETLs and streaming pipelines to process usage events, analytics data, behavioral signals, and application logs.
- Develop data services that expose unified content to the application, such as metadata access APIs, indexing workflows, and retrieval-ready representations.
Udio's success hinges on hiring great people and creating an environment where we can be happy, feel challenged, and do our best work.