We are looking for candidates who are passionate about technology, proficient in English, and excited to collaborate remotely with a worldwide team. Candidates must have 5+ years of professional experience as a Python developer, strong hands-on experience with Apache Airflow, experience working with Snowflake, and solid knowledge of SQL. The ability to design, build, and maintain data pipelines and integrations is required, as is the ability to work independently and troubleshoot effectively.
Remote Data Jobs · Airflow
41 results
The Data Engineer will play a key role in the design, construction, and evolution of the engineering, BI, and generative AI ecosystem. Their purpose will be to transform the company's financial records into actionable information, enabling high-impact strategic and operational decisions for the business. They will design and build data pipelines from multiple sources into the Data Lake, ensuring scalability, performance, and quality.
LaunchDarkly is seeking an experienced Data Analytics Engineer to build high-quality data models and data products that directly influence sales strategy and revenue reporting. You’ll join our Data Analytics and Engineering team in the Technology organization and work closely with embedded revenue analysts, Salesforce, software engineers, and the revenue operations team.
We are seeking a Data Engineer to help build and maintain a robust, trustworthy, and well-documented data platform. In this role, you will be responsible for developing and maintaining data pipelines, implementing monitoring and alerting systems, establishing data quality checks, and ensuring that data structures and documentation stay up to date and aligned with business needs.
Seeking a Data Architect / Principal Data Engineer to join our development center, working with a major US client in a fast-paced, high-impact environment. In this role, you will define, architect, and implement scalable data platforms and end-to-end ELT pipelines aligned with modern Lakehouse principles. You will work closely with cross-functional teams across the US, Colombia, and Brazil to ensure that our data ecosystem is reliable, future-proof, and aligned with enterprise architecture standards.
This position requires deep technical expertise, strong architectural thinking, and the ability to influence and mentor engineering teams. Fluent English communication is essential for collaborating with global stakeholders, presenting architectural recommendations, and ensuring alignment across distributed teams.
As a Data Systems Engineer at Northbeam, you will translate customer feedback into scalable data pipelines and products, creating, maintaining, and improving integrations and transformations in a complex network. The system is powered by data that spans numerous ad platforms and order management systems, requiring curiosity, experience, and a desire to build data pipelines and applications at scale.
The Data Engineering team is focused on the design, development, and support of 'all things data' at OppFi. This includes deploying the Postgres databases that support our applications, our Snowflake Data Warehouse, and multiple Airflow and Hevo ETL pipelines. You will work on PostgreSQL database administration and data engineering.
As a Data Engineer on the Growth org, you will collaborate with cross-functional partners in Data Science, Product, and Finance to build and maintain high-quality data sources and to build software that maximizes the value and efficacy of that data. The team operates independently of skill set: peers with software engineering backgrounds will execute data engineering work, and you will contribute to software systems that do not interact directly with our data systems.
Lead Docker's Data Engineering team and drive the strategic evolution of data analytics across the company. Requires deep technical expertise in modern data platforms, strong leadership skills, and the ability to translate business needs into robust data solutions that scale with Docker's growth.