The Data Engineering team is focused on the design, development, and support of 'all things data' at OppFi. This includes the deployment of the Postgres databases that support our applications, our Snowflake Data Warehouse, and multiple Airflow and Hevo ETL pipelines. You will work on PostgreSQL database administration and data engineering.
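As a rough illustration of the Airflow-to-Snowflake pattern this stack implies, the sketch below copies one logical day of rows from an application Postgres database into a Snowflake staging table. It assumes a recent Airflow 2.x install with the Postgres and Snowflake provider packages; the connection IDs, table, and column names are hypothetical, not OppFi's actual schema.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook


def copy_daily_rows(**context):
    """Copy one logical day of rows from the app database into a Snowflake staging table."""
    src = PostgresHook(postgres_conn_id="app_postgres")  # hypothetical connection ID
    rows = src.get_records(
        # hypothetical table and columns, for illustration only
        "SELECT id, amount, status, updated_at FROM payments "
        "WHERE updated_at::date = %s",
        parameters=[context["ds"]],  # Airflow's logical date (YYYY-MM-DD)
    )
    if rows:
        dest = SnowflakeHook(snowflake_conn_id="snowflake_dw")  # hypothetical connection ID
        dest.insert_rows(
            table="RAW.PAYMENTS_DAILY",
            rows=rows,
            target_fields=["ID", "AMOUNT", "STATUS", "UPDATED_AT"],
        )


with DAG(
    dag_id="postgres_to_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="copy_daily_rows", python_callable=copy_daily_rows)
```

In production a load like this would more likely go through object storage and COPY INTO, or a managed tool such as Hevo, but the shape of the DAG stays the same.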
Remote Data Jobs · Snowflake
Join a dynamic financial institution in a role where your core aim is to unlock business insights from large datasets, empowering decision-makers across trading, investment, and research teams with robust analytics and data-driven projects. You will play a pivotal role in improving operational efficiency, developing scalable reporting frameworks, and enhancing performance tracking through advanced financial modeling and dashboard automation.
Lead Docker's Data Engineering team and drive the strategic evolution of data analytics across the company. Requires deep technical expertise in modern data platforms, strong leadership skills, and the ability to translate business needs into robust data solutions that scale with Docker's growth.
Build and operate a cutting-edge, near-real-time data platform that synchronizes data across enterprise systems. You'll be hands-on with modern streaming technologies, solving complex data engineering challenges that enable real-time decision-making, eliminate data silos, and modernize how the organization leverages data.
Join our global technology firm in a challenging role focused on driving automation, improving productivity, and leveraging data within core business processes. This hands-on role combines technical expertise in data analysis and automation tooling with a strong focus on generating business intelligence and optimizing organizational processes.
Support the development of data pipelines, ensuring quality and well-organized deliverables under the guidance of more experienced team members. Collaborate on the maintenance and evolution of existing solutions, contributing to continuous process improvement. Participate actively in agile ceremonies, communicating progress and questions clearly. Demonstrate technical curiosity and a willingness to learn new tools and data engineering best practices. Help document developed solutions, ensuring traceability and alignment with team standards.
Design, implement, and maintain scalable data pipelines using Snowflake, Coalesce.io, Airbyte, and SQL Server/SSIS, with some use of Azure Data Factory. Build and maintain dimensional data models to ensure high-quality, structured data for analytics and reporting. Implement a Medallion architecture in Snowflake, managing the bronze, silver, and gold layers. Collaborate with teams using Jira for task tracking and GitHub for code repository management.
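For context on the bronze/silver/gold layering mentioned here, a minimal sketch follows, expressed as Snowflake SQL run through snowflake-connector-python. The database, schema, table, and column names are hypothetical, and in practice these transformations would be built in Coalesce.io rather than hand-written scripts.

```python
import os

import snowflake.connector

# Each statement promotes data one layer; names are illustrative only.
LAYER_STATEMENTS = [
    # Bronze: land the raw extract as-is, preserving source fidelity.
    """CREATE TABLE IF NOT EXISTS BRONZE.ORDERS_RAW AS
       SELECT * FROM STAGING.ORDERS_EXTRACT""",
    # Silver: cast types, standardize values, and deduplicate on the business key.
    """CREATE OR REPLACE TABLE SILVER.ORDERS AS
       SELECT ORDER_ID,
              TRY_TO_TIMESTAMP(ORDER_TS)   AS ORDER_TS,
              TRY_TO_NUMBER(AMOUNT, 12, 2) AS AMOUNT,
              UPPER(TRIM(STATUS))          AS STATUS
       FROM BRONZE.ORDERS_RAW
       QUALIFY ROW_NUMBER() OVER (PARTITION BY ORDER_ID ORDER BY ORDER_TS DESC) = 1""",
    # Gold: aggregate into a reporting-ready table.
    """CREATE OR REPLACE TABLE GOLD.DAILY_ORDER_SUMMARY AS
       SELECT DATE_TRUNC('day', ORDER_TS) AS ORDER_DATE,
              STATUS,
              COUNT(*)    AS ORDER_COUNT,
              SUM(AMOUNT) AS TOTAL_AMOUNT
       FROM SILVER.ORDERS
       GROUP BY 1, 2""",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()
    for statement in LAYER_STATEMENTS:
        cur.execute(statement)
finally:
    conn.close()
```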
Lead the migration to a modern data platform built on Snowflake, dbt, and Prefect. As the technical architect, you will guide this transformation, define data movement, redesign workflows, and ensure business continuity throughout the transition. This is an architecture and leadership role that requires hands-on work when needed.
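As a rough sketch of the Prefect-plus-dbt orchestration this migration implies, the flow below shells out to the dbt CLI and lets Prefect handle retries and logging. It assumes Prefect 2.x and a dbt project already configured against Snowflake; the project directory and flow name are hypothetical.

```python
import subprocess

from prefect import flow, task


@task(retries=2, retry_delay_seconds=60)
def run_dbt(command: str) -> None:
    """Invoke the dbt CLI; Prefect handles retries and captures the run history."""
    subprocess.run(
        ["dbt", *command.split(), "--project-dir", "analytics_dbt"],
        check=True,
    )


@flow(name="snowflake-dbt-nightly")
def nightly_build() -> None:
    run_dbt("seed")  # load static reference data
    run_dbt("run")   # build models in Snowflake
    run_dbt("test")  # enforce data-quality tests before downstream use


if __name__ == "__main__":
    nightly_build()
```

In a real deployment the `__main__` guard would give way to a scheduled Prefect deployment, but the flow body stays the same.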
You'll play a key role in designing, building, and maintaining the systems that power our analytics, personalization, and financial insights. You'll collaborate closely with product and engineering to ensure that high-quality, reliable data drives every decision we make and every experience we deliver.
Design and architect end-to-end data solutions leveraging Snowflake as the primary cloud data platform. Implement and maintain a Medallion architecture in Snowflake, ensuring a robust dimensional data model for the gold layer. Define and oversee data transformation processes using Coalesce.io, and manage data extraction and loading with Airbyte. Integrate and modernize legacy data systems built on SQL Server/SSIS, with limited use of Azure Data Factory where required.
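To make the "dimensional data model for the gold layer" concrete, here is a minimal star-schema sketch built on hypothetical silver-layer tables; the names are illustrative only, and in this role the models would be defined and orchestrated through Coalesce.io rather than ad-hoc scripts.

```python
import os

import snowflake.connector

# Star-schema gold layer: one conformed dimension and one fact keyed to it.
GOLD_STATEMENTS = [
    # Customer dimension, conformed from the silver layer.
    """CREATE OR REPLACE TABLE GOLD.DIM_CUSTOMER AS
       SELECT CUSTOMER_ID         AS CUSTOMER_KEY,
              CUSTOMER_NAME,
              REGION,
              CURRENT_TIMESTAMP() AS LOADED_AT
       FROM SILVER.CUSTOMERS""",
    # Order fact, joinable to the dimension on CUSTOMER_KEY.
    """CREATE OR REPLACE TABLE GOLD.FCT_ORDERS AS
       SELECT ORDER_ID,
              CUSTOMER_ID                 AS CUSTOMER_KEY,
              DATE_TRUNC('day', ORDER_TS) AS ORDER_DATE,
              AMOUNT
       FROM SILVER.ORDERS""",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()
    for statement in GOLD_STATEMENTS:
        cur.execute(statement)
    # Example of the kind of reporting query the gold layer is meant to serve.
    cur.execute(
        """SELECT D.REGION, SUM(F.AMOUNT) AS TOTAL_AMOUNT
           FROM GOLD.FCT_ORDERS F
           JOIN GOLD.DIM_CUSTOMER D ON F.CUSTOMER_KEY = D.CUSTOMER_KEY
           GROUP BY D.REGION"""
    )
    print(cur.fetchall())
finally:
    conn.close()
```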