Remote Data Jobs · SQL

Job listings

You will design and maintain high-throughput, low-latency backend systems that process and serve massive amounts of data, and collaborate across teams to deliver scalable, reliable, high-performance solutions used by thousands of enterprise customers worldwide. You will solve complex technical challenges, from optimizing concurrency and synchronization to debugging distributed behavior and improving system reliability.

We are looking for a Data Engineering Intern eager to gain hands-on experience in database management, data pipeline development, and ETL processes. This internship provides an excellent opportunity to work alongside experienced data engineers, building foundational skills in data integration and data quality management. Ideal candidates should have some experience in scripting (Python preferred) and a curiosity about creating effective data solutions to support business operations.

This is a 12-week internship within the Data Science team at Ever.ag, assisting in the continued development and deployment of leading data science applications in the food production supply chain space. Potential projects include cleaning and analyzing large, complex data sets for commodity food processors, translating customer problems into application development and data science requirements, assessing cutting-edge machine learning techniques, and developing data science forecasting techniques.

As a Senior Data Product Manager, you will own the vision, roadmap, and delivery of key data initiatives that enable the entire organization to make better decisions, faster. Your scope will include data infrastructure, internal analytics tools, and data compliance topics. You will define and drive the data platform and pipeline roadmap, ensuring scalability, reliability, and accessibility of data across products and teams.

Design, maintain, and scale streaming ETL pipelines for blockchain data. Build and optimize ClickHouse data models and materialized views for high-performance analytics. Develop and maintain data exporters using orchestration tools. Implement data transformations and decoding logic. Establish and improve testing, monitoring, automation, and migration processes for pipelines. Ensure timely delivery of new data features in alignment with product goals.

Location: India · Unlimited PTO

Combine and transform diverse data sources using SQL. Produce report-ready data tables by integrating various data sources, complex logic, and business rules. Develop a deep understanding of Acquia’s internal systems with a focus on data structure and the corresponding processes that impact the data. Collaborate with stakeholders globally to develop automated reporting solutions. Drive data initiatives by refining processes and data structures to enable scalable reporting solutions.
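As a rough sketch of the kind of SQL integration work this role describes, the snippet below joins two hypothetical source tables and applies a simple business rule (a regional revenue rollup) to produce a report-ready result set. The table and column names are invented for illustration and do not reflect Acquia's actual schema.

```python
import sqlite3

# In-memory database standing in for the real source systems.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two hypothetical source tables: subscription revenue and account metadata.
cur.executescript("""
CREATE TABLE subscriptions (account_id INTEGER, product TEXT, mrr REAL);
CREATE TABLE accounts (account_id INTEGER, region TEXT);

INSERT INTO subscriptions VALUES (1, 'CMS', 100.0), (2, 'CMS', 250.0), (3, 'DAM', 80.0);
INSERT INTO accounts VALUES (1, 'EMEA'), (2, 'AMER'), (3, 'AMER');
""")

# Combine the sources and apply a business rule: roll up monthly
# recurring revenue by region and product for reporting.
cur.execute("""
SELECT a.region, s.product, SUM(s.mrr) AS total_mrr
FROM subscriptions AS s
JOIN accounts AS a USING (account_id)
GROUP BY a.region, s.product
ORDER BY a.region, s.product
""")
report = cur.fetchall()
print(report)
# [('AMER', 'CMS', 250.0), ('AMER', 'DAM', 80.0), ('EMEA', 'CMS', 100.0)]
```

In practice the same join-and-aggregate pattern would run inside the warehouse or BI tool rather than an application database, but the shape of the work (integrate sources, encode the rule in SQL, emit a report-ready table) is the same.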

Location: US · Unlimited PTO

Wavicle Data Solutions is hiring a Databricks Solution Architect who will lead the design and implementation of scalable, optimized solutions that leverage the latest Databricks features. This individual will work closely with customers to understand their needs and business drivers, and help them adopt and optimize Databricks for their analytics, data science, and AI/ML workloads.

This role focuses on supporting external customers: analyzing current data and reporting needs, developing technical solutions using Power BI, Power Automate, Azure Synapse, and Databricks, implementing and testing technical reporting solutions, and supporting end users with reports. Candidates can work in a hybrid role based in Arlington, VA, or fully remote.