The Senior Data Analyst will help shape the future of employer services, with a focus on HR compliance and outsourcing. You will help develop and implement data-driven solutions that inform strategic decision-making and drive business growth. We are looking for an experienced, motivated Senior Data Analyst with technical expertise, business insight, and a proactive approach to solving problems.
Remote Data Jobs · Data Modeling
We are looking for a Data Engineer who is passionate about technology and experienced in building and maintaining robust, scalable data pipelines. You will be the team's technical point of reference for Oracle and PowerCenter, ensuring quality, governance, and end-to-end performance across the team's pipelines and dimensional model. Expertise in Databricks and PySpark is a strong plus.
Design, implement, and maintain scalable data pipelines using Snowflake, Coalesce.io, Airbyte, and SQL Server/SSIS, with some use of Azure Data Factory. Build and maintain dimensional data models to ensure high-quality, structured data for analytics and reporting. Implement Medallion architecture in Snowflake, managing bronze, silver, and gold layers. Collaborate with teams using Jira for task tracking and GitHub for code repository management.
Lead the redesign of a data platform to support high-volume analytics and faster BI. Define the target architecture, resolve ingestion challenges, implement modern data pipelines, and optimize database performance for advanced analytics use cases.
You'll be joining a strategic data platform integration project to consolidate data from multiple acquired companies into one unified Azure-based data warehouse. You will integrate their disparate data systems (NetSuite, SAP, Dynamics NAV, Business Central, Oracle), build a modern Azure data platform that scales globally, and enable executive decision-making with unified financial and operational reporting.
The Senior Data Engineer is a critical technical leader in our modern data ecosystem, responsible for designing and delivering high-scale data solutions, mentoring junior engineers, and shaping the architectural backbone of our analytics infrastructure. This role balances execution with engineering excellence to support fast, reliable, and trustworthy data delivery, including designing integration architectures, developing reusable libraries, and implementing monitoring and testing systems.
This role blends deep technical database expertise with architectural leadership, driving performance, scalability, and data quality across the organization's data ecosystem. Sitting at the intersection of engineering and business operations, the role is responsible for ensuring our data pipelines and warehouses are fast, reliable, and scalable. You will architect and optimize data warehouse platforms and their underlying database systems.
As a Data Strategist, you will play a vital role in supporting the development and implementation of ATPCO’s commercial strategy. The ideal candidate leverages analytical thinking, technical skills, and business acumen to uncover key insights and recommendations that drive corporate objectives. In this role, you will improve data quality, enhance reporting, and analyze data to generate strategic insights that influence financial performance and business growth.
The Analytics Engineer II is responsible for building, maintaining, and optimizing the data pipelines, models, and analytics frameworks that enable data-driven insights and strategic decision-making. This role will focus on integrating and analyzing key operational and customer experience data from Salesforce (SF), survey platforms, communication tools, and Learning Management Systems (LMS). The Analytics Engineer II will work across medium-complexity data sets that support individual product lines.
As an Analytics Engineer at Airtable, you will play a crucial role in shaping the future of Airtable by deepening our understanding of our product and partnering with Sales, Sales Ops, and engineers. Analytics Engineers build the data foundation for reporting, analysis, and experimentation. Using SQL, dbt, and Looker, you will transform data warehouse tables into critical self-serve data artifacts that power impactful analytic use cases and empower data consumers.