Remote Data Jobs · GCP

Job listings

$193,600–$242,000/yr

You’ll lead a high-caliber team building and operating large-scale, production AI systems that power critical business functions, acting as a player-coach who pairs deep technical execution with ownership of strategy, architecture, and standards. You’ll work directly with the Chief Data Officer and senior leadership at one of the most advanced organizations applying AI to consumer lending.

$95,000–$115,000/yr

Adswerve is seeking a Data Scientist to join the Technical Services team to solve complex data-related challenges for some of the biggest brands in the world. The work involves solving client problems with data engineering and data science solutions, applying marketing segmentation and propensity modeling to surface insights from marketing data warehouses.

$200,000–$220,000/yr

Lead the architecture, design, and implementation of scalable Data Lakehouse solutions on Google Cloud Platform (GCP) using BigQuery, GCS, BigLake, and Dataplex. Collaborate with customers to understand business goals, data challenges, and technical requirements. Design and implement data pipelines, establish best practices for data cataloging, and define data security and governance models. Serve as the technical lead, mentoring engineers and ensuring architectural consistency.

Design, build, and maintain reliable data pipelines to consolidate information from various internal systems and third-party sources. Develop and manage a comprehensive semantic layer. Implement and enforce data quality checks, validation rules, and governance processes to ensure data accuracy. Ensure AI agents have access to the necessary structured and unstructured data. Create clear documentation for data models and pipelines.
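For context on the kind of data-quality checks and validation rules described above, here is a minimal sketch in plain Python. The field names and rules are hypothetical stand-ins, not taken from the listing:

```python
# Hypothetical validation rules illustrating the data-quality checks a
# pipeline might enforce before records reach downstream consumers.
RULES = {
    "user_id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record: dict) -> list[str]:
    """Return a list of human-readable quality violations for one record."""
    errors = []
    for field_name, rule in RULES.items():
        if field_name not in record:
            errors.append(f"missing field: {field_name}")
        elif not rule(record[field_name]):
            errors.append(f"invalid value for {field_name}: {record[field_name]!r}")
    return errors

def partition(records: list[dict]) -> tuple[list, list]:
    """Split records into clean rows and quarantined rows with their errors."""
    clean, quarantined = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            quarantined.append((rec, errs))
        else:
            clean.append(rec)
    return clean, quarantined
```

Quarantining bad rows with their violation messages, rather than silently dropping them, is one common way to keep governance processes auditable.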

Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements, automating manual processes and optimizing data delivery. Use the infrastructure/services required for optimal extraction, transformation, and loading of data from a wide variety of data sources using GCP services. Work with stakeholders to assist with data-related technical issues and support their data requirement needs.

The Data Engineer / Integration Engineer will design, develop, and maintain scalable data pipelines, integrate various systems, and ensure data quality and consistency across platforms. The role involves ETL/ELT processes using Python and workflow automation tools. Implementing and managing data integration between various systems, including APIs and Oracle EBS, is critical.
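To illustrate the ETL/ELT pattern this role centers on, here is a minimal extract-transform-load sketch in plain Python. The field names, sample rows, and the in-memory "warehouse" are stand-ins for real source systems (APIs, Oracle EBS) and targets:

```python
# Minimal ETL sketch (illustrative only; sources and targets are stand-ins).

def extract() -> list[dict]:
    # Stand-in for pulling raw rows from an API or an Oracle EBS export.
    return [
        {"order_id": "  1001 ", "status": "SHIPPED", "total": "19.99"},
        {"order_id": "1002", "status": "pending ", "total": "5.00"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Normalize types and casing so downstream systems see consistent data.
    return [
        {
            "order_id": int(r["order_id"].strip()),
            "status": r["status"].strip().lower(),
            "total_cents": round(float(r["total"]) * 100),
        }
        for r in rows
    ]

def load(rows: list[dict], warehouse: list[dict]) -> None:
    # Stand-in for an idempotent write into a warehouse table: skip rows
    # whose key already exists so reruns do not create duplicates.
    existing = {r["order_id"] for r in warehouse}
    warehouse.extend(r for r in rows if r["order_id"] not in existing)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
```

Making the load step idempotent, as sketched here, is what lets workflow automation tools safely retry a failed pipeline run.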

$135,000–$175,000/yr

Support a transformative state government AI initiative focused on leveraging data to drive better decision-making, transparency, and service delivery as a Sr. Data Engineer. This role will be part of a multidisciplinary team building scalable data infrastructure, enabling machine learning applications, and ensuring data quality and governance across state systems. The ideal candidate will bring deep expertise in data engineering, modern cloud architectures, and public-sector data practices.

Design and evolve scalable, cloud-native data architectures for advanced analytics and AI. Develop and maintain real-time and batch data processing platforms. Define and implement data modeling standards for structured/unstructured data. Integrate innovative technologies (vector databases, LLMs, real-time streaming). Ensure data quality, lineage, and governance. Collaborate with engineering/product teams to translate business needs into solutions. Optimize platform scalability, cost, and performance in cloud environments. Establish architectural standards for data-driven decision-making.