Develop analytical data products using Airflow, DataProc, PySpark, and BigQuery on the Google Cloud Platform, with solid data warehouse principles. Build data pipelines to monitor data quality and analytical model performance. Maintain the data platform infrastructure using Terraform and develop, evaluate, and deliver code through CI/CD.
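The quality-monitoring pipelines described above could start from per-partition checks like these — a minimal pure-Python sketch (the thresholds and the `check_partition` helper are hypothetical; in practice such checks would run inside an Airflow task against BigQuery table metadata):

```python
from datetime import datetime, timedelta

# Hypothetical thresholds -- real values would come from pipeline config.
MAX_STALENESS = timedelta(hours=24)
MIN_ROW_COUNT = 1000

def check_partition(last_loaded: datetime, row_count: int,
                    now: datetime) -> dict:
    """Return pass/fail results for freshness and completeness checks
    on a single table partition."""
    return {
        "fresh": now - last_loaded <= MAX_STALENESS,
        "complete": row_count >= MIN_ROW_COUNT,
    }

# A partition loaded two hours ago with 5,000 rows passes both checks.
now = datetime(2024, 1, 2, 12, 0)
result = check_partition(datetime(2024, 1, 2, 10, 0), 5000, now)
```

A scheduler such as Airflow would run this check after each load and alert on any failing key.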
Source Job
20 jobs similar to Mid-Level Data Engineer (GCP Cloud)
Jobs ranked by similarity.
- Design, develop, and maintain batch and streaming data pipelines for ingesting, transforming, and serving data for analytics and application consumption.
- Build and evolve analytical modeling (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reuse.
- Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, monitoring of freshness/completeness/accuracy) and help resolve incidents with root-cause analysis (RCA).
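The bronze/silver/gold layering mentioned above can be sketched in miniature — a pure-Python illustration of the idea (the sample rows and transform functions are invented for illustration; a real implementation would use Spark or SQL over tables):

```python
# Bronze: raw events as ingested -- untyped strings, duplicates possible.
bronze = [
    {"user_id": "1", "amount": "10.5", "event": "purchase"},
    {"user_id": "1", "amount": "10.5", "event": "purchase"},  # duplicate
    {"user_id": "2", "amount": "bad", "event": "purchase"},   # invalid amount
]

def to_silver(rows):
    """Deduplicate and type-cast; drop rows that fail validation."""
    seen, silver = set(), []
    for r in rows:
        key = tuple(sorted(r.items()))
        if key in seen:
            continue
        seen.add(key)
        try:
            silver.append({"user_id": int(r["user_id"]),
                           "amount": float(r["amount"]),
                           "event": r["event"]})
        except ValueError:
            continue  # in practice, route bad rows to a quarantine table

    return silver

def to_gold(rows):
    """Aggregate to an analytics-ready mart: revenue per user."""
    gold = {}
    for r in rows:
        gold[r["user_id"]] = gold.get(r["user_id"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
```

Each layer adds guarantees: silver is deduplicated and typed, gold is aggregated for direct consumption by dashboards or marts.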
CI&T specializes in technological transformation, combining human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters around the world, they have partnered with more than 1,000 clients during their 30 years of history.
Design, build, and maintain a robust, self-service, scalable, and secure data platform. Create and edit data pipelines, considering business logic, levels of aggregation, and data quality. Enable teams to access and use data effectively through self-service tools and well-modeled datasets.
We are Grupo QuintoAndar, the largest real estate ecosystem in Latin America, with a diversified portfolio of brands and solutions across different countries.
- Design, build, and execute data pipelines.
- Build a configurable ETL framework.
- Optimize SQL queries to maximize system performance.
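A configurable ETL framework typically means pipelines are driven by declarative config rather than hard-coded steps. A minimal pure-Python sketch of the pattern (the step names, registry, and sample config are all hypothetical):

```python
# Transform functions keyed by a step registry; config selects and orders them.
def uppercase(rows, field):
    """Uppercase a string field in every row."""
    return [{**r, field: r[field].upper()} for r in rows]

def drop_nulls(rows, field):
    """Drop rows where the field is missing or None."""
    return [r for r in rows if r.get(field) is not None]

STEPS = {"uppercase": uppercase, "drop_nulls": drop_nulls}

def run_pipeline(rows, config):
    """Apply each configured step to the rows, in order."""
    for step in config:
        rows = STEPS[step["op"]](rows, step["field"])
    return rows

config = [
    {"op": "drop_nulls", "field": "name"},
    {"op": "uppercase", "field": "name"},
]
out = run_pipeline([{"name": "ana"}, {"name": None}], config)
```

New transforms are added by registering a function, so pipelines can be changed by editing config instead of code.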
RefinedScience is dedicated to delivering high-quality emerging tech solutions. Although the job description does not include company size or culture details, the role appears to value innovation and collaboration.
- Design, develop, and maintain scalable and robust data pipelines.
- Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF).
- Ensure the quality, integrity, and usability of data throughout the entire pipeline.
CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history, with a focus on Artificial Intelligence.
Support the management and maintenance of Google Cloud Platform (GCP) infrastructure. Assist in ensuring the reliability, scalability, and performance of cloud services. Contribute to the design and maintenance of GitLab CI/CD pipelines.
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation.
- Design and develop the entire GCP data ecosystem, including Cloud Storage, Cloud Functions, and BigQuery.
- Guide and mentor the data engineering team, providing technical direction and ensuring adherence to best practices.
- Design and optimize data storage architectures, including data lakes and data warehouses.
On the market since 2008, the company currently has 1,500+ talents on board across 7 global sites.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
- Build and optimize Sauce's lakehouse architecture using Azure Databricks and Unity Catalog for data governance.
- Create and maintain data quality tests and improve existing alerting setups.
- Own the data warehouse by connecting data sources and maintaining the platform and architecture in coordination with R&D infrastructure and operations teams.
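Data quality tests paired with alerting, as described above, often reduce to metric checks against thresholds. A minimal pure-Python sketch (the null-rate metric, threshold, and alert sink are illustrative assumptions; production setups would feed a tool like PagerDuty or Slack):

```python
def null_rate(rows, field):
    """Fraction of rows where `field` is missing or None."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(field) is None) / len(rows)

alerts = []  # stand-in for a real alerting channel

def check_and_alert(rows, field, threshold, notify=alerts.append):
    """Fail the check and fire an alert when the null rate exceeds
    the configured threshold."""
    rate = null_rate(rows, field)
    ok = rate <= threshold
    if not ok:
        notify(f"null rate {rate:.0%} for '{field}' exceeds {threshold:.0%}")
    return ok

rows = [{"order_id": 1}, {"order_id": None}, {"order_id": 3}, {"order_id": 4}]
ok = check_and_alert(rows, "order_id", threshold=0.10)
```

Running the same checks on every load, with thresholds in config, is what turns ad hoc validation into a maintainable alerting setup.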
Sauce is a premier restaurant technology platform that helps businesses grow with our Commission-Free Delivery & Pickup structure and proprietary delivery optimization technology.
- Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
- Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
- Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.
Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global, fully remote company with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.
- Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently
- Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability
- Implement and maintain data quality measures throughout the data lifecycle
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.
- Design, build, and maintain cloud-native data infrastructure using Terraform for IaC.
- Develop and optimize data pipelines leveraging AWS services and Snowflake.
- Build and maintain LLM frameworks, ensuring high-quality and cost-effective outputs.
ClickUp is building the first truly converged AI workspace, unifying tasks, docs, chat, calendar, and enterprise search, all supercharged by context-driven AI.
- Build and maintain tools and robust infrastructure for complex distributed data engineering systems.
- Enhance the software development lifecycle with robust code review and integration with existing tools and cloud technologies.
- Design, develop, and maintain advanced CI/CD pipelines and automation tools.
Cloudera empowers people to transform complex data into clear and actionable insights and is the preferred data partner for top companies.
- Design, build, and oversee the deployment of technology for managing structured and unstructured data.
- Develop tools leveraging AI, ML, and big data to cleanse, organize, and transform data.
- Design and maintain CI/CD pipelines using GitHub Actions to automate deployment, testing, and monitoring.
NBCUniversal is one of the world's leading media and entertainment companies creating world-class content across film, television, streaming, theme parks, and more.
- Designing, building, and maintaining infrastructure that enables fast, reliable, and secure product delivery.
- Improving and maintaining CI/CD pipelines to streamline deployments and increase reliability.
- Contributing to infrastructure reliability and ensuring systems are designed for resilience and growth.
Incident.io is the leading AI incident response platform, built to help teams dramatically reduce incident response time and improve reliability. They have raised $100M from Index Ventures, Insight Partners, and Point Nine, alongside founders and executives from world-class technology companies.
Design and architect scalable data pipelines and infrastructure on GCP. Provide technical guidance to data engineering teams. Partner with Sales and Engineering to translate client requirements into data engineering solutions.
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology.
As a key member of our Data Engineering team, you will:
- Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling that support business initiatives.
- Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
- Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
- Design, develop, and maintain Python-based services and data integrations supporting IAM and access management platforms.
- Deploy, optimize, and manage cloud infrastructure using Infrastructure as Code (Terraform / Terraform Enterprise).
- Collaborate with application and product teams to onboard internally hosted, SaaS, and homegrown applications into centralized data and access frameworks.
Blend is an AI services provider that co-creates meaningful impact for its clients through data science, AI, technology, and people. They are dedicated to unlocking value and fostering innovation by harnessing world-class people and data-driven strategy.
- Design and implement the next generation of our Continuous Integration and Continuous Delivery (CI/CD) pipelines, focusing on security, speed, and reliability.
- Maintain and optimize the health of our monorepo, ensuring scalable dependency management and fast incremental builds.
- Work with GCP to architect secure, scalable runtime environments.
Anchorage Digital is building the world’s most advanced digital asset platform for institutions to participate in crypto. As a diverse team of more than 600 members, they are united in one common goal: building the future of finance by providing the foundation upon which value moves safely in the new global economy.
- Partner with clients and implementation teams to understand data distribution requirements.
- Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
- Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.