Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently
Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability
Implement and maintain data quality measures throughout the data lifecycle
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices.
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Design, develop, and maintain scalable and robust data pipelines.
Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF); see the sketch after this list.
Ensure the quality, integrity, and usability of data throughout the entire pipeline.
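For flavor, a minimal PySpark sketch of the ingestion-and-transformation work above; the storage path, table, and column names are hypothetical, not taken from the listing.

```python
# Minimal PySpark ingestion/transformation sketch for a Databricks-style
# pipeline. Paths, tables, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

# Ingest raw JSON landed by an upstream tool (e.g., an ADF copy activity).
raw = spark.read.json("abfss://landing@account.dfs.core.windows.net/orders/")

# Basic cleansing and typing before the data is modeled downstream.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)
)

# Persist as a Delta table for analytical consumption.
clean.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```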
CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history, with a focus on Artificial Intelligence.
Lead support of the client's Azure data platform and Power BI environment.
Consult, develop, and advise on solutions in Microsoft Azure and Power BI.
Proactively mentor junior team members and actively give feedback.
3Cloud is a company where people roll up their sleeves to take on tough problems together. They hire people who aren’t afraid to experiment or fail and who are willing to give direct and candid feedback, so they can deliver amazing experiences and solutions to their clients.
Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark (see the sketch after this list).
Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
Collaborate with cross-functional teams to deliver impactful data solutions.
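A minimal sketch of the ELT variant on Databricks, where the transformation runs in SQL over already-loaded tables; the table names are invented for this example.

```python
# Hedged sketch of an ELT step on Databricks: land the raw data first,
# then transform it in SQL. Table names are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt_daily_revenue").getOrCreate()

# The aggregation runs over the already-loaded raw table -- the
# "transform after load" that distinguishes ELT from classic ETL.
daily_revenue = spark.sql("""
    SELECT order_date,
           SUM(amount)                 AS revenue,
           COUNT(DISTINCT customer_id) AS buyers
    FROM raw.orders
    GROUP BY order_date
""")

daily_revenue.write.format("delta").mode("overwrite") \
    .saveAsTable("analytics.daily_revenue")
```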
Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.
Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs.
Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks), as sketched after this list.
Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.
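One concrete Databricks storage-optimization pattern, offered as a hedged example with placeholder table and column names: compact small files and cluster on a common filter column.

```python
# Delta/Databricks maintenance sketch: compact small files, cluster by a
# frequently filtered column, and clean up unreferenced files.
# "clinical.visits" and "patient_id" are assumptions for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows sharing a patient_id,
# which speeds up selective reads.
spark.sql("OPTIMIZE clinical.visits ZORDER BY (patient_id)")

# Remove data files no longer referenced by the table
# (the default retention window applies).
spark.sql("VACUUM clinical.visits")
```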
Care Access is dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. They are working to make the future of health better for all and have hundreds of research locations, mobile clinics, and clinicians across the globe.
Optimize SQL queries to maximize system performance.
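A small illustration of the kind of rewrite this can involve, with an invented schema: a correlated subquery that re-scans the table per row becomes a single windowed pass.

```python
# Classic SQL tuning pattern, shown as query strings. Schema is hypothetical.

# Before: the correlated subquery re-scans orders for every candidate row.
slow_query = """
SELECT o.customer_id, o.amount
FROM orders o
WHERE o.amount = (SELECT MAX(o2.amount)
                  FROM orders o2
                  WHERE o2.customer_id = o.customer_id)
"""

# After: one pass with a window function, then a filter on the result.
fast_query = """
SELECT customer_id, amount
FROM (
    SELECT customer_id, amount,
           MAX(amount) OVER (PARTITION BY customer_id) AS max_amount
    FROM orders
) t
WHERE amount = max_amount
"""
```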
RefinedScience is dedicated to delivering high-quality emerging tech solutions; the role appears to value innovation and collaboration.
Build, manage, and operationalize data pipelines for marketing use cases.
Develop a comprehensive understanding of customer and marketing data requirements.
Transform large data sets into targeted customer audiences for personalized experiences.
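As a hedged sketch of that audience-building step (every table name and threshold here is invented), a PySpark job might look like:

```python
# Illustrative PySpark audience build: turn raw event data into a
# targetable customer segment. All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("audience_build").getOrCreate()

events = spark.read.table("marketing.web_events")

# Customers with 3+ purchases in the last 90 days become a
# "frequent buyer" audience for personalized campaigns.
audience = (
    events.filter(F.col("event_type") == "purchase")
          .filter(F.col("event_date") >= F.date_sub(F.current_date(), 90))
          .groupBy("customer_id")
          .agg(F.count("*").alias("purchases"))
          .filter(F.col("purchases") >= 3)
)

audience.write.mode("overwrite").saveAsTable("marketing.frequent_buyers")
```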
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against a role's core requirements. The system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights.
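As a hedged illustration of such an API (the listing names no framework, so FastAPI, the route, and the key-check scheme are all assumptions):

```python
# Minimal sketch of a secured read API over consolidated data, using
# FastAPI -- an assumption; the listing does not name a framework.
from fastapi import FastAPI, Header, HTTPException

app = FastAPI()
API_KEYS = {"example-key"}  # in practice, load from a secret store

@app.get("/insights/{customer_id}")
def get_insights(customer_id: str, x_api_key: str = Header(...)):
    # Reject callers that don't present a known key.
    if x_api_key not in API_KEYS:
        raise HTTPException(status_code=401, detail="invalid API key")
    # Placeholder response; a real service would query the unified platform.
    return {"customer_id": customer_id, "insights": []}
```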
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Design, develop, test, and maintain scalable applications using modern frameworks.
Actively participate in Agile/Scrum ceremonies, contributing to planning, estimation, and continuous improvement.
Contribute to architectural design discussions, test planning, and operational excellence initiatives.
Tealium is a trusted leader in real-time Customer Data Platforms (CDP), helping organizations unify their customer data to deliver more personalized, privacy-conscious experiences. Team Tealium spans nearly 20 countries, serves customers in more than 30, and wins together with respect and appreciation.
Design, build, and maintain scalable ETL pipelines for large-scale data processing.
Implement data transformations and workflows using PySpark at an intermediate to advanced level.
Optimize pipelines for performance, scalability, and cost efficiency across environments.
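Two common levers behind that optimization bullet, sketched in PySpark under assumed table names (not from the listing): broadcasting the small side of a join to avoid a shuffle, and partitioning output so readers can prune.

```python
# Sketch of two frequent PySpark cost/performance levers. Table names
# and the partition column are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("optimized_etl").getOrCreate()

facts = spark.read.table("raw.events")        # large fact data
dims = spark.read.table("raw.event_types")    # small lookup table

# Broadcasting the small side avoids a full shuffle join.
enriched = facts.join(F.broadcast(dims), "event_type_id")

# Partitioned writes let downstream readers prune by date, cutting
# both scan time and storage-access cost.
(enriched.write.format("delta")
         .partitionBy("event_date")
         .mode("append")
         .save("s3://bucket/curated/events/"))
```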
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their most impactful projects.
Partner with clients and implementation teams to understand data distribution requirements.
Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity (see the sketch after this list).
Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
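A hedged sketch of the Databricks-to-Snowflake handoff using the Spark-Snowflake connector; every connection value and table name below is a placeholder.

```python
# Moving a curated Databricks table into Snowflake via the
# Spark-Snowflake connector. Connection values are placeholders and
# would come from a secret manager in practice.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("to_snowflake").getOrCreate()

sf_options = {
    "sfURL": "account.snowflakecomputing.com",
    "sfUser": "svc_pipeline",
    "sfPassword": "***",          # placeholder; never hard-code secrets
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",
}

curated = spark.read.table("curated.member_claims")

(curated.write.format("snowflake")   # connector alias on Databricks
        .options(**sf_options)
        .option("dbtable", "MEMBER_CLAIMS")
        .mode("overwrite")
        .save())
```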
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that's ready for change.
Code, test, and document new or modified data pipelines.
Conduct logical and physical database design.
Perform root cause analysis on internal and external data.
Aker Systems builds and operates ground-breaking, ultra-secure, high-performance, cloud-based data infrastructure for the enterprise. They were recognised as a 'One to Watch' on the Sunday Times Tech Track and won Thames Valley Tech Company of the Year.
Design, build, and maintain scalable, high-quality data pipelines.
Implement robust data ingestion, transformation, and storage using cloud-based technologies.
Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.
Design, develop, and maintain real-time data streaming pipelines using Spark Streaming or Azure Functions.
Load, merge, and process machine logs from Kafka, ensuring efficient data flow and transformation.
Integrate processed data into Redis cache and send it to the data lake for long-term storage and analysis.
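A rough Structured Streaming sketch of that Kafka-to-Redis-and-lake flow; the broker, topic, paths, and key layout are assumptions, not details from the listing.

```python
# Read machine logs from Kafka, cache the latest state per machine in
# Redis, and land everything in the data lake for long-term analysis.
import redis
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("machine_logs").getOrCreate()

logs = (spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "machine-logs")
        .load()
        .select(F.col("key").cast("string"),
                F.col("value").cast("string"),
                "timestamp"))

def sink(batch_df, batch_id):
    # Long-term storage: append the micro-batch to the lake.
    batch_df.write.format("delta").mode("append").save("/lake/machine_logs")
    # Hot cache: keep the latest log line per machine in Redis.
    # collect() is fine for a sketch; use partition-wise writes at scale.
    r = redis.Redis(host="cache", port=6379)
    for row in batch_df.collect():
        r.set(f"machine:{row['key']}", row["value"])

(logs.writeStream.foreachBatch(sink)
     .option("checkpointLocation", "/chk/machine_logs")
     .start())
```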
Rackspace delivers end-to-end services and solutions for multi-cloud environments and promotes a collaborative working environment.
Design and evolve the enterprise Azure Lakehouse architecture.
Lead the transformation of classic Data Warehouse environments into a modern Lakehouse.
Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.
Deutsche Telekom IT Solutions Slovakia, formerly T-Systems Slovakia, has been part of the Košice region since 2006. They have grown into the second-largest employer in eastern Slovakia, with over 3,900 employees, and aim to provide innovative information and communication technology services.
Build and operate data pipelines from D365, Power Platform, and other sources into the enterprise data platform.
Design and implement star schemas, data lakehouse structures, and semantic models for Power BI, as sketched after this list.
Optimize performance and cost management for reporting in Azure.
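A toy star-schema build of the kind described above, with invented column names: one flat extract is split into a dimension and a fact table.

```python
# Split a flat D365-style extract into a dimension and a fact table.
# All table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star_schema").getOrCreate()

flat = spark.read.table("staging.d365_sales")

# Dimension: one row per customer, with a surrogate key.
dim_customer = (flat.select("customer_id", "customer_name", "region")
                    .dropDuplicates(["customer_id"])
                    .withColumn("customer_sk",
                                F.monotonically_increasing_id()))

# Fact: measures plus foreign keys into the dimensions.
fact_sales = (flat.join(dim_customer, "customer_id")
                  .select("customer_sk", "order_date", "quantity", "amount"))

dim_customer.write.format("delta").mode("overwrite") \
    .saveAsTable("gold.dim_customer")
fact_sales.write.format("delta").mode("overwrite") \
    .saveAsTable("gold.fact_sales")
```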
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify top-fitting candidates and share the shortlist with the hiring company.
Build & Operate Data Pipelines, using AWS-native data tools and distributed processing frameworks.
Operate and improve core data platform services, addressing incidents, performance issues, and operational toil.
Partner with data producers and consumers to onboard pipelines, troubleshoot issues, and improve platform usability.
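One possible AWS-native step, sketched with the AWS SDK for pandas (awswrangler); the tool choice, bucket, partition, and Glue database are assumptions, since the listing doesn't name specifics.

```python
# Read a lake partition, aggregate, and publish to a Glue-catalogued
# table. Bucket, database, and table names are placeholders.
import awswrangler as wr

# Pull one day's partition from the lake.
df = wr.s3.read_parquet("s3://data-lake/events/dt=2024-01-01/")

# Light transformation, then publish to a curated table.
daily = df.groupby("event_type", as_index=False).size()
wr.s3.to_parquet(
    df=daily,
    path="s3://data-lake/curated/event_counts/",
    dataset=True,
    database="analytics",
    table="event_counts",
    mode="append",
)
```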
Fetch is a platform where millions of people earn rewards for buying the brands they love, and a whole lot more. With investments from SoftBank, Univision, and Hamilton Lane, and partnerships with Fortune 500 companies, it is reshaping how brands and consumers connect in the marketplace. Ranked as one of America's Best Startup Employers by Forbes, Fetch fosters a people-first culture rooted in trust, accountability, and innovation.
Design, develop, and maintain data pipelines (batch and streaming) for ingesting, transforming, and making data available for analytics and application consumption.
Build and evolve analytical modeling (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reuse.
Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, monitoring of freshness/completeness/accuracy) and resolve incidents with root cause analysis (RCA).
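A minimal data-quality gate in the spirit of those checks, with invented table, column, and SLA values: verify completeness and freshness before publishing, and fail loudly so an incident and RCA can follow.

```python
# Verify completeness and freshness before publishing a silver table.
# Table, column, and thresholds are illustrative.
from datetime import datetime, timedelta
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.table("silver.orders")

# Completeness: the business key must never be null.
null_keys = df.filter(F.col("order_id").isNull()).count()
assert null_keys == 0, f"{null_keys} rows missing order_id"

# Freshness: the newest record must satisfy a 6-hour SLA.
max_ts = df.agg(F.max("order_ts").alias("m")).first()["m"]
assert max_ts is not None and datetime.now() - max_ts < timedelta(hours=6), \
    "freshness SLA breached"
```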
CI&T specializes in technological transformation, combining human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters around the world, they have partnered with more than 1,000 clients during their 30 years of history.
Play a senior tech lead and architect role, building world-class data solutions and applications that power crucial business decisions throughout the organization.
Enable a world-class engineering practice, shape how the organization uses data, develop backend systems and data models that serve insights needs, and play an active role in building Atlassian's data-driven culture.
Maintain a high bar for operational data quality and proactively address performance, scale, complexity and security considerations.
At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.