Lead support of the client’s Azure Data platform and Power BI environment, including responding to escalations and helping analyze and resolve incidents in customer environments.
Consult on, develop, and advise on Microsoft Azure solutions using tools such as Synapse, Data Factory, Databricks, Azure ML, Data Lake, Data Warehouse, and Power BI.
Consistently learn, apply, and refine skills around data engineering and data analytics.
3Cloud hires people who aren’t afraid to experiment or fail, who give direct and candid feedback, and who challenge and hold each other accountable for living 3Cloud’s core values, knowing that this results in amazing experiences and solutions for clients.
Architect, develop, and deploy robust, scalable data solutions using Azure tools.
Design and optimize ETL/ELT data pipelines using Python, PySpark, and SQL.
Build and manage modern data architectures, including data lakes and warehouses.
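The ETL/ELT responsibilities above can be sketched in plain Python. Everything here (`raw_orders`, `run_pipeline`, the field names) is illustrative, not a real API; a production pipeline would use PySpark or SQL as the bullets note:

```python
# Minimal extract-transform-load sketch: pull typed rows out of string-valued
# source records and load them into a target list standing in for a table.
# All names and fields are hypothetical.

raw_orders = [
    {"order_id": "1", "amount": "19.99"},
    {"order_id": "2", "amount": "5.00"},
]

def transform(row):
    # Cast string fields from the source system into typed values.
    return {"order_id": int(row["order_id"]), "amount": float(row["amount"])}

def run_pipeline(rows):
    target = []  # stands in for a warehouse table
    for row in rows:
        target.append(transform(row))
    return target

loaded = run_pipeline(raw_orders)
```

The same extract/transform/load split scales up directly: swap the list for a Spark DataFrame and the append for a warehouse write.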
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. The system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Build and maintain Azure Data Factory pipelines for data ingestion.
Write Python code in Databricks for data cleaning and transformation.
Monitor daily jobs and troubleshoot pipeline failures to ensure reliability.
Jobgether is a platform that helps candidates find relevant jobs through AI-powered matching. The company ensures applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Design and implement scalable, high-performing data pipelines and optimize our data architecture.
Build and deploy cloud-native solutions leveraging Azure Data Services, Databricks and other Big Data technologies.
Collaborate across teams to understand and support their data needs while ensuring the data architecture supports ongoing and future initiatives.
Redcare Pharmacy is Europe’s No.1 e-pharmacy, powered by passionate teams and cutting-edge innovation. They strive to create a healthy collaborative work environment where every employee feels valued and inspired to contribute to their vision “Until every human has their health”.
Design and optimize scalable data pipelines and data architecture.
Build cloud-native data solutions using Azure, Databricks (Unity Catalog & Delta Lake), and other big data technologies.
Contribute to a strong data culture through continuous learning and knowledge sharing.
Build and maintain Azure Data Factory pipelines to ingest data from multiple sources.
Write Python code in Databricks to clean raw data and move it into the silver layer, handling deduplication, type casting, and validation.
Monitor daily jobs and troubleshoot any failures to ensure pipeline stability.
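The bronze-to-silver cleaning step described above can be sketched in pure Python; in Databricks this would typically be PySpark (`dropDuplicates`, `cast`, filters), and all field names here are assumptions:

```python
# Sketch of a bronze-to-silver cleaning pass: deduplicate on a key, cast
# types, and drop rows that fail validation. Column names are invented.

def clean_to_silver(bronze_rows):
    seen = set()
    silver = []
    for row in bronze_rows:
        key = row.get("customer_id")
        if key is None or key in seen:  # required-key validation + dedup
            continue
        seen.add(key)
        try:
            silver.append({
                "customer_id": int(key),              # type casting
                "signup_date": str(row["signup_date"]),
            })
        except (KeyError, ValueError):                # validation failure
            continue
    return silver

bronze = [
    {"customer_id": "42", "signup_date": "2024-01-05"},
    {"customer_id": "42", "signup_date": "2024-01-05"},    # duplicate
    {"customer_id": "oops", "signup_date": "2024-02-01"},  # bad type
]
silver = clean_to_silver(bronze)
```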
Jobgether is a platform that leverages AI to connect job seekers with employers. They focus on ensuring fair and efficient application reviews, connecting top candidates directly with hiring companies.
Design, build, and maintain scalable data pipelines using Microsoft Fabric and Apache Airflow.
Ingest, transform, and integrate data from a variety of sources, including relational systems, APIs, and MongoDB.
Design and maintain analytical data models, including fact and dimension tables, to support reporting and analytics.
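The fact-and-dimension modeling mentioned above can be pictured as a small sketch: denormalized events split into a dimension table with surrogate keys and a fact table that references it. Field names are invented:

```python
# Sketch of star-schema modeling: assign surrogate keys to distinct
# dimension members and emit facts keyed by those surrogates.

def build_star_schema(events):
    product_dim = {}          # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for e in events:
        name = e["product"]
        if name not in product_dim:
            product_dim[name] = len(product_dim) + 1   # next surrogate key
            dim_rows.append({"product_key": product_dim[name], "product": name})
        fact_rows.append({"product_key": product_dim[name], "qty": e["qty"]})
    return dim_rows, fact_rows

events = [{"product": "widget", "qty": 3}, {"product": "gadget", "qty": 1},
          {"product": "widget", "qty": 2}]
dims, facts = build_star_schema(events)
```

Reporting queries then join the narrow fact table to the dimension table on the surrogate key.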
Theoria Medical is a comprehensive medical group and technology company dedicated to serving patients across the care continuum with an emphasis on post-acute care and primary care. Theoria serves facilities across the United States with a multitude of services to improve the quality of care delivered, refine facility processes, and enhance critical relationships.
Play a key role in designing, developing, and delivering modern data solutions that drive business insight and innovation.
Implement scalable, high-performing cloud architectures that support analytics, AI, and operational excellence.
Be responsible for technical delivery, authoring solution documentation, and ensuring data pipelines and models meet enterprise standards for performance, reliability, and cost efficiency.
3Cloud is a company where people aren’t afraid to experiment or fail. They hire people who care about the collective growth and success of the company, challenging each other to live by 3Cloud’s core values, and resulting in amazing experiences and solutions for clients and each other.
Expertise in designing and implementing logical and physical data models for cloud and hybrid data warehouse environments.
Experience implementing data architectures that support a variety of data formats and structures, including structured, semi-structured, and unstructured data.
Experience with multiple full life-cycle data warehouse implementations.
3Cloud focuses on providing Azure-based solutions. They foster a culture of experimentation, candid feedback, and accountability, committed to technical excellence and client experience.
Lead the design and implementation of scalable ETL pipelines and data lakes in AWS
Develop and optimize data architectures for terabyte-scale relational and distributed data systems
Collaborate with Data Scientists, Software Engineers, and Architects to integrate data solutions into analytics platforms and applications
Smart Working connects skilled professionals with outstanding global teams for full-time, long-term roles. They help discover meaningful work with teams that invest in your success, empowering you to grow personally and professionally in a remote-first world.
Architect our AWS-based data warehouse and ingestion pipelines.
Transform high-volume simulation outputs into clean, trusted datasets.
Establish schema standards and data contracts with engineering.
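One way to picture a data contract like the one described above is a shared schema plus a conformance check applied before records enter the warehouse. The contract format and field names below are invented for illustration:

```python
# Sketch of a data contract: a schema agreed with engineering, and a check
# that incoming records conform. Fields and types are hypothetical.

CONTRACT = {
    "run_id": str,
    "metric": str,
    "value": float,
}

def violates_contract(record):
    """Return a list of (field, reason) violations; empty list means OK."""
    problems = []
    for field, expected in CONTRACT.items():
        if field not in record:
            problems.append((field, "missing"))
        elif not isinstance(record[field], expected):
            problems.append((field, "wrong type"))
    return problems

ok = violates_contract({"run_id": "r1", "metric": "drag", "value": 0.82})
bad = violates_contract({"run_id": "r1", "value": "0.82"})
```

In practice the contract would live in a shared schema registry or typed format (e.g. Avro/JSON Schema) rather than a Python dict, but the producer/consumer agreement is the same idea.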
Onebrief provides collaboration and AI-powered workflow software designed for military staffs, making them faster, smarter, and more efficient. The company, founded in 2019, values ownership and excellence, with a team spanning veterans and technologists; it has raised $320m+ from investors and is valued at $2.15B.
Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self-service tooling, and clear documentation.
Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.
Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.
Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.
Ankura Consulting Group, LLC is an independent global expert services and advisory firm. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation, and consists of more than 2000 professionals.
Design, build, and optimize scalable data pipelines using Databricks, Apache Spark, Delta Lake, and Unity Catalog.
Develop ingestion frameworks for structured and semi-structured data from multiple enterprise sources.
Implement data governance, data quality, and security controls across the data lifecycle.
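Data-quality controls like those mentioned above often start with completeness and uniqueness checks. A minimal sketch, where the column names and rules are assumptions rather than a specific framework:

```python
# Sketch of basic data-quality controls: row count, per-column null rate,
# and key uniqueness. Column names below are illustrative.

def quality_report(rows, key_column, required_columns):
    total = len(rows)
    nulls = {c: sum(1 for r in rows if r.get(c) is None)
             for c in required_columns}
    keys = [r.get(key_column) for r in rows]
    return {
        "row_count": total,
        "null_rate": {c: nulls[c] / total for c in required_columns},
        "keys_unique": len(keys) == len(set(keys)),
    }

rows = [{"id": 1, "name": "a"}, {"id": 2, "name": None}, {"id": 2, "name": "c"}]
report = quality_report(rows, "id", ["name"])
```

A pipeline would typically fail or quarantine a batch when such a report breaches agreed thresholds.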
Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations by creating digital products, experiences, and solutions around what real people need. Their global consulting and delivery teams facilitate highly strategic digital initiatives through digital product engineering, automation, data engineering, and infrastructure modernization services.
Design and implement modern data architectures and lakehouse solutions.
Provide technical leadership throughout the project lifecycle and collaborate with business stakeholders.
Create effort estimations and provide technical expertise during the pre-sales phase.
Hiflylabs is a Budapest-based company that provides effective solutions to business problems. They have 250+ employees and their application development team and business intelligence experts work in various industries such as the financial, telecommunication, and energy sectors.
Lead the foundational setup of new data environments.
Design, build, and manage scalable data pipelines.
Develop and maintain robust DBT data models to support semantic layer build-outs.
CompassX is a boutique business and technology consulting firm that helps Fortune 500 and high-growth clients deliver their most strategic initiatives through digital and data-driven projects. Recognized as a three-time winner of Consulting Magazine’s Best Boutique Firms to Work For, consultants value the freedom to shape their client work and maintain a direct line to leadership.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.