Job Description

We are seeking a skilled and detail-oriented DataOps Engineer to design, build, and maintain automated data pipelines and cloud-based data platforms that support enterprise-scale web, mobile, and AI-driven applications. The DataOps Engineer will be responsible for ensuring the reliability, performance, and scalability of data operations, enabling smooth integration between systems, analytics, and applications. This role requires hands-on expertise with Azure data services, data orchestration tools, and CI/CD automation for data workflows.

Key Responsibilities

- Develop, deploy, and maintain data ingestion, transformation, and delivery pipelines using Azure Data Factory, Databricks, or equivalent frameworks.
- Implement CI/CD pipelines for data workflows using Azure DevOps or GitHub Actions.
- Automate data quality checks, schema validation, and error-handling mechanisms, and implement automated data quality and validation frameworks.
- Integrate structured, semi-structured, and unstructured data sources (APIs, databases, IoT, etc.).
- Manage and optimize Azure cloud data services (Data Lake, Synapse, SQL, Blob Storage, Key Vault).
- Implement infrastructure as code (IaC) using ARM templates, Terraform, or Bicep to provision data resources.
- Enforce data governance and access controls aligned with organizational policies.
- Build dashboards to track data pipeline health, latency, and throughput.
- Collaborate with Data Engineers, DevOps, Developers, Business Analysts, and Product Teams to enable seamless data flow between systems and align data operations with business goals.
- Support the data integration needs of downstream systems such as BI dashboards (Power BI, Tableau) and AI applications.

About Bridgenext

Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations.