In this position, you will be responsible for designing and implementing data pipelines (ETL), ensuring data integrity and quality, conducting audits to identify and correct inconsistencies, and ensuring adherence to data governance standards. You will demonstrate deep knowledge of data engineering to build and maintain both batch (distributed) and real-time data pipelines with high availability.
You will train and integrate Machine Learning and Deep Learning models and techniques, including NLP, BERT, Transformers, Encoders, RAG, LLMs, and Agents. You will also build machine learning pipelines, define and implement metrics for evaluating ML models, apply statistical methods and hypothesis tests, perform data analysis, and develop pipelines for processing big data.
The Senior Data Architect focuses on designing the blueprint of the entire data management framework and setting the vision, policies, practices, and standards that govern the use of data across the organization. Responsibilities include constructing data pipelines that transform and transport data from various sources so that Data Analysts and Data Scientists can use it efficiently and effectively.
Support the Policy Research Department by conducting quantitative research and analyses that advance our policy research agenda and our state and federal legislative priorities, while translating the results for general audiences.
You will join a team of data management experts within our Digital Transformation Division to support our clients. Seconded to a major player in the insurance sector, you will work within a squad-organized team on a project to develop its Big Data platform. Your duties will include participating in the generation of application footprints from functional specifications, automating the configuration of Big Data environments, and providing a first level of support to projects.