Design and implement complex SQL queries, stored procedures, and database optimization strategies
Develop and maintain ETL (Extract, Transform, Load) pipelines for data ingestion and transformation
Design and develop data architecture, data models, and data manipulation structures for content management systems
GHX is a healthcare business and data automation company that enables better patient care and billions in savings for the healthcare community by maximizing the automation, efficiency, and accuracy of business processes. They employ more than 1,000 people worldwide.
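Query-optimization work of the kind described above usually comes down to reading execution plans and adding the right indexes. A minimal, self-contained sketch using Python's built-in sqlite3 module (the orders schema and data are hypothetical, standing in for a production database):

```python
import sqlite3

# In-memory database standing in for a production system (schema hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = 42"
).fetchall()

# A covering index on (customer_id, total) lets SQLite answer the query
# from the index alone, without touching the table rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = 42"
).fetchall()

print(plan_before[0][-1])  # plan text mentions a scan of orders
print(plan_after[0][-1])   # plan text mentions the covering index
```

The same discipline (inspect the plan, index the filter and projection columns, re-check the plan) carries over to SQL Server, Postgres, and the other engines named in these postings, though each has its own plan syntax.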
Design, build, and maintain data pipelines connecting internal systems to the organisation’s Delta Lake environment.
Develop and optimise SQL-based data transformations and relational data models to support analytics and reporting.
Integrate new data sources and systems into the data platform as the organisation expands its technology landscape.
Smart Working believes your job should not only look right on paper but also feel right every day. It connects skilled professionals with outstanding global teams and products for full-time, long-term roles, offering a genuine community that values growth and well-being.
Design and develop data flows and extraction processes.
Integrate client’s data into the Edifecs/Cotiviti Risk Adjustment Workflows.
Implement open-source standards for data quality.
Edifecs/Cotiviti delivers innovative software solutions. They focus on onboarding customers to Risk Adjustment workflow applications and value collaboration across platform engineering, product, and implementation teams.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on and value humility, intellectual rigor, and stewardship.
Serve as a point of contact for all iPaaS platforms and integrations within Aledade Enterprise Systems.
Implement data integration solutions to meet Aledade business needs.
Drive simple integration solutions independently and collaborate on medium- to high-complexity designs and solutions.
Aledade, a public benefit corporation, empowers independent primary care practices to deliver better care and thrive in value-based care. Founded in 2014, they've become the largest network of independent primary care in the US, fostering a collaborative, inclusive, and remote-first culture.
Assist in building and maintaining ETL/ELT pipelines for healthcare datasets.
Support the development of data models and data transformations aligned with healthcare standards.
Contribute to data quality checks, validation rules, and documentation for healthcare data assets.
Curana Health is committed to radically improving the health, happiness, and dignity of older adults. They are a national leader in value-based care, serving 200,000+ seniors in 1,500+ communities across 32 states with over 1,000 clinicians and other professionals.
Develop tools and techniques for improving process efficiencies and query performance.
Create and maintain ETL scripts, queries, and applications for healthcare data management.
Perform data analysis, data mining, and investigations using SQL queries to identify the root cause of issues.
Cotiviti is a solutions and analytics company that helps healthcare payers improve their financial performance. Their solutions and expertise help clients capture more revenue, reduce costs, and improve quality, and they foster a culture of innovation, collaboration, and commitment to delivering exceptional results for their clients.
Design and implement Python-based data integration solutions to move and synchronize data across enterprise applications.
Deploy and manage integrations in Azure, optimizing for scalability, reliability, and cost-effectiveness.
Automate institutional processes by leveraging application APIs, web services, and data warehouse capabilities.
Unitek Learning is a leading healthcare education organization that helps students launch and accelerate their careers. As a rapidly growing company, they offer a competitive salary, generous benefits, unlimited growth potential, and a collegiate work environment.
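Synchronizing data across enterprise applications, as the Unitek role describes, typically reduces to diffing a source-of-truth snapshot against the target system and computing the inserts, updates, and deletes needed to reconcile them. A minimal, self-contained sketch (record shapes are hypothetical; a real integration would fetch these via application APIs):

```python
def diff_records(source, target):
    """Compare source-of-truth records against a target system, keyed by id.

    Returns the inserts, updates, and deletes needed to bring the target
    in sync with the source. Record shapes here are hypothetical.
    """
    src = {r["id"]: r for r in source}
    tgt = {r["id"]: r for r in target}
    inserts = [r for k, r in src.items() if k not in tgt]
    updates = [r for k, r in src.items() if k in tgt and r != tgt[k]]
    deletes = [k for k in tgt if k not in src]
    return inserts, updates, deletes

# Hypothetical snapshots from two systems.
source = [{"id": 1, "email": "a@x.edu"}, {"id": 2, "email": "b@x.edu"}]
target = [{"id": 2, "email": "old@x.edu"}, {"id": 3, "email": "c@x.edu"}]
inserts, updates, deletes = diff_records(source, target)
print(inserts, updates, deletes)
```

Computing the delta first, then applying it through the target's API, keeps each sync run idempotent and makes the change set easy to log and audit.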
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.
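Pipelines that pair ETL/ELT with data quality checks, as in the bullets above, usually gate rows through a validation step between extract and transform. A minimal, stdlib-only sketch (field names and rules are hypothetical):

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def validate(rows):
    """Data-quality gate: keep rows with a non-empty id and a numeric amount;
    quarantine the rest for inspection instead of silently dropping them."""
    good, bad = [], []
    for row in rows:
        if row.get("id") and row.get("amount", "").replace(".", "", 1).isdigit():
            good.append(row)
        else:
            bad.append(row)
    return good, bad

def transform(rows):
    """Transform: cast amount to float and normalize country codes."""
    return [
        {"id": r["id"], "amount": float(r["amount"]), "country": r["country"].upper()}
        for r in rows
    ]

# Hypothetical raw feed: one row has a missing id, one a non-numeric amount.
raw = "id,amount,country\n1,19.99,us\n,5.00,de\n3,abc,fr\n4,7,gb\n"
good, bad = validate(extract(raw))
loaded = transform(good)
print(loaded)    # cleaned rows ready to load
print(len(bad))  # rows quarantined by the quality check
```

Keeping a quarantine list rather than discarding bad rows is what makes the monitoring bullet actionable: the rejected rows can be counted, alerted on, and replayed after a fix.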
Experience integrating data from multiple data sources.
Experience with various database technologies such as SQL Server, Redshift, Postgres, and RDS.
Experience designing, building, and maintaining data pipelines.
Bluelight Consulting is a leading software consultancy dedicated to designing and developing innovative technology that enhances users' lives. With a presence across the United States and Central/South America, Bluelight is in an exciting phase of expansion, continually seeking exceptional talent to join its dynamic and diverse community.
Own the delivery of scalable internal data solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.
Design and build scalable data architecture for current and future products.
Design, build, and maintain data pipelines and ETL/ELT processes across various sources.
Evaluate potential migration from MSSQL to PostgreSQL, including analysis and planning.
IANS Research is an information security advisory and consulting firm, serving Fortune-class information security teams and professionals with in-depth insights and decision support regarding their most pressing technical and strategic challenges. They help security teams achieve technical excellence and improve engagement with the organization to drive security's impact deeper into the company.
Serve as a key contributor to the migration of the NICE Actimize platform from on-prem infrastructure to Actimize Cloud.
Support other ETL processes and data pipelines used for ingesting, transforming, and validating large data sets.
Perform data analysis, reconciliation, and validation to ensure data integrity across systems.
Pathward is a financial empowerment company that works with innovators to increase financial availability, choice, and opportunity for all. They strive to remove barriers that traditional institutions put in the way of financial access, and promote economic mobility. They are a team of problem solvers and innovators who celebrate their differences and know that their unique perspectives make them stronger and well-positioned for success.
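Reconciliation and validation across systems, as in the Pathward bullets, is often a matter of comparing control totals and key sets between source and target after each load. A minimal sketch (dataset shapes and the `amount` measure are hypothetical):

```python
def reconcile(source_rows, target_rows, key="id", measure="amount"):
    """Compare row counts, key sets, and a control total between two systems."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
        # Difference of control totals; nonzero means values drifted in flight.
        "total_delta": round(
            sum(r[measure] for r in source_rows) - sum(r[measure] for r in target_rows), 2
        ),
    }

# Hypothetical snapshots: target is missing a row and has a value discrepancy.
source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}, {"id": 3, "amount": 5.0}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 19.5}]
report = reconcile(source, target)
print(report)
```

Emitting a structured report like this after each pipeline run turns data-integrity validation into something that can be asserted on and alerted on automatically.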
Design, build, and maintain scalable data pipelines.
Develop and optimize ETL/ELT processes using cloud data technologies.
Partner with teams to understand data requirements and improve data capture strategies.
Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.
Own the architecture and delivery of scalable internal solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and hands-on, driven by a strong “build and ship” mindset.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Work independently as a technical data resource and thrive in a team environment.
Optimize SQL queries and create interactive reports using Power BI.
Design, develop, and implement ETL processes using SSIS.
Ramboll is a global team of bright consultants working in water. They support the development of sustainable societies, working on water projects on a global scale with over 2,000 experts across more than 60 offices in the Americas.
QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.