Manage day-to-day database operations including monitoring, configuration, upgrades, scaling, backups, and data migrations.
Continuously improve database operations through automation, tooling, and process optimization.
Manage backups, replication, failover, and disaster recovery (DR) strategies to ensure business continuity.
PayPay Card Corporation provides users with a FinTech service that is more accessible and convenient than conventional credit cards and credit services by integrating with the PayPay payment platform. They are looking for passionate people to refine their products and promote cashless payments in Japan.
Support production systems and help triage issues during live sporting events.
Architect low-latency, real-time analytics systems including raw data collection, feature development, and endpoint production.
Integrate large and complex real-time datasets into new consumer and enterprise products.
Swish Analytics is a sports analytics, betting, and fantasy startup building the next generation of predictive sports analytics data products. They're looking for team-oriented individuals with an authentic passion for accurate, predictive real-time data who can execute in a fast-paced, creative, and continually evolving environment without sacrificing technical excellence.
Administer and manage Tableau Server and Tableau Cloud including security and user access.
Design and implement large-scale data solutions on platforms like Snowflake and AWS.
Establish best practices for platform security and data governance.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Own and operate AWS Aurora (PostgreSQL) in a high-load production environment.
Design and evolve schemas for large transactional domains.
Analyze and optimize slow queries and production metrics.
Ruby Labs is a leading tech company that creates and operates innovative consumer products, offering opportunities across health, education, and entertainment. Their innovative teams are driving the future of consumer-led products.
QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.
Lead the development of our data infrastructure, spanning ingestion, ETL/ELT, aggregation, processing, and quality checks.
Provide technical leadership, mentoring, and guidance to a team of data engineers and analytic engineers.
Collaborate closely with data scientists to translate intricate model requirements into optimized data pipelines, ensuring impeccable data quality, processing, and integration.
Oportun is a mission-driven fintech company that aims to make its members' financial goals achievable. They empower members with intelligent borrowing, savings, and budgeting capabilities to build a better financial future and have provided more than $19.7 billion in responsible and affordable credit.
Improve the consumer experience to drive sustainable growth.
Build and enhance machine learning, statistical, and causal models for product search, ranking, and recommendation that support various business goals.
Work closely with product and engineering teams to develop and deploy solutions with cross-functional support.
Gopuff provides the modern-day solution to meet customers' immediate everyday needs with products ranging from snacks and ice cream to household goods and beer, at the click of a button. They are assembling a team of motivated people to help drive forward that vision to bring a new age of convenience and predictability to an unpredictable world.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Design and implement event-driven pipelines using AWS services to ingest data from external sources in real-time.
Build and maintain streaming data pipelines between HubSpot CRM and PostgreSQL, handling webhook events and API polling.
Implement schema validation, data type checking, and automated quality gates at the ingestion layer to prevent bad data from entering the system.
PropHero is a property analytics platform provider. They have reached €30M in revenue in four years with 25% QoQ growth and are already profitable, and they offer a modern, cloud-native AWS data platform.
Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
Develop and optimize data models in Snowflake or similar platforms.
Implement ETL/ELT processes using Python and modern data tools.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by their internal team.