Posted Apr 10, 2026

Overseas Contractor (BR)

Job Title

Senior Data Engineer

Role Description

We are looking for a talented Senior Data Engineer to design and build a highly scalable, cloud-based GraphQL API layer that delivers sports and betting data across multiple digital platforms. This role focuses on creating a seamless, efficient data ingestion, processing, and delivery ecosystem that integrates streaming data with high-performance querying. You will work closely with cross-functional teams to design and implement a federated GraphQL architecture, helping drive the future of sports and betting data integration within the CBS Sports digital ecosystem.

Responsibilities

Day-to-Day
- Build foundational platform capabilities using Domain-Driven Design (DDD) and Test-Driven Development (TDD).
- Lead the design, development, and optimization of GraphQL subgraphs and services within a federated architecture.
- Participate in Agile ceremonies, including daily standups, backlog grooming, and sprint planning.
- Practice DevOps by writing high-quality tests and deploying continuously to production with proper monitoring and observability.
- Review peer designs and code, providing constructive feedback and participating in technical discussions and deep-dive sessions.
- Collaborate with product partners to design technical solutions leveraging GraphQL for data integration.

Key Projects
- Build and expand a GraphQL API layer for seamless data access across multiple services.
- Design and implement a core data platform powering CBS Sports Digital products, ingesting and processing data from hundreds of sources across multiple transports and data stores.
- Define and implement distributed tracing and data observability within a GraphQL ecosystem.
- Containerize core systems to reduce infrastructure overhead and improve scalability.

Qualifications

Required
- 5+ years of experience in data engineering or large-scale data systems development.
- Strong experience designing and maintaining GraphQL schemas, optimizing queries, and ensuring efficient data access.
- Hands-on experience with JavaScript/TypeScript, Python, and Node.js.
- Experience with relational, NoSQL, and key-value databases, including Aurora MySQL, DynamoDB, MongoDB, and Redis.
- Experience working in distributed, cloud-based environments (AWS preferred) with high transaction volumes.
- Proficiency with CI/CD pipelines, unit and integration testing, and modern build strategies.
- Experience using GitHub, Jira, and GitOps workflows.
- Ability to design technical solutions, provide estimates, and assess risks and feasibility.
- Bachelor's degree in Computer Science, Engineering, or equivalent professional experience.

Nice to Have
- Experience with Apollo GraphQL, WunderGraph, or Cosmo.
- AWS certifications.
- Experience building ETL pipelines and working with data orchestration platforms.
- Background in digital media, sports, or content-driven platforms.
- Knowledge of sports and gaming/betting ecosystems.
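For candidates unfamiliar with the "federated architecture" and "subgraphs" referenced above: in Apollo-style GraphQL federation, each service publishes its own subgraph schema and marks shared entities with a @key directive so a gateway/router can compose them into one unified graph. A minimal sketch of what one such subgraph schema might look like (type and field names are hypothetical, not taken from this posting):

```graphql
# Opt this schema into Federation 2 and import the @key directive.
extend schema
  @link(url: "https://specs.apollo.dev/federation/v2.0", import: ["@key"])

# An entity other subgraphs can extend, resolved here by its "id" key.
type Game @key(fields: "id") {
  id: ID!
  homeTeam: String!
  awayTeam: String!
  startTime: String!
}

type Query {
  game(id: ID!): Game
}
```

A separate subgraph (for example, one owning betting odds) could then attach its own fields to Game by declaring the same @key, and the router resolves the combined type across services.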