We’re building a modern, consumer-focused financial product designed to help people manage everyday spending while supporting small businesses. This is a high-impact technical role at the intersection of data engineering and analytics, with ownership over a core consumer data domain.
You’ll work closely with backend engineers, maintain business-critical data pipelines, and enable teams across Product, Marketing, Finance, and Risk to make data-driven decisions through reliable, self-service analytics. Your work will span real-time event streams, data transformations, and business intelligence dashboards, ensuring data accuracy and scalability in a fast-growing environment.
What You’ll Do
Own and maintain end-to-end data infrastructure for a core product domain, including dbt transformation pipelines, data marts, and BI dashboards
Partner with backend engineers on integrations with CRM platforms, customer support systems, and analytics tools
Manage real-time event streams and data flows, including stream transformations and data warehouse loading
Build and optimise production-grade dbt models, ensuring data quality, reliability, and performance
Enable self-service analytics through well-structured datasets, dashboards, query templates, and documentation
Monitor pipeline health, troubleshoot failures, and implement data quality checks for business-critical reporting
Drive technical improvements such as incremental processing, performance optimisation, and scalable data architecture
Translate stakeholder requirements into robust, maintainable technical solutions
Act as the primary data owner and subject-matter expert for your data domain
What We’re Looking For
3–5+ years of experience in analytics engineering or data engineering
Expert-level SQL skills for complex transformations and performance optimisation
Strong hands-on experience with dbt in production environments
Experience with cloud data warehouses (Snowflake or similar)
Experience with BI and dashboarding tools (Tableau, Looker, or similar)
Familiarity with data orchestration tools (Apache Airflow or similar)
Experience with event-driven architectures and streaming platforms (Kafka or similar)
Proven experience with API integrations and ELT/ETL pipelines
Strong documentation skills and a passion for enabling self-service analytics
Ownership mindset, proactive communication, and ability to work in fast-paced environments
Nice to Have
Experience in fintech or financial services
Familiarity with customer and product analytics platforms (e.g. mobile attribution, web analytics)
Tools
SQL
dbt
Snowflake
Kafka
Apache Airflow
Tableau
Looker
Google Analytics
AppsFlyer
API Integrations
Cloud Data Platforms
ETL / ELT Pipelines
Application Note
You don’t need to meet every requirement to apply. If this role aligns with your experience and interests, we’d still like to hear from you.
Apply Now