Data (Platform) Engineer
Job Description
[Up to c. £110k Comp Package | Hybrid Working - 4 Days in Office]
Role Overview
We’re partnering with a fast-growing digital asset investment firm that is undertaking a multi-year transformation of its data and analytics platform. As the organisation expands its investment, research, and ETF/ETP capabilities, scalable and reliable data infrastructure has become a critical foundation for the business. This hire will become the first dedicated Data Engineer within the organisation, joining a small but highly capable engineering team responsible for building and modernising the firm’s core data platform. The position will focus on developing robust ingestion pipelines, improving data reliability, and supporting analytics across trading, fund management, and operational teams.
Initially, the successful candidate will work closely with external consultants currently assisting with infrastructure redesign. Over time, ownership of the firm's data engineering capability will transition in-house, making this an opportunity for someone early in their career to take meaningful responsibility and grow into broader platform ownership. The environment is highly collaborative and engineering-focused, with a modern microservices stack running primarily on Python services deployed within AWS, alongside containerised services and internal trading infrastructure.
Key Responsibilities
- Develop and maintain data ingestion pipelines connecting internal systems, APIs, and external data sources
- Build reliable ETL/ELT workflows using Python and SQL to support analytics and reporting use cases
- Implement and manage orchestration pipelines using Airflow or similar workflow tools
- Optimise and maintain data transformations within Postgres and related analytics layers
- Help design structured, analytics-ready datasets that support internal business teams
- Improve data quality, validation, and monitoring processes across the platform
- Troubleshoot data pipeline failures and identify performance or reliability improvements
- Collaborate closely with software engineers and internal stakeholders to ensure data is accessible and well-structured
- Contribute to the ongoing modernisation of legacy data systems as the platform evolves
- Support the longer-term development of a scalable cloud-native data platform capable of supporting AI and advanced analytics initiatives
What You’ll Bring
- 2-3 years’ commercial experience in data engineering or related backend engineering roles
- Strong Python development skills, particularly for building data pipelines and automation workflows
- Solid SQL expertise with experience working with relational databases such as Postgres
- Hands-on experience using workflow orchestration tools such as Airflow (or similar)
- Familiarity with AWS data infrastructure such as S3 or related services
- Practical understanding of ETL/ELT pipelines and modern data platform concepts
- Experience writing clean, maintainable production code within collaborative engineering teams
- Strong communication skills and an eagerness to learn within a fast-moving environment
- (Preferred) Experience working with financial or market data
- (Preferred) Interest in modern data governance, lineage, or data quality tooling
- (Preferred) Curiosity around emerging AI or LLM-enabled data workflows