Data Engineer
London, United Kingdom | Zug, Switzerland
Permanent
Job ID: 2141
Job Description
[c. £120-200k Comp Package (or equivalent) | Hybrid Working - occasional travel between NYC, London, and Zug]
Do you thrive on building scalable data solutions that power quantitative trading strategies? Our client, a leading investment firm operating at the intersection of digital assets, machine learning, and quantitative finance, is seeking a Data Engineer to enhance its high-throughput data platform. This is a high-impact role where you’ll design and optimise petabyte-scale data lakes, real-time event-driven datasets, and advanced ETL pipelines, working closely with top-tier quantitative researchers and engineers to unlock new trading opportunities...
Key Responsibilities
- Build and optimise scalable data pipelines, ensuring seamless ingestion, transformation, and accessibility of live and historical market data
- Enhance data quality, governance, and automation, implementing rigorous quality-assurance checks to maintain accuracy and consistency across structured and unstructured datasets
- Develop and refine high-throughput event-driven datasets, integrating with external APIs and alternative data vendors
- Support the training, validation, and deployment of machine learning-driven trading models, optimising data workflows for research and execution
- Collaborate with quantitative researchers and trading teams, translating complex data needs into scalable, high-performance engineering solutions
- Optimise data infrastructure, leveraging Apache Arrow, Ray, Dask, and Protobuf to handle large-scale processing efficiently
- Implement event-driven architectures, utilising Kafka, Schema Registry, and distributed computing tools
- Ensure compliance with security, privacy, and regulatory standards, particularly in data governance and cloud-based storage solutions
What You Bring...
- 3+ years of experience in data engineering within a market-making environment, ideally at a quantitative hedge fund or proprietary trading firm
- Proficiency in Python and Rust, with hands-on experience in SQL, Linux, and Docker
- Deep understanding of L2 and L3 market data, and experience in designing real-time, high-frequency data processing pipelines
- Proven expertise in ETL, data modelling, and data warehousing, using AWS and its data services
- Experience with event-driven architectures, working with tools like Kafka, Protobuf, and Schema Registry
- Strong problem-solving and analytical skills, capable of extracting insights from complex financial datasets
- Highly detail-oriented, with a meticulous approach to building and maintaining data integrity
- Strong communication skills, able to collaborate across research, engineering, and trading teams
- Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field
...