Data Engineer - Crypto HFT (Remote)
Our client is a leading player in the high-frequency trading space, setting new standards in crypto trading through innovative strategies and cutting-edge technology. They are looking for a Data Engineer to join their high-calibre team on a remote basis, aligned with Central European Time, to build and maintain the data backbone of their trading platform.
You will design, maintain, and optimize their well-architected ClickHouse data infrastructure, including cluster performance and schema tuning, to support business-critical data pipelines. This is a high-impact opportunity to work across the full data lifecycle, from ingestion to analytics, in close collaboration with the Technology and Trading teams. You'll help offload operational responsibilities from the Data Architect and ensure continuity of key data flows as the platform expands. If you're passionate about building robust data systems and enabling data-driven decision-making in a high-performing, cross-functional environment - this role is for you!
Responsibilities
- Design and maintain robust data pipelines for real-time and batch processing, supporting internal analytics and ML projects.
- Manage and optimize our ClickHouse data warehouse, including cluster performance and schema tuning.
- Collaborate with backend engineers, trading teams, and data stakeholders to ensure data quality, observability, and governance across critical pipelines.
- Support internal initiatives by building tooling and monitoring for business and technical metrics.
- Take ownership of scheduling and workflow orchestration (Argo, Airflow, etc.) and contribute to CI/CD automation.
Requirements
- At least 5 years of professional experience in data engineering or backend infrastructure.
- Mastery of Python and SQL, including complex joins, window functions, and performance optimization.
- Hands-on experience with ClickHouse (especially the MergeTree engine family) or similar columnar databases.
- Familiarity with workflow schedulers (e.g., Argo Workflows, Airflow, or Kubeflow) and with Kafka-based streaming architectures.
- Comfortable with CI/CD pipelines (GitLab CI, ArgoCD, GitHub Actions).
- Experience with monitoring and BI tools such as Grafana for technical/business dashboards.
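For candidates weighing whether their SQL and ClickHouse experience fits, here is a minimal sketch of the kind of work the requirements describe. The table, columns, and window size are illustrative assumptions, not details taken from the role; the pure-Python helper mirrors the semantics of the SQL window frame.

```python
"""Illustrative sketch: a hypothetical ClickHouse MergeTree schema for
trade ticks, a window-function query over it, and a pure-Python rolling
average showing what the SQL frame computes."""
from collections import deque

# Hypothetical MergeTree table: partitioned by day and ordered by
# (symbol, ts) so per-symbol time-range scans stay cheap.
DDL = """
CREATE TABLE trades (
    symbol String,
    ts     DateTime64(3),
    price  Float64,
    qty    Float64
) ENGINE = MergeTree
PARTITION BY toYYYYMMDD(ts)
ORDER BY (symbol, ts)
"""

# Window function: a 10-row simple moving average per symbol.
QUERY = """
SELECT symbol, ts, price,
       avg(price) OVER (
           PARTITION BY symbol ORDER BY ts
           ROWS BETWEEN 9 PRECEDING AND CURRENT ROW
       ) AS sma_10
FROM trades
"""

def rolling_avg(prices, window):
    """Pure-Python equivalent of the ROWS BETWEEN frame above:
    average over the current row and up to window-1 preceding rows."""
    buf, out = deque(maxlen=window), []
    for p in prices:
        buf.append(p)
        out.append(sum(buf) / len(buf))
    return out
```

The `ORDER BY` key in the DDL doubles as the sparse primary index in MergeTree, which is why schema tuning (a responsibility above) often comes down to choosing it well.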
FAQs
What happens after I apply?
Thank you for taking the time to apply - we understand it's a big step. When you apply, your details go directly to the consultant who is sourcing for the role. Due to demand, we may not get back to every applicant. However, we always keep your CV and details on file, so when we see similar roles or skill sets that drive growth in organisations, we will reach out to discuss opportunities.
Should I apply even if I don't match every requirement?
Yes. Even if this role isn't a perfect match, applying allows us to understand your expertise and ambitions, ensuring you're on our radar when the right opportunity arises.
What if I don't see a suitable role advertised?
We work in several ways. We advertise available roles on our site, although due to confidentiality we may not post all of them. We also work with clients who are more focused on skills and on understanding what is required to future-proof their business. That's why we recommend registering your CV so you can be considered for roles that have yet to be created.
Do you offer support during the hiring process?
Yes, we help with CV and interview preparation. From customised support on how to optimise your CV to interview preparation and compensation negotiations, we advocate for you throughout your next career move.