Senior Data Engineer

Anywhere · Full-Time · Senior
Salary not disclosed

Job Details

Required Skills
AWS · Python · SQL · ETL · GCP · ClickHouse · Terraform

Requirements

  • Proven track record of delivering working, production-grade data pipelines in weeks, not months, with the ability to ruthlessly cut scope to hit a 60-day MVP while managing technical debt.
  • Experience building Tardis historical and real-time pipelines (or equivalent high-quality crypto market data feeds), including their specific quirks, rate limits, and WebSocket structures.
  • Expert in large-scale, reliable ETL/ELT for financial or market data.
  • Fluent in provisioning full environments with Terraform in days.
  • Expert in AWS/GCP serverless technologies.
  • Expert Python and SQL skills.
  • Proficiency with time-series databases like TimescaleDB or ClickHouse.
  • Advanced knowledge of WebSocket clients, message queues, and low-latency streaming.
  • Experience with GitOps, automated testing/deployment, and observability practices.
  • Strong understanding of stablecoins, lending protocols, and opportunity-surface concepts, or a proven ability to ramp up extremely quickly.
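The real-time feed requirements above center on normalizing heterogeneous exchange messages into one canonical schema. A minimal sketch of that step, assuming a hypothetical raw payload — the field names (`timestamp_ms`, `amount`, etc.) are illustrative, not Tardis's actual message format:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Trade:
    """Canonical trade record used downstream (illustrative schema)."""
    exchange: str
    symbol: str
    ts_us: int    # event timestamp, microseconds since epoch
    price: float
    size: float
    side: str     # "buy" or "sell"


def normalize_trade(raw: dict) -> Trade:
    """Map a hypothetical raw exchange message onto the canonical schema.

    Real feeds differ per exchange; this shows the shape of the step,
    not any specific vendor's field names.
    """
    return Trade(
        exchange=raw["exchange"],
        symbol=raw["symbol"].upper().replace("/", "-"),
        ts_us=int(raw["timestamp_ms"]) * 1000,
        price=float(raw["price"]),
        size=float(raw["amount"]),
        side=raw["side"].lower(),
    )
```

A normalizer like this sits between the WebSocket client and the cache/database writers, so every downstream consumer sees one schema regardless of source.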

Responsibilities

  • Rapidly spin up the cloud environment.
  • Deliver working historical backfill pipelines from Tardis.dev into a queryable database.
  • Deliver a real-time Tardis WebSocket pipeline, ensuring data is normalized, cached for live consumption, accurate, replayable, and queryable by Day 60.
  • Ensure all pipelines are idempotent, retryable, and provide exactly-once semantics. Implement full CI/CD, Terraform, automated testing, and secrets management.
  • Implement proper observability (structured logs, metrics, dashboards, alerting) from day one. Provide immediate self-service access to the MVP database for Trading and BI teams via tools like Tableau/Metabase, and through simple internal REST APIs.
  • Develop specialized time-series datasets, including USDe backing-asset data and a full opportunity-surface time series for delta-neutral/lending/borrow opportunities.
  • Ingest data from additional sources (Kaiko, CoinAPI, on-chain via TheGraph/Dune). Plan for 10x+ data growth via schema evolution, partitioning, and performance tuning.
  • Establish enterprise-grade governance, including a data quality framework, RBAC, audit logs, and a semantic layer.
  • Create full architecture documentation, runbooks, and a data dictionary. Onboard and mentor future junior staff.