- Experience building streaming and batch data pipelines/ETL
- Expert in Python, PostgreSQL, and PL/pgSQL
- Experience with scalability solutions and multi-region replication/failover
- Experience with data warehouse technologies (Trino, ClickHouse, Airflow, etc.)
- Bachelor's degree in Computer Science, Engineering, or a related technical discipline, or equivalent experience
- Deep understanding of programming and experience with at least one programming language
- Knowledge of Kubernetes and Docker (preferred)
- 4+ years of working experience in a relevant data field (preferred)
- Knowledge of blockchain technology / the mining pool industry (preferred)
- Experience with agile development methodology (preferred)
- Experience delivering and owning web-scale data systems in production (preferred)
- Experience working with Kafka, preferably Redpanda and Redpanda Connect (preferred)