Senior or Staff Software Engineer, Data Platform

Posted 2024-12-03

💎 Seniority level: Senior, 5+ years

📍 Location: US (EST, PST)

🔍 Industry: Blockchain intelligence and financial services

🗣️ Languages: English

⏳ Experience: 5+ years

🪄 Skills: Docker, Python, SQL, Blockchain, ElasticSearch, ETL, Kafka, Kubernetes, Airflow, ClickHouse, Data engineering, Postgres, Redis, Spark, Collaboration, Terraform, Documentation

Requirements:
  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of hands-on experience architecting distributed systems.
  • Exceptional programming skills in Python.
  • Adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as ClickHouse, ElasticSearch, Postgres, Redis, and Neo4j.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, DBT, Luigi, Azkaban, and Storm.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools like Docker, Terraform, Kubernetes, and Datadog.
Responsibilities:
  • Build highly reliable data services to integrate with various blockchains.
  • Develop complex ETL pipelines for processing structured and unstructured data in real time.
  • Design intricate data models to support optimal storage and retrieval with sub-second latency.
  • Oversee large database cluster deployment and monitoring with a focus on performance.
  • Collaborate with data scientists, engineers, and product managers to enhance TRM's products.