
Senior or Staff Software Engineer, Data Platform

Posted 4 months ago

💎 Seniority level: Senior, 8+ years

📍 Location: United States (EST, PST)

💸 Salary: 200,000 - 255,000 USD per year

🔍 Industry: Blockchain intelligence and financial services

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B (about 2 years ago) · Cryptocurrency · Compliance · Blockchain · Big Data

🗣️ Languages: English

⏳ Experience: 8+ years

🪄 Skills: Docker, Python, SQL, Blockchain, ETL, Kafka, Kubernetes, Airflow, Data engineering, Spark, Collaboration, Terraform

Requirements:
  • A Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 8+ years of hands-on experience in architecting distributed systems.
  • Exceptional programming skills in Python and proficiency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, and StarRocks.
  • Proficiency in tools like Airflow and DBT for data pipeline orchestration (a minimal DAG sketch follows this list).
  • Expertise in technologies like Spark, Kafka, and Flink.
  • Competence in deploying and monitoring cloud infrastructure with tools like Docker and Terraform.
  • Proven ability in managing extensive datasets.
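
To make the orchestration requirement concrete, here is a minimal sketch of an Airflow 2.x DAG that lands a hypothetical day's blockchain transfers and then runs dbt models over them. The DAG id, task names, and dbt selector are illustrative assumptions, not TRM's actual pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_transfers(ds: str, **_) -> None:
    # Placeholder: pull the day's on-chain transfers into the warehouse.
    print(f"extracting transfers for {ds}")


with DAG(
    dag_id="blockchain_transfers_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_transfers",
        python_callable=extract_transfers,
    )
    # dbt builds the downstream models once raw data has landed.
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select transfers",  # hypothetical dbt selector
    )
    extract >> transform
```
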
Responsibilities:
  • Build highly reliable data services to integrate with blockchains.
  • Develop complex ETL pipelines for real-time data processing (see the streaming sketch after this list).
  • Design intricate data models for optimal storage and retrieval.
  • Oversee deployment and monitoring of large database clusters.
  • Collaborate with data scientists and engineers to enhance products.
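
As a sketch of the real-time ETL responsibility, the snippet below reads a hypothetical "transfers" Kafka topic with Spark Structured Streaming, parses the JSON payload, and appends Parquet to object storage. The broker address, topic, schema, and paths are assumptions for illustration; running it also requires the spark-sql-kafka connector on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("transfers-stream").getOrCreate()

# Assumed payload shape for the hypothetical "transfers" topic.
schema = StructType([
    StructField("tx_hash", StringType()),
    StructField("chain", StringType()),
    StructField("block_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "transfers")                  # hypothetical topic
    .load()
)

# Kafka delivers bytes; cast the value to a string and parse the JSON payload.
parsed = (
    raw.select(from_json(col("value").cast("string"), schema).alias("m"))
    .select("m.*")
)

# Continuously append parsed rows as Parquet; sink and checkpoint paths are illustrative.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3a://warehouse/raw_transfers/")
    .option("checkpointLocation", "s3a://warehouse/_chk/raw_transfers/")
    .start()
)
query.awaitTermination()
```
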

Related Jobs


📍 US

🧭 Full-Time

💸 200,000 - 255,000 USD per year

🔍 Financial services, Blockchain

  • A Bachelor's degree (or equivalent) in Computer Science or a related field.
  • A proven track record with 8+ years of hands-on experience architecting distributed systems.
  • Exceptional programming skills in Python and adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, and StarRocks.
  • Proficiency in data pipeline and workflow orchestration tools such as Airflow and DBT.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms using Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.

  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real-time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data (see the query sketch after this list).
  • Oversee the deployment and monitoring of large database clusters with a focus on performance and high availability.
  • Collaborate across departments to design and implement novel data models that enhance TRM’s products.
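
As a sketch of the low-latency querying described above, here is how one might query an Iceberg table through Trino from Python using the `trino` client package. The coordinator host, catalog, schema, and table names are placeholders, not a real deployment.

```python
import trino

# Placeholder coordinator and catalog; in practice these come from config.
conn = trino.dbapi.connect(
    host="trino.internal",
    port=8080,
    user="analytics",
    catalog="iceberg",
    schema="chain_data",  # hypothetical schema
)
cur = conn.cursor()
cur.execute(
    "SELECT chain, count(*) AS transfers "
    "FROM transfers "  # hypothetical Iceberg table
    "WHERE block_date = DATE '2024-01-01' "
    "GROUP BY chain"
)
for chain, transfers in cur.fetchall():
    print(chain, transfers)
```
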

Docker, Python, SQL, Blockchain, ETL, Kafka, Kubernetes, Airflow, Data engineering, Spark, Collaboration, Terraform, Documentation

Posted 20 days ago

📍 US

🔍 Blockchain intelligence and financial services

  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of hands-on experience architecting distributed systems.
  • Exceptional programming skills in Python.
  • Adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as ClickHouse, ElasticSearch, Postgres, Redis, and Neo4j.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, DBT, Luigi, Azkaban, and Storm.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools like Docker, Terraform, Kubernetes, and Datadog.

  • Build highly reliable data services to integrate with various blockchains.
  • Develop complex ETL pipelines for processing structured and unstructured data in real-time.
  • Design intricate data models to support optimal storage and retrieval with sub-second latency (see the caching sketch after this list).
  • Oversee large database cluster deployment and monitoring with a focus on performance.
  • Collaborate with data scientists, engineers, and product managers to enhance TRM's products.
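
One common way to hit sub-second retrieval targets like those above is a cache-aside layer in front of a slower store, which fits the Redis requirement in this listing. The sketch below uses redis-py with an illustrative key scheme and TTL; `load_profile_from_db` is a stand-in for the authoritative store (e.g., Postgres), not an actual TRM API.

```python
import json

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)


def load_profile_from_db(address: str) -> dict:
    # Stand-in for the slower authoritative store (e.g., Postgres).
    return {"address": address, "risk": "unknown"}


def get_address_profile(address: str) -> dict:
    key = f"profile:{address}"  # hypothetical key scheme
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit: served from memory
    profile = load_profile_from_db(address)
    r.setex(key, 300, json.dumps(profile))  # cache miss: backfill with a 5-minute TTL
    return profile


print(get_address_profile("0xabc123"))
```
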

Docker, Python, SQL, Blockchain, ElasticSearch, ETL, Kafka, Kubernetes, Airflow, Clickhouse, Data engineering, Postgres, Redis, Spark, Collaboration, Terraform, Documentation

Posted 20 days ago