
Staff Software Engineer, Data Platform

Posted over 1 year ago


📍 Location: New York, NY; Remote, US

💸 Salary: $195,000 - $300,000

🔍 Industry: Virtual clinic for women's and family health

🗣️ Languages: English

Requirements:
  • Bachelor's or Master's degree in Computer Science or a related field.
  • 8+ years of backend development and platform architecture experience.
  • Expertise in developing cloud-based solutions.
  • Proficiency in multiple programming languages.
  • Experience leading technical design discussions and mentoring junior engineers.
Responsibilities:
  • Lead the design, development, and maintenance of scalable, performant, and reliable systems for Maven's data platform teams.

Related Jobs


📍 US

🧭 Full-Time

💸 200000 - 255000 USD per year

🔍 Financial services, Blockchain

  • A Bachelor's degree (or equivalent) in Computer Science or a related field.
  • A proven track record with 8+ years of hands-on experience architecting distributed systems.
  • Exceptional programming skills in Python and adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, and StarRocks.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow and DBT (a minimal DAG sketch follows this list).
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms using Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
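A minimal sketch of the kind of Airflow orchestration this listing names, assuming Airflow 2.4+ and a hypothetical daily load task; it is illustrative only, not the company's actual pipeline code.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_blocks(**context):
    # Placeholder extract/load step; a real task would pull block data from an
    # upstream source and write it into the warehouse (e.g. Iceberg via Trino).
    print(f"Loading blocks for {context['ds']}")


# Hypothetical DAG id and schedule; the `schedule` argument requires Airflow 2.4+.
with DAG(
    dag_id="daily_block_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_blocks", python_callable=load_blocks)
```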

  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time (see the streaming sketch after this list).
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with a focus on performance and high availability.
  • Collaborate across departments to design and implement novel data models that enhance TRM’s products.
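The responsibilities above center on real-time ETL with Spark and Kafka; below is a hedged sketch of such a streaming read-and-transform step, using placeholder broker and topic names and a console sink rather than the real warehouse targets.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Requires the spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("block-stream-etl").getOrCreate()

# Read raw messages from Kafka; bootstrap servers and topic are placeholders.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "blocks")
    .load()
)

# Kafka delivers keys and values as binary; cast them to strings before any
# downstream parsing or enrichment.
events = raw.select(
    col("key").cast("string").alias("block_key"),
    col("value").cast("string").alias("payload"),
)

# A production job would write to the warehouse (e.g. Iceberg tables); the
# console sink keeps this sketch self-contained.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```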

Docker · Python · SQL · Blockchain · ETL · Kafka · Kubernetes · Airflow · Data engineering · Spark · Collaboration · Terraform · Documentation

Posted 8 days ago

📍 US

🔍 Blockchain intelligence and financial services

  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of hands-on experience architecting distributed systems.
  • Exceptional programming skills in Python.
  • Adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as ClickHouse, ElasticSearch, Postgres, Redis, and Neo4j.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, DBT, Luigi, Azkaban, and Storm.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools like Docker, Terraform, Kubernetes, and Datadog.

  • Build highly reliable data services to integrate with various blockchains.
  • Develop complex ETL pipelines for processing structured and unstructured data in real-time.
  • Design intricate data models to support optimal storage and retrieval with sub-second latency.
  • Oversee large database cluster deployment and monitoring with a focus on performance.
  • Collaborate with data scientists, engineers, and product managers to enhance TRM's products.

Docker · Python · SQL · Blockchain · ElasticSearch · ETL · Kafka · Kubernetes · Airflow · ClickHouse · Data engineering · Postgres · Redis · Spark · Collaboration · Terraform · Documentation

Posted 8 days ago

📍 United States

💸 200000 - 255000 USD per year

🔍 Blockchain intelligence and financial services

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B about 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

  • A Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 8+ years of hands-on experience in architecting distributed systems.
  • Exceptional programming skills in Python and proficiency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, and StarRocks.
  • Proficiency in tools like Airflow and DBT for data pipeline orchestration.
  • Expertise in technologies like Spark, Kafka, and Flink.
  • Competence in deploying and monitoring cloud infrastructure with tools like Docker and Terraform.
  • Proven ability in managing extensive datasets.

  • Build highly reliable data services to integrate with blockchains.
  • Develop complex ETL pipelines for real-time data processing.
  • Design intricate data models for optimal storage and retrieval.
  • Oversee deployment and monitoring of large database clusters.
  • Collaborate with data scientists and engineers to enhance products.

Docker · Python · SQL · Blockchain · ETL · Kafka · Kubernetes · Airflow · Data engineering · Spark · Collaboration · Terraform

Posted 4 months ago

Related Articles

Remote Job Certifications and Courses to Boost Your Career

Posted 4 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

How to Balance Work and Life While Working Remotely

Posted 4 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Weekly Digest: Remote Jobs News and Trends (August 11 - August 18, 2024)

Posted 4 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

How to Onboard Remote Employees Successfully

Posted 4 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Remote Work Statistics and Insights for 2024

Posted 4 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.