
Software Engineer, Data Platform

Posted 4 days ago


💎 Seniority level: Senior, 3+ years

💸 Salary: 170,000 - 190,000 USD per year

🔍 Industry: Software Development

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

🗣️ Languages: English

⏳ Experience: 3+ years

Requirements:
  • A proven track record, with 3+ years of hands-on experience architecting distributed systems and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as fluency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools such as Airflow and dbt.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
Responsibilities:
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
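The ETL responsibilities above follow the standard extract-transform-load pattern. As a rough, hedged sketch only (plain Python functions standing in for Airflow/Spark operators, with an entirely hypothetical transfer schema and field names), one batch step of such a pipeline might look like:

```python
from dataclasses import dataclass

@dataclass
class Transfer:
    """Normalized record for a single on-chain transfer (hypothetical schema)."""
    chain: str
    tx_hash: str
    amount_usd: float

def extract(raw_events):
    """Extract: pull only the fields the pipeline cares about from raw events."""
    return [(e["chain"], e["tx"], e["value"], e["usd_rate"]) for e in raw_events]

def transform(rows):
    """Transform: convert native amounts to USD and drop dust transfers."""
    out = []
    for chain, tx, value, rate in rows:
        usd = value * rate
        if usd >= 0.01:  # filter out negligible ("dust") transfers
            out.append(Transfer(chain=chain, tx_hash=tx, amount_usd=round(usd, 2)))
    return out

def load(records, store):
    """Load: append normalized records to a target store (a plain list here)."""
    store.extend(records)
    return len(records)

# Usage: run one batch through the three stages.
store = []
raw = [
    {"chain": "bitcoin", "tx": "0xabc", "value": 0.5, "usd_rate": 60000.0},
    {"chain": "ethereum", "tx": "0xdef", "value": 0.000000001, "usd_rate": 3000.0},
]
loaded = load(transform(extract(raw)), store)  # only the first record survives
```

In a production Airflow deployment each stage would typically become its own task so that failures retry independently; the staging format and dust threshold here are illustrative assumptions, not anything specified by the listing.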

Related Jobs


📍 USA

🧭 Full-Time

🔍 Software Development

🏢 Company: Mysten Labs · 👥 11-50 · 💰 $300,000,000 Series B over 2 years ago · Cryptocurrency, Blockchain, Web3, Software

  • 8+ years of software engineering experience focused on systems architecture, databases, and performance optimization, with programming experience in Rust or C++.
  • Extensive experience building and scaling high-throughput systems.
  • Strong knowledge of data storage architectures, database technologies (SQL, NoSQL), and distributed computing.
  • Postgres experience is highly desirable.
  • Expertise in performance tuning and optimizing both system architecture and low-level services.
  • Ability to lead technically complex projects and drive solutions from concept to production.
  • Build a robust, high-performance indexing framework that supports both real-time and batch processing needs.
  • Optimize Sui’s data infrastructure end to end: write performance, storage footprint, read performance, scaling, reliability, and costs.
  • Collaborate with cross-functional teams and external partners to ensure seamless integration of data platform solutions with first-party applications and the ecosystem at large.
  • Serve as the technical lead on key projects, ensuring the delivery of high-quality solutions on time.

Backend Development, PostgreSQL, SQL, Blockchain, Software Architecture, C++, Algorithms, Data Engineering, Data Structures, NoSQL, Rust, CI/CD, RESTful APIs, Data Modeling, Software Engineering, Data Management

Posted 13 days ago

🧭 Full-Time

🔍 Software Development

  • 5+ years of hands-on experience architecting distributed systems and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as fluency in SQL or SparkSQL.
  • In-depth experience with data stores such as ClickHouse, Elasticsearch, Postgres, Redis, and Neo4j.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, dbt, Luigi, Azkaban, and Storm.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
Posted 14 days ago

🧭 Full-Time

💸 170,000 - 190,000 USD per year

🔍 Software Development

  • 3+ years of hands-on experience architecting distributed systems and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as fluency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools such as Airflow and dbt.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
Posted 14 days ago

🧭 Full-Time

💸 240,000 - 265,000 USD per year

🔍 Software Development

  • A proven track record, with 7+ years of hands-on experience architecting distributed systems and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as fluency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools such as Airflow and dbt.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
Posted 17 days ago

🧭 Full-Time

🔍 Software Development

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

  • 5+ years of hands-on experience architecting distributed systems and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as fluency in SQL or SparkSQL.
  • In-depth experience with data stores such as ClickHouse, Elasticsearch, Postgres, Redis, and Neo4j.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, dbt, Luigi, Azkaban, and Storm.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
Posted 18 days ago

📍 USA

💸 176,000 - 207,000 USD per year

🔍 Software Development

🏢 Company: Abnormal Security · 👥 501-1000 · 💰 $250,000,000 Series D 8 months ago · Artificial Intelligence (AI), Email, Information Technology, Cyber Security, Network Security

  • 6+ years of experience working on data-intensive applications and distributed systems.
  • Strong background in building platforms, tools, and systems to serve the needs of engineering teams, empowering them to ship with velocity and reliability.
  • Depth in one or more types of online data storage systems (relational, key-value, document-oriented, or columnar databases), either as an expertise provider or as a power user, in a high-scale environment.
  • Proven track record of being a change agent, defining and driving engineering initiatives that involve cross-team collaboration.
  • Expertise in one or more key areas of the Data Platform online storage tech stack: relational DBs, RocksDB, Elasticsearch, Redis, Kafka.
  • Excellent ability and strong desire to onboard and mentor other engineers.
  • Define and deliver our data platform offerings across AWS to empower and delight our internal engineering and data science customers.
  • Build tooling and services for our online storage platforms that will deliver self-serve capabilities for the team’s stakeholders.
  • Work cross-functionally with other engineering teams to identify scalability gaps in their data-intensive systems, owning key areas from our charter and building and executing on plans to mitigate them.
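The online storage stack named above (Redis alongside primary databases) is commonly fronted by a cache-aside read path. A minimal sketch, assuming an in-memory dict standing in for Redis and a `loader` callable standing in for the primary database (both names are illustrative, not from the listing):

```python
class CacheAsideStore:
    """Cache-aside read path: check the cache first, fall back to the
    backing store on a miss, then populate the cache so subsequent
    reads of the same key are served locally."""

    def __init__(self, loader):
        self._cache = {}      # stands in for Redis
        self._loader = loader # stands in for the primary database read
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._cache:
            self.hits += 1
            return self._cache[key]
        self.misses += 1
        value = self._loader(key)  # expensive read from the primary store
        self._cache[key] = value   # populate the cache for later reads
        return value

# Usage: the second read of the same key is served from the cache.
db = {"user:1": {"name": "ada"}}
store = CacheAsideStore(loader=db.get)
first = store.get("user:1")   # miss -> loads from db
second = store.get("user:1")  # hit -> served from cache
```

A production version would add TTLs and invalidation on writes; those concerns are deliberately omitted here.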

AWS, Backend Development, PostgreSQL, SQL, DynamoDB, Kafka, Cross-functional Team Leadership, Algorithms, Data Engineering, Data Structures, Redis, Communication Skills, CI/CD, Problem Solving, Mentoring, Microservices, Data Modeling, Scripting, Software Engineering

Posted 19 days ago

📍 United States

🧭 Full-Time

💸 240,000 - 265,000 USD per year

🔍 Software Development

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

  • 7+ years of hands-on experience architecting distributed systems and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as fluency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools such as Airflow and dbt.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.

AWS, Docker, Python, SQL, Cloud Computing, ETL, Kafka, Kubernetes, Airflow, Data Engineering, Postgres, Spark, Terraform, Data Modeling

Posted 20 days ago

📍 CA, CO, ID, IL, FL, GA, MA, MI, MN, MO, NJ, NV, NY, OR, TX, UT, WA

🧭 Full-Time

💸 160,000 - 210,000 USD per year

🔍 Software Development

🏢 Company: Liftoff · 👥 501-1000 · 💰 Private over 4 years ago · Advertising Platforms, Big Data, Mobile Advertising, App Marketing, Ad Retargeting, Mobile, Ad Network

  • Very strong coding ability
  • Solid core CS fundamentals
  • 8+ years software engineering experience
  • Experience building and debugging high scale distributed systems
  • B.S. or higher in Computer Science or equivalent experience
  • Build and have end-to-end ownership of core engines
  • Improve tooling and infrastructure for rapid deployment
  • Partner with product team for strategy planning
  • Contribute to an engineering excellence culture

Backend Development, Cloud Computing, Algorithms, Data Structures, Software Engineering, Debugging

Posted about 1 month ago

📍 Romania, Poland, Italy, Spain

🧭 Full-Time

🔍 Software Development

🏢 Company: YouGov · 👥 501-1000 · 💰 $293,437,993 Post-IPO Debt over 1 year ago · 🫂 Last layoff 5 months ago · Internet, Database, Ad Targeting, Consulting, Business Intelligence, Internet of Things, Big Data, Market Research, Analytics, Software

  • Extensive enterprise experience solving complex problems using multiple database systems, data lake architectures, and query engines.
  • Proven experience with open data tools such as Apache Arrow, Pandas, and Polars.
  • A record of successful delivery of SaaS and cloud-based applications.
  • Strong understanding of the software development lifecycle.
  • Extensive programming experience using Python as a programming language.
  • A commitment to producing robust, testable code.
  • Results-driven, self-motivated, and enthusiastic.
  • Excellent communication skills – verbal, written, and presentation.
  • Develop effective ways to store, query, and interactively analyze large datasets that contain millions of rows and hundreds of thousands of columns.
  • Work closely with product managers, sales, and customer success teams to understand the system’s functional and non-functional requirements.
  • Contribute to code quality through unit testing, integration testing, code review, and system design using Python.
  • Establish realistic estimates for timelines and ensure that projects remain on target to meet deadlines.
  • Assist in diagnosing and quickly fixing system failures in your area of expertise, limited to occasions when the on-call rotation needs a subject-matter expert to help troubleshoot an issue.
  • Design and implement RESTful API endpoints using the Python programming language.
  • Break down complex problems to identify key variables and make informed decisions based on thorough analysis.
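Interactive analysis over tables with hundreds of thousands of columns is usually served by columnar layouts, which is why the listing names Arrow, Pandas, and Polars. As a stdlib-only sketch of the idea (a toy class, not Arrow's or Polars' actual API; all names are illustrative), storing each column contiguously means a query touching two columns never scans the rest:

```python
from array import array

class ColumnStore:
    """Toy column-oriented table: each column is a contiguous typed
    array, so selective queries read only the columns they reference."""

    def __init__(self):
        self._cols = {}

    def add_column(self, name, values):
        # "d" = C double; a real engine would support many types.
        self._cols[name] = array("d", values)

    def filter_sum(self, where_col, threshold, sum_col):
        """Roughly: SELECT SUM(sum_col) WHERE where_col > threshold."""
        keys = self._cols[where_col]
        vals = self._cols[sum_col]
        return sum(v for k, v in zip(keys, vals) if k > threshold)

# Usage: only the "age" and "spend" columns are touched by the query,
# regardless of how many other columns the table holds.
t = ColumnStore()
t.add_column("age", [25, 41, 37])
t.add_column("spend", [10.0, 20.0, 30.0])
total = t.filter_sum("age", 30, "spend")  # rows with age 41 and 37 qualify
```

Production engines add compression, vectorized kernels, and lazy query planning on top of this layout; the sketch only shows why width does not penalize narrow queries.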

Backend Development, Python, SQL, Data Analysis, API Testing, Data Engineering, REST API, Pandas, Data Modeling, SaaS

Posted about 1 month ago

📍 United States, Canada

🧭 Full-Time

💸 140,000 - 160,000 USD per year

🔍 Fraud Prevention and AML Compliance

🏢 Company: Sardine · 👥 101-250 · 💰 $70,000,000 Series C about 2 months ago · Cryptocurrency, Fraud Detection, FinTech, Software

  • 5+ years of experience in backend or data engineering roles
  • Strong knowledge of database systems (SQL and NoSQL)
  • Expertise in a modern programming language (Go, Python, Java)
  • Familiarity with cloud platforms (AWS, GCP, Azure)
  • Experience with containerization (Docker, Kubernetes)
  • Design and implement ETL pipelines for large datasets
  • Develop and optimize APIs for data retrieval
  • Architect and manage scalable storage solutions
  • Collaborate on data product development
  • Perform data analysis for client value
  • Document processes and mentor junior engineers

AWS, Docker, Python, SQL, DynamoDB, Elasticsearch, ETL, GCP, Kubernetes, NoSQL, CI/CD

Posted about 2 months ago

Related Articles

Posted about 1 month ago

Why is remote work such a nice opportunity?

Why is remote work so nice? Let's try to see!

Posted 7 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 7 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 8 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 8 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.