Apply

Staff Software Engineer, Data

Posted about 10 hours ago

💎 Seniority level: Staff, 8+ years of experience in Software Engineering

🔍 Industry: Healthcare and life sciences

🏢 Company: Paradigm Health

⏳ Experience: 8+ years of experience in Software Engineering

Requirements:
  • Extensive experience driving technical projects, prioritizing work, identifying dependencies, and facilitating technical decisions across teams.
  • Deep expertise in SQL, data analysis, and data modeling, with a strong understanding of distributed systems and cloud environments.
  • Proven track record in designing systems that are scalable, available, maintainable, and performant.
  • Proficiency in coding (Python or other languages) and experience with ETL orchestration tools like Apache Airflow or Dagster.
  • Familiarity with data transformation tools such as dbt or Dataform.
  • Experience with data warehouse technologies like BigQuery, Redshift, or Snowflake.
  • 8+ years of experience in Software Engineering, with a focus on data, cloud computing, or distributed systems.
Responsibilities:
  • Foster a continuous improvement mindset and provide mentorship to your engineering team.
  • Collaborate closely with product managers, data scientists, analysts, and engineering colleagues to translate complex business problems into efficient technical solutions.
  • Shape the engineering vision while owning the data models and toolchain for our data platform.
  • Design, develop, and optimize data models to meet the needs of the data team’s consumers.
  • Build and maintain robust batch and streaming pipelines, leveraging cloud technologies and distributed systems.
  • Develop tools and processes that make data accessible across products and for data consumers.
  • Enhance the maturity of our monitoring systems to improve visibility and failure detection within our infrastructure.
  • Become a subject matter expert on our diverse datasets and associated tooling.
  • Ensure privacy and security in the governance of our data in collaboration with cross-functional teams.
  • Advocate for a scalable, maintainable data platform that supports our evolving business needs.
Apply

Related Jobs

Apply

🧭 Full-Time

🔍 Blockchain intelligence and financial services

  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • A proven track record with 5+ years of hands-on experience in architecting scalable APIs and distributed systems.
  • Exceptional programming skills in Python, as well as adeptness in SQL or SparkSQL.
  • Experience with data stores such as BigQuery and Postgres.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow and DBT.
  • Expertise in data processing technologies and streaming workflows including Dataflow, Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.

  • Build highly scalable features that integrate with dozens of blockchains.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Collaborate across departments to design and implement novel data models that enhance TRM’s products.
Posted 11 days ago
Apply

🧭 Full-Time

🔍 Blockchain intelligence and financial technology

  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of hands-on experience in architecting scalable APIs and distributed systems.
  • Exceptional programming skills in Python, with proficiency in SQL or SparkSQL.
  • Experience with data stores such as BigQuery and Postgres.
  • Proficiency in data workflow tools like Airflow and DBT.
  • Expertise in data processing technologies and streaming workflows including Dataflow, Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms using tools like Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.

  • Build highly scalable features that integrate with multiple blockchains.
  • Design and architect data models for optimal storage and retrieval supporting sub-second latency for blockchain data querying.
  • Collaborate with data scientists, backend engineers, and product managers to implement new data models.
Posted 11 days ago
Apply

🔍 Blockchain intelligence and financial technology

🏢 Company: TRM Labs 👥 101-250 💰 $70,000,000 Series B about 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of hands-on experience in architecting scalable APIs and distributed systems.
  • Exceptional programming skills in Python and proficiency in SQL or SparkSQL.
  • Experience with data stores like BigQuery and Postgres.
  • Proficiency in data pipeline tools such as Airflow and DBT.
  • Expertise in data processing technologies and streaming workflows, including Dataflow, Spark, Kafka, and Flink.
  • Competence in deploying and monitoring cloud infrastructure using Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming large datasets.

  • Build highly scalable features that integrate with various blockchains.
  • Design and architect data models for optimal storage and real-time querying.
  • Collaborate with data scientists, backend engineers, and product managers.
Posted 19 days ago
Apply

📍 United States, India, United Kingdom

🔍 B2B Technology

🏢 Company: Demandbase 👥 501-1000 💰 $175,000,000 Debt Financing almost 2 years ago · Sales Automation, Advertising, Big Data, SaaS, Analytics, B2B, Marketing, Marketing Automation, Software

  • Bachelor’s or master’s degree in Computer Science, Mathematics, or Statistics from a top engineering institution.
  • 4+ years of data engineering experience in building enterprise data/analytics solutions.
  • Practical experience with complex analytics projects and advanced SQL for data analysis.
  • Strong practical experience in databases, Advanced SQL, and Python/R.
  • Good understanding of data strategies and data model design.

  • Design, model, and implement data analysis and analytics solutions.
  • Contribute hands-on to data projects involving high-level design, analysis, experiments, data architecture, and data modeling.
  • Support ETL pipeline modules through effective data transformation, data cleaning, reporting, and statistical analysis.
  • Apply analysis techniques such as segmentation, regression, clustering, and data profiling to analyze trends and report KPIs.
  • Collaborate with cross-functional teams in an Agile setting to build a scalable, high-availability data analytics platform.

Python, SQL, Agile, Data Analysis, ETL, Java, Javascript, Product Development, Data engineering, Spark, Communication Skills, Problem Solving, Data modeling

Posted 19 days ago
Apply

📍 US

🧭 Full-Time

💸 200,000 - 255,000 USD per year

🔍 Financial services, Blockchain

  • A Bachelor's degree (or equivalent) in Computer Science or a related field.
  • A proven track record with 8+ years of hands-on experience in architecting distributed systems.
  • Exceptional programming skills in Python and adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, and StarRocks.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow and DBT.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms using Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.

  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real-time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with a focus on performance and high availability.
  • Collaborate across departments to design and implement novel data models that enhance TRM’s products.

Docker, Python, SQL, Blockchain, ETL, Kafka, Kubernetes, Airflow, Data engineering, Spark, Collaboration, Terraform, Documentation

Posted 19 days ago
Apply

📍 US

🔍 Blockchain intelligence and financial services

  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of hands-on experience in architecting distributed systems.
  • Exceptional programming skills in Python.
  • Adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as ClickHouse, ElasticSearch, Postgres, Redis, and Neo4j.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow, DBT, Luigi, Azkaban, and Storm.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools like Docker, Terraform, Kubernetes, and Datadog.

  • Build highly reliable data services to integrate with various blockchains.
  • Develop complex ETL pipelines for processing structured and unstructured data in real-time.
  • Design intricate data models to support optimal storage and retrieval with sub-second latency.
  • Oversee large database cluster deployment and monitoring with a focus on performance.
  • Collaborate with data scientists, engineers, and product managers to enhance TRM's products.

Docker, Python, SQL, Blockchain, ElasticSearch, ETL, Kafka, Kubernetes, Airflow, Clickhouse, Data engineering, Postgres, Redis, Spark, Collaboration, Terraform, Documentation

Posted 19 days ago
Apply

📍 United States

🔍 Marketing technology / E-commerce

🏢 Company: Northbeam 👥 11-50 💰 $15,000,000 Series A over 2 years ago · Machine Learning, Information Technology

  • Bachelor's degree in Mathematics, Computer Science, or a highly quantitative STEM field.
  • Strong understanding of statistics, linear algebra, probability, and machine learning.
  • Experience with regression, time series analysis, causal inference, statistical testing, Bayesian modeling, optimization, or signal processing.
  • Strong programming skills to implement data science solutions in production code.
  • Expertise in Python and experience with differentiable or probabilistic programming (e.g., TensorFlow, PyTorch).
  • Experience with cloud infrastructure (e.g., Google Cloud Platform, Azure, AWS).
  • Expertise in software architecture, distributed systems architecture, or data engineering.

  • Lead research and development for new and existing data science powered products at Northbeam.
  • Collaborate with engineering and product teams to transition working proofs of concept into production.
  • Provide technical expertise and support for non-technical stakeholders.
  • Educate colleagues in other departments about data science applications in marketing and advertising.

AWS, Python, Machine Learning, PyTorch, Software Architecture, Azure, Data engineering, Data science, Tensorflow, Collaboration

Posted 2 months ago
Apply

📍 United States

🧭 Full-Time

💸 $240,000 - $270,000 per year

🔍 Blockchain intelligence data platform

  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of experience in building distributed system architecture, with a particular focus on incremental updates from inception to production.
  • Strong programming skills in Python and SQL.
  • Deep technical expertise in advanced data structures and algorithms for incremental updating of data stores (e.g., Graphs, Trees, Hash Maps).
  • Comprehensive knowledge across all facets of data engineering, including implementing and managing incremental updates in data stores like BigQuery, Snowflake, RedShift, Athena, Hive, and Postgres.
  • Orchestrating data pipelines and workflows focused on incremental processing using tools such as Airflow, DBT, Luigi, Azkaban, and Storm.
  • Developing and optimizing data processing technologies and streaming workflows for incremental updates (e.g., Spark, Kafka, Flink).
  • Deploying and monitoring scalable, incremental update systems in public cloud environments (e.g., Docker, Terraform, Kubernetes, Datadog).
  • Expertise in loading, querying, and transforming large datasets with a focus on efficiency and incremental growth.

  • Design and build our Cloud Data Warehouse with a focus on incremental updates to improve cost efficiency and scalability.
  • Research innovative methods to incrementally optimize data processing, storage, and retrieval to support efficient data analytics and insights.
  • Develop and maintain ETL pipelines that transform and incrementally process petabytes of structured and unstructured data to enable data-driven decision-making.
  • Collaborate with cross-functional teams to design and implement new data models and tools focused on accelerating innovation through incremental updates.
  • Continuously monitor and optimize the Data Platform's performance, focusing on enhancing cost efficiency, scalability, and reliability.

Docker, Python, SQL, ETL, Kafka, Kubernetes, Machine Learning, Snowflake, Airflow, Algorithms, Data engineering, Data science, Data Structures, Postgres, Spark, Collaboration, Terraform, Data analytics

Posted 3 months ago
Apply

📍 United States

💸 200,000 - 255,000 USD per year

🔍 Blockchain intelligence and financial services

🏢 Company: TRM Labs 👥 101-250 💰 $70,000,000 Series B about 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

  • A Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 8+ years of hands-on experience in architecting distributed systems.
  • Exceptional programming skills in Python and proficiency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, and StarRocks.
  • Proficiency in tools like Airflow and DBT for data pipeline orchestration.
  • Expertise in technologies like Spark, Kafka, and Flink.
  • Competence in deploying and monitoring cloud infrastructure with tools like Docker and Terraform.
  • Proven ability in managing extensive datasets.

  • Build highly reliable data services to integrate with blockchains.
  • Develop complex ETL pipelines for real-time data processing.
  • Design intricate data models for optimal storage and retrieval.
  • Oversee deployment and monitoring of large database clusters.
  • Collaborate with data scientists and engineers to enhance products.

Docker, Python, SQL, Blockchain, ETL, Kafka, Kubernetes, Airflow, Data engineering, Spark, Collaboration, Terraform

Posted 4 months ago
Apply

📍 United States

🧭 Full-Time

💸 240,000 - 270,000 USD per year

🔍 Blockchain intelligence and financial services

🏢 Company: TRM Labs 👥 101-250 💰 $70,000,000 Series B about 2 years ago · Cryptocurrency, Compliance, Blockchain, Big Data

  • Bachelor's degree (or equivalent) in Computer Science or related field.
  • 5+ years of experience in building distributed system architecture, focusing on incremental updates.
  • Strong programming skills in Python and SQL.
  • Deep technical expertise in advanced data structures and algorithms for incremental updating.
  • Knowledge of data stores like BigQuery, Snowflake, RedShift, and more.
  • Experience orchestrating data pipelines using Airflow, DBT, Luigi, etc.
  • Proficiency in data processing technologies like Spark, Kafka, Flink.
  • Ability to deploy and monitor systems in public cloud environments.

  • Design and build our Cloud Data Warehouse focusing on incremental updates for cost efficiency.
  • Research methods to optimize data processing, storage, and retrieval.
  • Develop and maintain ETL pipelines for structured and unstructured data.
  • Collaborate with teams on new data models and tools for innovation.
  • Continuously monitor and optimize performance for cost, scalability, and reliability.

Docker, Python, SQL, ETL, Kafka, Kubernetes, Snowflake, Airflow, Algorithms, Data engineering, Data Structures, Postgres, Spark, Collaboration, Terraform

Posted 5 months ago
Apply

Related Articles

Posted 4 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 4 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 4 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 4 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Posted 4 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.