Apply

Staff Software Engineer, Data

Posted 5 months ago

💎 Seniority level: Staff, 11+ years

📍 Location: U.S.

💸 Salary: 190,000 - 244,000 USD per year

🔍 Industry: Commercial aviation

🏢 Company: Acubed · 👥 51-100 · Innovation Management · Aerospace · Manufacturing

🗣️ Languages: English

⏳ Experience: 11+ years

🪄 Skills: Python · SQL · ETL · Machine Learning · Data engineering · Pandas · Data modeling

Requirements:
  • Bachelor's or Master's degree in Computer Science, Software Engineering, or related field.
  • Strong coding skills in Python (Pandas, PySpark) and experience with SQL and analytics databases.
  • 11+ years of experience in software/data engineering, focusing on data quality and ownership.
  • Deep understanding of data pipelines with experience in building and optimizing them.
  • Familiarity with sensor data processing (visual, IR cameras, LiDAR) and embeddings.
  • Proficiency in machine learning-based labeling techniques.
  • Experience developing metrics and dashboards for data performance tracking.
  • Knowledge of data management technologies (e.g., ETL, data lakes).
  • Excellent problem-solving skills and attention to detail in safety-critical environments.
Responsibilities:
  • Identify and address gaps in data coverage for robust ML model performance.
  • Develop and implement metrics to monitor and improve data quality (see the Pandas sketch after this list).
  • Collaborate with data scientists and ML engineers to refine datasets.
  • Implement machine learning-based labeling techniques for sensor data accuracy.
  • Design and maintain data pipelines for high-quality data ingestion.
  • Partner with cross-functional teams to enhance data availability.
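
As a rough illustration of the data-quality metrics work this role describes, here is a minimal Pandas sketch; the column names, sample sensor fields, and the 10% null budget are hypothetical, not Acubed's actual pipeline:

```python
# Minimal sketch of per-column data-quality metrics with Pandas.
# Column names, sample values, and the 10% null budget are hypothetical.
import pandas as pd

def quality_metrics(df: pd.DataFrame) -> pd.DataFrame:
    """Compute simple completeness and cardinality metrics per column."""
    return pd.DataFrame({
        "null_rate": df.isna().mean(),           # fraction of missing values
        "unique_ratio": df.nunique() / len(df),  # distinct values / row count
    })

frames = pd.DataFrame({
    "frame_id": [1, 2, 3, 4],
    "lidar_range_m": [42.0, None, 37.5, 41.2],
    "label": ["runway", "runway", None, "taxiway"],
})

report = quality_metrics(frames)
# Flag columns whose null rate exceeds the budget; these are coverage gaps.
print(report[report["null_rate"] > 0.10])
```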
Apply

Related Jobs

Apply

📍 USA

🧭 Full-Time

🔍 Software Development

🏢 Company: Mysten Labs · 👥 11-50 · 💰 $300,000,000 Series B over 2 years ago · Cryptocurrency · Blockchain · Web3 · Software

  • 8+ years of software engineering experience with a focus on systems architecture, databases, and performance optimization with experience programming in either Rust or C++.
  • Extensive experience building and scaling high-throughput systems.
  • Strong knowledge of data storage architectures, database technologies (SQL, NoSQL), and distributed computing.
  • Postgres experience is highly desirable.
  • Expertise in performance tuning and optimizing both system architecture and low-level services.
  • Ability to lead technically complex projects and drive solutions from concept to production.
  • Build a robust, high-performance indexing framework that supports both real-time and batch processing needs (see the sketch after this list).
  • Optimize Sui’s data infrastructure from end-to-end: write performance, storage footprint, read performance, scaling, reliability, and costs.
  • Collaborate with cross-functional teams and external partners to ensure seamless integration of data platform solutions with first-party applications and the ecosystem at large.
  • Serve as the technical lead on key projects, ensuring the delivery of high-quality solutions on time.
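
As a rough illustration of an indexer serving both real-time and batch ingestion, here is a minimal Python sketch; the `IndexEntry`/`Indexer` names and the last-writer-wins rule are hypothetical, not Sui's actual framework:

```python
# Hypothetical sketch of one write path shared by batch backfill and
# real-time ingestion; names are illustrative, not Sui's actual framework.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class IndexEntry:
    object_id: str
    checkpoint: int
    payload: dict

class Indexer:
    def __init__(self) -> None:
        self._by_object: dict[str, IndexEntry] = {}

    def _apply(self, entry: IndexEntry) -> None:
        # Last-writer-wins by checkpoint keeps a historical backfill and a
        # live stream from clobbering each other's newer data.
        current = self._by_object.get(entry.object_id)
        if current is None or entry.checkpoint >= current.checkpoint:
            self._by_object[entry.object_id] = entry

    def ingest_event(self, entry: IndexEntry) -> None:
        """Real-time path: apply one entry as it arrives."""
        self._apply(entry)

    def ingest_batch(self, entries: Iterable[IndexEntry]) -> None:
        """Batch path: replay a historical range through the same logic."""
        for entry in entries:
            self._apply(entry)
```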

Backend Development · PostgreSQL · SQL · Blockchain · Software Architecture · C++ · Algorithms · Data engineering · Data Structures · NoSQL · Rust · CI/CD · RESTful APIs · Data modeling · Software Engineering · Data management

Posted 15 days ago
Apply

📍 United States

🧭 Full-Time

🔍 Software Development

🏢 Company: NerdWallet · 👥 501-1000 · 💰 Secondary Market almost 4 years ago · 🫂 Last layoff 8 months ago · Internet · Consumer · Financial Services · Personal Finance

  • 8+ years in software engineering, with a strong background in backend development, distributed systems and data pipelines.
  • 3+ years of experience with AWS, Snowflake, DBT, Airflow or any other compatible systems (see the Airflow sketch below).
  • 3+ years of experience building APIs and building scalable backend systems.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
  • Proficiency in modern programming languages such as Java, Python or Typescript.
  • Experience with microservices architecture, RESTful APIs, and cloud infrastructure (AWS, GCP, or Azure).
  • Strong understanding of database systems (both SQL and NoSQL), with experience in high-volume data processing.
  • Knowledge of security best practices, particularly in financial services.
  • Familiarity with CI/CD pipelines, containerization, and orchestration technologies like Docker and Kubernetes.
  • Experience in consumer credit, lending, loans, or insurance, with a solid understanding of industry regulations, underwriting processes, and risk assessment.
  • Driving meaningful, real revenue for NerdWallet’s CLAW division
  • Serving as a mentor to the engineers on the team you are assigned to
  • Serving as a trusted advisor and tech lead for our Engineering Managers
  • Delivering a high volume of features to production with high quality, serving as an example to the team of what good looks like
  • Helping to drive our existing culture towards better engineering practices
  • Helping to drive our existing culture towards strong continuous improvement thinking
  • Helping to drive our existing culture towards failing fast and repairing faster
  • Partnering with Management to continuously develop the engineering roadmap for our team
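
For flavor, a minimal Airflow DAG of the extract-then-dbt shape this listing references; it assumes Airflow 2.4+, and the DAG id, task ids, and dbt command are hypothetical:

```python
# Minimal Airflow DAG sketch (Airflow 2.4+): extract, then run dbt models.
# The dag_id, task ids, and dbt command are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_finance_models",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_to_warehouse",
        bash_command="echo 'load raw data into the warehouse here'",
    )
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select finance",  # hypothetical dbt selector
    )
    extract >> transform  # transform only runs after extraction succeeds
```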

AWS · Backend Development · Docker · Leadership · Python · SQL · Java · Kubernetes · Snowflake · TypeScript · Airflow · Data engineering · NoSQL · Communication Skills · Analytical Skills · CI/CD · Problem Solving · Agile methodologies · RESTful APIs · Mentoring · DevOps · Microservices · Teamwork · Data visualization · Data modeling · Software Engineering · Data analytics

Posted 20 days ago
Apply

📍 United States

🧭 Full-Time

💸 240,000 - 265,000 USD per year

🔍 Software Development

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

  • 8+ years of hands-on experience architecting scalable APIs and distributed systems, and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as BigQuery and Postgres.
  • Proficiency in data pipeline and workflow orchestration tools like Airflow and DBT.
  • Expertise in data processing technologies and streaming workflows including Dataflow, Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly scalable features that integrate with dozens of blockchains.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data (see the sketch after this list).
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
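
A minimal Python sketch of the write-time-aggregation idea behind sub-second reads: precompute per-address summaries so queries never scan raw transfer history. All names are hypothetical, not TRM's actual data model:

```python
# Illustrative read-optimized model: aggregate at write time so address
# lookups stay O(1) regardless of history size. Names are hypothetical.
from collections import defaultdict

class AddressSummary:
    """Maintains per-address aggregates; reads never scan raw transfers."""

    def __init__(self) -> None:
        self.balance = defaultdict(int)   # address -> current balance
        self.tx_count = defaultdict(int)  # address -> lifetime transfer count

    def apply_transfer(self, sender: str, receiver: str, amount: int) -> None:
        # Extra write-time work buys sub-second reads later.
        self.balance[sender] -= amount
        self.balance[receiver] += amount
        self.tx_count[sender] += 1
        self.tx_count[receiver] += 1

    def lookup(self, address: str) -> dict:
        return {"balance": self.balance[address],
                "tx_count": self.tx_count[address]}
```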

AWS · Backend Development · Docker · Python · SQL · Blockchain · Data Analysis · Kafka · Kubernetes · Software Architecture · Airflow · Algorithms · API testing · Data engineering · Data Structures · Postgres · REST API · Spark · Terraform · Microservices · JSON · Data modeling

Posted 22 days ago
Apply

📍 United States

🧭 Full-Time

💸 240,000 - 265,000 USD per year

🔍 Software Development

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

  • 7+ years of hands-on experience architecting distributed systems, and guiding projects from initial ideation through to successful production deployment.
  • Exceptional programming skills in Python, as well as adeptness in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus.
  • Proficiency in data pipeline and workflow orchestration tools such as Airflow and DBT.
  • Expertise in data processing technologies and streaming workflows including Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure within public cloud platforms, utilizing tools such as Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real-time (see the streaming sketch after this list).
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
  • Oversee the deployment and monitoring of large database clusters with an unwavering focus on performance and high availability.
  • Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
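
A hedged PySpark Structured Streaming sketch of the kind of real-time ETL step described above; the Kafka broker, topic, schema, and sink paths are all hypothetical:

```python
# Hedged PySpark Structured Streaming sketch: read Kafka, parse JSON,
# land partition-friendly Parquet. Broker, topic, schema, and paths are
# all hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import LongType, StringType, StructField, StructType

spark = SparkSession.builder.appName("blockchain-etl-sketch").getOrCreate()

schema = StructType([
    StructField("tx_hash", StringType()),
    StructField("block_number", LongType()),
    StructField("value", LongType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "raw-transactions")           # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("tx"))
    .select("tx.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://warehouse/transactions/")    # hypothetical sink
    .option("checkpointLocation", "s3a://warehouse/_chk/")
    .start()
)
```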

AWS · Docker · Python · SQL · Cloud Computing · ETL · Kafka · Kubernetes · Airflow · Data engineering · Postgres · Spark · Terraform · Data modeling

Posted 22 days ago
Apply

📍 United States, Canada

🧭 Full-Time

🔍 Software Development

  • 7+ years of software development experience
  • Experience with Java and Python applications
  • Current cloud technology experience with AWS and Kubernetes
  • Develop core functionality using cloud-native Java
  • Work with Data Science teams on machine learning solutions
  • Ensure secure, efficient solutions in a determined timeframe

AWS · Docker · PostgreSQL · Python · ETL · Java · Kafka · Machine Learning · Spring · CI/CD

Posted about 2 months ago
Apply

📍 United States

🧭 Full-Time

💸 240,000 - 270,000 USD per year

🔍 Blockchain intelligence and financial services

🏢 Company: TRM Labs · 👥 101-250 · 💰 $70,000,000 Series B over 2 years ago · Cryptocurrency · Compliance · Blockchain · Big Data

  • Bachelor's degree (or equivalent) in Computer Science or related field.
  • 5+ years of experience in building distributed system architecture, focusing on incremental updates.
  • Strong programming skills in Python and SQL.
  • Deep technical expertise in advanced data structures and algorithms for incremental updating.
  • Knowledge of data stores such as BigQuery, Snowflake, and Redshift.
  • Experience orchestrating data pipelines using Airflow, DBT, Luigi, etc.
  • Proficiency in data processing technologies like Spark, Kafka, Flink.
  • Ability to deploy and monitor systems in public cloud environments.
  • Design and build our Cloud Data Warehouse focusing on incremental updates for cost efficiency (see the sketch after this list).
  • Research methods to optimize data processing, storage, and retrieval.
  • Develop and maintain ETL pipelines for structured and unstructured data.
  • Collaborate with teams on new data models and tools for innovation.
  • Continuously monitor and optimize performance for cost, scalability, and reliability.
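
A minimal sketch of watermark-based incremental loading, the cost-saving idea behind incremental updates: read only rows newer than the last processed timestamp, then merge them. It uses sqlite3 for self-containment; the table and column names are hypothetical:

```python
# Minimal sketch of watermark-based incremental loading: read only rows
# newer than the last processed timestamp, then upsert. Uses sqlite3 for
# self-containment; table and column names are hypothetical, and
# warehouse.id is assumed to be the primary key.
import sqlite3

def incremental_load(conn: sqlite3.Connection, watermark: str) -> str:
    """Merge staging rows added since `watermark`; return the new watermark.
    Avoids the full-table rescans that make naive reloads expensive."""
    rows = conn.execute(
        "SELECT id, payload, updated_at FROM staging WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    for id_, payload, updated_at in rows:
        conn.execute(
            "INSERT INTO warehouse (id, payload, updated_at) "
            "VALUES (?, ?, ?) "
            "ON CONFLICT(id) DO UPDATE SET payload = excluded.payload, "
            "updated_at = excluded.updated_at",
            (id_, payload, updated_at),
        )
    conn.commit()
    return max((r[2] for r in rows), default=watermark)
```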

Docker · Python · SQL · ETL · Kafka · Kubernetes · Snowflake · Airflow · Algorithms · Data engineering · Data Structures · Postgres · Spark · Collaboration · Terraform

Posted 8 months ago
Apply