Motion MSK

Motion Musculoskeletal Institute is building the next generation of MSK care. We are on a mission to create a new gold standard for musculoskeletal care that delivers outstanding patient outcomes and experiences, that is scalable and standardized, and that aligns the incentives of payers, PCPs, and specialists. The current system forces patients to navigate a frustrating maze of provider types, treatment options, and insurance hoops: a confusing mess that leads to frustration, wasted dollars, misdiagnosis, and mistreatment. Our goal is to radically improve the way musculoskeletal issues are cared for within the broader US healthcare ecosystem, and we are looking for clinicians who are dedicated to high-quality, evidence-based medicine, interested in new models of care, and excited about using high-value technology to join us on our journey.


Jobs at this company:

Apply
🔥 Sr. Data Engineer
Posted 2 months ago

📍 United States

🧭 Full-Time

💸 $175,000 - $220,000 USD per year

🔍 Healthcare

Requirements:

  • 4+ years of experience working with big data processing frameworks such as Apache Spark (Scala or PySpark).
  • Hands-on experience building ETL pipelines for structured and semi-structured healthcare claims data.
  • Proficient in SQL, with experience working with distributed databases and data lakes (e.g., Delta Lake, Redshift, Snowflake).
  • Experience orchestrating workflows using Apache Airflow and managing data processing on AWS Glue or EMR (see the sketch after this list).
  • Comfortable working in a cloud-native environment (AWS preferred) and using infrastructure-as-code tools like Terraform.
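
The orchestration requirement above can be pictured with a minimal Airflow DAG. This is an illustrative sketch only, not Motion MSK's actual workflow: the DAG id, task names, and Python callables are hypothetical, and it assumes Airflow 2.4+ for the `schedule` argument. In practice the ETL task would more likely submit an AWS Glue job or EMR step than run the work inside the Airflow worker.

```python
# Illustrative sketch only -- hypothetical task names and placeholder callables.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_claims_etl():
    # Placeholder: in a real deployment this would typically submit a
    # Glue job or EMR step (e.g., via boto3) rather than run Spark here.
    print("Submitting claims ETL job...")


def validate_output():
    # Placeholder data-quality check (row counts, schema, null rates).
    print("Validating curated claims output...")


with DAG(
    dag_id="claims_etl_sketch",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # assumes Airflow 2.4+
    catchup=False,
) as dag:
    etl = PythonOperator(task_id="run_claims_etl", python_callable=run_claims_etl)
    check = PythonOperator(task_id="validate_output", python_callable=validate_output)

    etl >> check  # run the quality check only after the ETL task succeeds
```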
Responsibilities:

  • Design, build, and maintain scalable data pipelines to process claims data efficiently using Spark (Scala/PySpark), AWS Glue, and Airflow (a PySpark sketch follows this list).
  • Work closely with analysts, backend engineers, and other stakeholders to ensure high data quality, reliability, and accessibility.
  • Optimize and tune distributed data processing workflows for performance and cost efficiency.
  • Develop and enforce best practices for data governance, schema evolution, and pipeline observability, with an understanding of the security and compliance requirements of healthcare data.
  • Implement automated testing, monitoring, and alerting for data workflows to ensure operational excellence.
  • Collaborate with stakeholders to define and implement data models that support analytics and business intelligence initiatives.
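
As a rough illustration of the pipeline work described above, the sketch below reads semi-structured claims data with PySpark, applies a light normalization pass, and writes a partitioned, curated dataset. It is an assumption-laden example, not Motion MSK's code: the S3 paths and column names (claim_id, claim_amount, service_date) are hypothetical, and it writes plain Parquet, whereas a Delta Lake / Glue-catalog setup would use format("delta") and register the table in the catalog instead.

```python
# Illustrative sketch only -- hypothetical paths and claims schema.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("claims-etl-sketch")
    .getOrCreate()
)

# Read semi-structured claims data (e.g., JSON lines) from the raw zone.
raw_claims = spark.read.json("s3://example-bucket/raw/claims/")  # hypothetical path

# Light normalization: standardize types, drop duplicate claims,
# and derive a partition column from the service date.
clean_claims = (
    raw_claims
    .withColumn("claim_amount", F.col("claim_amount").cast("decimal(12,2)"))
    .withColumn("service_date", F.to_date("service_date"))
    .dropDuplicates(["claim_id"])
    .withColumn("service_month", F.date_format("service_date", "yyyy-MM"))
)

# Write to the curated zone, partitioned for downstream analytics.
(
    clean_claims.write
    .mode("overwrite")
    .partitionBy("service_month")
    .parquet("s3://example-bucket/curated/claims/")  # hypothetical path
)
```

A job like this would typically be packaged and submitted via spark-submit on EMR or as a Glue job, with Airflow handling scheduling and retries as in the earlier sketch.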

AWS, SQL, Apache Airflow, ETL, Snowflake, Data engineering, Terraform, Scala, Data visualization, Data modeling, Data management
