Senior Databricks Engineer

Fully remote · Full-Time · Senior
Salary not disclosed

Job Details

Experience
A minimum of 8 years of professional experience in data engineering or data platform development is required, including at least 5 years of hands-on experience with Databricks and Apache Spark in production environments.
Required Skills
AWS, SQL, GCP, Snowflake, Azure, Spark, Databricks

Requirements

  • Bachelor's degree or equivalent
  • Minimum of 8 years professional experience in data engineering or data platform development
  • Minimum of 5 years of hands-on experience with Databricks and Apache Spark in production environments
  • Demonstrated expertise with Snowflake
  • Demonstrated expertise with Apache Iceberg
  • Strong proficiency in SQL and experience optimizing queries on large, distributed datasets
  • Proven experience with cloud-based data platforms (Azure preferred; AWS or GCP acceptable)
  • Strong understanding of data modeling, ETL/ELT pipelines, and data governance practices
  • Experience implementing Unity Catalog or CI/CD pipelines for data workflows (preferred)
  • Experience in life sciences, biotech, or manufacturing environment (preferred)
  • Strong interpersonal and communication skills

Responsibilities

  • Architect, build, and optimize data solutions that support Thermo Fisher Scientific’s digital transformation strategy.
  • Build connections and workflows within cloud-based systems.
  • Design, develop, and deploy scalable data pipelines and ETL/ELT processes using Databricks.
  • Engineer robust data solutions to integrate enterprise data sources, including ERP, CRM, laboratory, and manufacturing systems.
  • Develop reusable frameworks and templates to accelerate data delivery and ensure consistency across domains.
  • Implement and maintain high-performance data connections across Databricks, Snowflake, and Iceberg environments.
  • Author and optimize complex SQL queries, transformations, and data models for analytics and reporting use cases.
  • Support data lakehouse and data mesh initiatives to enable seamless access to trusted data across the organization.
  • Apply data governance, lineage, and security controls using Unity Catalog, Delta Live Tables, and related technologies.
  • Partner with compliance and cybersecurity teams to uphold data privacy, GxP, and regulatory standards.
  • Establish monitoring, auditing, and optimization processes for ongoing data quality assurance.
  • Collaborate with data scientists, architects, and business partners to build and implement end-to-end data solutions.
  • Serve as a technical mentor and thought leader within the CRG data engineering community.
  • Contribute to critical initiatives for digital platform modernization and advanced analytics enablement.