Sr. Platform and Data Lake Engineer
Requirements:
- Experienced in designing and implementing complex, scalable data pipelines/ETL services.
- Expert-level proficiency in Python, Java, and TypeScript.
- Extensive experience with cloud-based data storage and processing technologies, particularly AWS services such as S3, Step Functions, Lambda, and Airflow.
- Expert-level understanding of and hands-on experience with Lakehouse architecture.
- Knowledge of basic DevOps and MLOps principles.
- 3+ years of experience with the Databricks ecosystem.
- Expert-level experience with Spark/Glue and Delta tables/Iceberg.
- Working knowledge of Snowflake.
Responsibilities:
- Design, develop, and optimize data lake solutions to support our scientific data pipelines and analytics capabilities.
- Design, develop, and optimize data pipelines and workflows within the Databricks platform.
- Design and architect services to meet customer data processing needs.
- Implement data quality and governance frameworks to ensure data integrity and compliance.