Senior Data & Automation Engineer (Databricks)
Fluent, LLC · Commerce Media
Ontario, Canada · Full-Time · Senior
Salary: 90,000 - 100,000 CAD per year
Job Details
- Languages: English
- Experience: 5+ years
- Required Skills: AWS, SQL, Git, Kafka, Spark, CI/CD
Requirements
- 5+ years of professional experience in data engineering, including Spark (PySpark) and SQL.
- 3+ years of hands-on experience building pipelines on Databricks (Workflows, Notebooks, Delta Lake).
- Deep understanding of Apache Spark distributed processing concepts and optimization.
- Strong experience with streaming architectures and Kafka.
- Familiarity with Databricks monitoring and observability tooling.
- Understanding of Lakehouse architecture, Unity Catalog, and governance principles.
- Proven proficiency in Git-based CI/CD workflows and automated deployment.
- Strong troubleshooting, optimization, and performance tuning skills.
- Experience designing and building large-scale, automated data pipelines.
- Experience with relevant AWS services (S3, IAM, Secrets Manager) and DevOps practices.
Responsibilities
- Design, build, and support scalable real-time and batch data pipelines using PySpark and Spark Structured Streaming on Databricks.
- Implement process automation and end-to-end workflows following Bronze → Silver → Gold architecture using Delta Lake best practices.
- Handle event-driven ingestion with Kafka and integrate into automated pipelines.
- Orchestrate workflows using Databricks Workflows/Jobs and CI/CD automation.
- Implement strong monitoring, observability, and alerting for reliability and performance (Databricks metrics, dashboards).
- Collaborate cross-functionally in agile sprints with Product, Analytics, and Data Science teams.
- Translate enterprise logical data models into optimized physical and performance-tuned implementations.
- Write modular, version-controlled code in Git; contribute to code reviews and enforce quality standards.
- Implement robust logging, error handling, and data quality validation across automation layers.
- Utilize relevant AWS services (S3, IAM, Secrets Manager) and DevOps practices.
- Promote best practices through documentation, knowledge sharing, tech talks, and training.