- Architect, develop, test, and maintain ELT/ETL pipelines and data workflows
- Implement advanced data processing solutions and observability techniques
- Design and refine data models and semantic layers
- Translate complex business and analytics requirements into efficient, scalable data solutions
- Apply best practices for version control, documentation, CI/CD, Infrastructure as Code, and data governance
- Participate in code reviews and contribute to continuous improvement efforts
- Partner with data engineers, DBAs, managers, and business stakeholders
- Provide technical guidance and mentorship to other engineers
- Communicate technical decisions, risks, and recommendations
- Optimize data pipelines and warehouse performance
- Evaluate, prototype, and influence adoption of new tools, frameworks, and architectural patterns
- Contribute to data observability, incident response, and root-cause analysis
- Design and deliver AI-ready data products