- Evaluate and pilot AI tools and design future enhancements.
- Define adoption metrics and collect user feedback for AI tools.
- Create enablement materials for AI-driven workflows.
- Support A/B testing frameworks and validate experiment configurations.
- Partner with teams to improve experimentation workflows.
- Recommend tooling enhancements for experiment lifecycle.
- Evaluate and design enhancements for data platform offerings.
- Promote adoption of OpenMetadata and improve metadata coverage.
- Design and track AI readiness metrics.
- Enforce data standards and governance.
- Review data quality audits and communicate results.
- Maintain data health scorecards and identify issues.
- Collaborate to prioritize and resolve data quality issues.
- Translate regulatory obligations into system requirements.
- Work with Legal, Security, and Data Engineering on compliance.
- Document workflows and maintain audit artifacts.
Skills: Python, SQL, Agile, +9 more