📍 Spain
HealthTech and AI
Requirements:
- 5+ years of experience in DataOps, data engineering, or a similar role, preferably in a regulated industry like healthcare or medical devices.
- Exceptional Python programming skills with a strong focus on automation.
- Proficiency with SQL and NoSQL databases and experience designing complex data pipelines.
- Expertise in cloud platforms (AWS, GCP, or Azure) and tools like Terraform, Kubernetes, and Docker.
- Advanced understanding of data security and compliance, especially GDPR and relevant ISO standards.
- Proven ability to work autonomously, drive projects independently, and deliver measurable results.
- Strong communication and leadership skills to mentor team members and coordinate with diverse stakeholders.
- Fluency in English (Spanish not required).
Responsibilities:
- Lead Design and Implementation: Architect and maintain high-performance, secure, and scalable data pipelines to support AI-driven clinical workflows.
- Drive Integration: Collaborate with the DataOps Manager and cross-functional teams to autonomously manage the ingestion, transformation, and integration of large datasets from external partners.
- Enhance Data Quality and Compliance: Build systems to ensure data quality, integrity, and regulatory compliance (e.g., ISO 13485, GDPR).
- Build and Automate: Develop reusable code and frameworks in Python to streamline data processes and ensure operational excellence (an illustrative sketch follows this list).
- Monitor and Troubleshoot: Proactively monitor data pipelines, resolve issues autonomously, and implement safeguards to minimize disruptions.
- Innovate: Identify and implement best practices and emerging technologies to enhance data infrastructure and team efficiency.
- Collaborate Globally: Serve as a technical expert, working closely with Product, Engineering, and AI teams to ensure alignment on project goals and deliverables.
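For context only, here is a minimal, hypothetical sketch of the kind of reusable Python data-quality check described under "Enhance Data Quality and Compliance" and "Build and Automate". All field names, the assumed schema, and the sample records are illustrative and not taken from the role.

```python
# Hypothetical sketch: a reusable validation step for a data pipeline.
# Field names and the assumed schema are illustrative, not from the posting.
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    """Collects problems found in a batch before it is loaded downstream."""
    errors: list[str] = field(default_factory=list)

    @property
    def ok(self) -> bool:
        return not self.errors


REQUIRED_FIELDS = {"record_id", "site_id", "captured_at"}  # assumed schema


def validate_batch(records: list[dict]) -> ValidationResult:
    """Check that each record has the required fields and a non-empty ID.

    A real pipeline would also enforce type, range, and de-identification
    rules (e.g. for GDPR / ISO 13485), which depend on the actual schema.
    """
    result = ValidationResult()
    for i, record in enumerate(records):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            result.errors.append(f"record {i}: missing fields {sorted(missing)}")
        elif not record["record_id"]:
            result.errors.append(f"record {i}: empty record_id")
    return result


if __name__ == "__main__":
    batch = [
        {"record_id": "r-001", "site_id": "site-01", "captured_at": "2024-05-01"},
        {"site_id": "site-01", "captured_at": "2024-05-01"},  # missing record_id
    ]
    outcome = validate_batch(batch)
    print("batch ok" if outcome.ok else "\n".join(outcome.errors))
```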
AWS · Docker · Python · SQL · ETL · GCP · Kubernetes · Machine Learning · Azure · Data engineering · NoSQL · Terraform · Compliance · Data visualization
Posted 8 days ago