Data Engineering Intern (AI & Automation)
Remote, but you must be located in CA, AZ, CO, NC, or UT during the internship.
Internship · Entry Level
Salary: 40–50 USD per hour
Apply Now
Job Details
- Required Skills: Docker, Python, SQL, Artificial Intelligence, Kubernetes, Snowflake, Jira, Tableau, Airflow
Requirements
- Currently pursuing an undergraduate degree in Computer Science, Data Science, or a related quantitative field, with a targeted graduation date in 2026 or early 2027.
- Expertise in Python.
- Expertise in SQL.
- Strong understanding of Data Warehousing (Snowflake).
- Strong understanding of ETL orchestration (Airflow).
- Familiarity with the command line (CLI).
- Familiarity with Docker.
- Familiarity with Kubernetes.
- Experience with Jupyter Notebooks.
- Experience with Tableau.
- Experience with Streamlit.
- A proactive approach to using AI/LLMs to automate repetitive tasks and improve system reliability.
Responsibilities
- Design, develop, and maintain ETL pipelines that ingest data into our Snowflake warehouse using Python, SQL, and Airflow (an illustrative sketch follows this list).
- Implement AI-powered solutions to streamline engineering tasks, including automating code generation and documentation.
- Build AI-driven data quality checks and anomaly detection (see the anomaly-check sketch after this list).
- Develop "self-healing" pipelines that can identify and alert on ingestion errors.
- Use Jupyter Notebooks and Streamlit to analyze data and build internal tools that help our product team make data-driven decisions (a minimal Streamlit sketch also follows the list).
- Create high-impact dashboards in Tableau that translate complex data into a clear narrative for stakeholders.
- Participate in daily Scrum huddles, manage tasks via Jira, and work closely with product owners and QA to promote code to production.
- Interact with cloud services via CLI and manage containerized environments using Docker and Kubernetes.
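To make the ETL responsibility concrete, here is a minimal sketch of the kind of daily ingestion DAG the role describes, using Airflow's TaskFlow API. The DAG name, the `snowflake_default` connection ID, and the `RAW_EVENTS` table are illustrative placeholders, not details from this posting.

```python
# Minimal sketch of a daily Airflow -> Snowflake ingestion DAG.
# All names (ingest_events, RAW_EVENTS, snowflake_default) are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def ingest_events():
    @task
    def extract() -> list[dict]:
        # Pull the day's records from a source system (stubbed here).
        return [{"id": 1, "event": "signup"}]

    @task
    def load(rows: list[dict]) -> None:
        # Write rows into the Snowflake warehouse; assumes an Airflow
        # connection named "snowflake_default" is configured.
        from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

        hook = SnowflakeHook(snowflake_conn_id="snowflake_default")
        hook.insert_rows(
            table="RAW_EVENTS",
            rows=[(r["id"], r["event"]) for r in rows],
        )

    load(extract())


ingest_events()
```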
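Likewise, a data-quality check of the kind mentioned above can start as a simple statistical guard on daily row counts. The sketch below flags a count that deviates sharply from the trailing history; the threshold and the numbers in the usage example are made up for illustration.

```python
# Illustrative data-quality check: flag a daily row count that sits
# more than z_threshold standard deviations from the historical mean.
import statistics


def row_count_is_anomalous(history: list[int], today: int,
                           z_threshold: float = 3.0) -> bool:
    if len(history) < 2:
        return False  # not enough history to judge
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > z_threshold


# Usage: a sudden drop in ingested rows should trigger an alert.
counts = [10_200, 9_950, 10_480, 10_100, 10_300]
assert row_count_is_anomalous(counts, 2_000)        # big drop -> anomalous
assert not row_count_is_anomalous(counts, 10_150)   # normal day -> fine
```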
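Finally, the internal tooling mentioned above often begins as a small Streamlit app. This sketch stubs the Snowflake query with an in-memory DataFrame of placeholder values; in practice the data would come from the warehouse.

```python
# Minimal sketch of an internal Streamlit tool; all data is placeholder.
import pandas as pd
import streamlit as st

st.title("Daily Signups")

# In practice this would query Snowflake; stubbed with a DataFrame here.
df = pd.DataFrame({
    "day": pd.date_range("2025-01-01", periods=7),
    "signups": [120, 135, 128, 150, 160, 90, 155],
})

min_signups = st.slider("Highlight days below", 0, 200, 100)
st.line_chart(df.set_index("day")["signups"])
st.dataframe(df[df["signups"] < min_signups])
```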