Data Quality Intern (100% Remote)

Posted 19 days ago

💎 Seniority level: Entry

📍 Location: USA

💸 Salary: 18.0 USD per hour

🔍 Industry: Personal finance

🏢 Company: GOBankingRates

🗣️ Languages: English

🪄 Skills: Python, SQL, Apache Airflow, ETL, Snowflake, Data engineering

Requirements:
  • Must be available to work 24 hours per week, Monday through Friday, for the duration of the 16-week program.
  • Currently pursuing a degree in Computer Science, Data Science, Engineering, or a related field.
  • Ability to write basic SQL queries for data validation and analysis.
  • Familiarity with Python for scripting and automation.
  • Strong problem-solving abilities and attention to detail.
  • Excellent verbal and written communication skills.
  • A passion for learning new tools and technologies in data engineering and quality assurance.
Responsibilities:
  • Actively participate in agile ceremonies (e.g., sprint planning, daily stand-ups, retrospectives) to understand data requirements and ensure proper test coverage.
  • Work closely with data engineers, analysts, and product managers to refine requirements and validate data processes.
  • Assist in validating data transformations and ensuring accuracy across ETL pipelines by writing and executing SQL queries (see the validation sketch below).
  • Contribute to building automated test scripts using Python and PyTest for data validation and regression testing (a pytest sketch appears below).
  • Support testing of data pipelines developed using dbt and Apache Airflow (an example Airflow DAG sketch appears below).
  • Help monitor data pipeline performance and troubleshoot issues.
  • Create and maintain documentation for test cases, data pipelines, and test results.
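
To give a concrete sense of the SQL-based validation work this role describes, here is a minimal Python sketch that compares a hypothetical staging table against its transformed target in Snowflake. The table names, column name, warehouse, and environment variables are illustrative assumptions, not details from the posting.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical tables: RAW.STG_TRANSACTIONS is the ETL source,
# ANALYTICS.FCT_TRANSACTIONS is the transformed target.
CHECKS = {
    "row_count_matches": """
        SELECT
            (SELECT COUNT(*) FROM RAW.STG_TRANSACTIONS)
          - (SELECT COUNT(*) FROM ANALYTICS.FCT_TRANSACTIONS) AS diff
    """,
    "no_null_transaction_ids": """
        SELECT COUNT(*) AS diff
        FROM ANALYTICS.FCT_TRANSACTIONS
        WHERE TRANSACTION_ID IS NULL
    """,
}


def run_checks() -> dict:
    """Run each validation query; a check passes when its result is 0."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="COMPUTE_WH",  # placeholder warehouse name
    )
    results = {}
    try:
        cur = conn.cursor()
        for name, sql in CHECKS.items():
            cur.execute(sql)
            (diff,) = cur.fetchone()
            results[name] = (diff == 0)
    finally:
        conn.close()
    return results


if __name__ == "__main__":
    for name, passed in run_checks().items():
        print(f"{name}: {'PASS' if passed else 'FAIL'}")
```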
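
For the automated regression testing mentioned above, a hedged sketch of a PyTest-style test follows. The transformation function and expected values are hypothetical and only illustrate the testing pattern, not anything specific to GOBankingRates' pipelines.

```python
# test_transformations.py -- run with `pytest`
import pytest


def normalize_amount(raw: str) -> float:
    """Hypothetical transformation under test: strip currency
    formatting such as '$1,234.50' and return a float."""
    return float(raw.replace("$", "").replace(",", ""))


@pytest.mark.parametrize(
    "raw, expected",
    [
        ("$1,234.50", 1234.50),
        ("0.99", 0.99),
        ("$10", 10.0),
    ],
)
def test_normalize_amount(raw, expected):
    # Regression test: known inputs must keep producing the same
    # outputs as the pipeline code evolves.
    assert normalize_amount(raw) == pytest.approx(expected)


def test_normalize_amount_rejects_empty_string():
    with pytest.raises(ValueError):
        normalize_amount("")
```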
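
And since the role supports pipelines built with dbt and Apache Airflow, here is a minimal Airflow 2.x sketch showing how a data-quality check could run as a DAG task. The DAG id, task name, and check logic are assumptions for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def validate_target_table(**_):
    """Placeholder quality check; in practice this would run the kind
    of SQL validations sketched above and raise on failure."""
    violations = 0  # e.g., result of a null-count query
    if violations:
        raise ValueError(f"Data quality check failed: {violations} bad rows")


with DAG(
    dag_id="example_data_quality_checks",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    validate = PythonOperator(
        task_id="validate_target_table",
        python_callable=validate_target_table,
    )
```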