📍 United States
🏢 Company: Unreal Gigs
Requirements:
- Experience with cloud data platforms such as AWS, GCP, or Azure.
- Proficiency in Python, Java, or Scala for building data pipelines.
- Proven ability to maintain and optimize ETL processes using orchestration tools like Apache Airflow or Luigi.
- Strong SQL skills and experience with relational and NoSQL databases.
- Excellent problem-solving skills and the ability to resolve complex data challenges.
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field, or equivalent experience.
Responsibilities:
- Develop and maintain data pipelines to support data ingestion, transformation, and integration using cloud technologies.
- Architect and maintain data lakes and data warehouses using cloud-based solutions like BigQuery, Redshift, or Snowflake.
- Work closely with data scientists, analysts, and business stakeholders to align data solutions with business goals.
- Implement data validation processes and ensure data quality through robust testing.
- Build and automate ETL workflows for efficient data integration (a minimal sketch follows this list).
- Use monitoring tools to maintain system health and track data system performance.
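For context on the kind of ETL orchestration work described above, here is a minimal, illustrative Airflow DAG that wires extract, transform, and load steps into a linear workflow. The DAG id, task names, and sample data are hypothetical placeholders (not taken from this posting), and the sketch assumes Airflow 2.4+ for the `schedule` argument.

```python
# Minimal sketch of an Airflow ETL DAG (Airflow 2.4+); names and data are
# hypothetical placeholders, not a pipeline described in this posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a hypothetical source system.
    return [{"id": 1, "amount": 42.0}]


def transform(**context):
    # Read the upstream result via XCom and apply a simple transformation.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "amount_cents": int(row["amount"] * 100)} for row in rows]


def load(**context):
    # In a real pipeline this step would write to a warehouse such as
    # BigQuery, Redshift, or Snowflake.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```

The `>>` chaining expresses task dependencies, and XCom is used here only because the sample payload is small; a production pipeline would typically pass references to staged data rather than the data itself.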
AWS, Python, SQL, Apache Airflow, ETL, GCP, Java, Apache Kafka, Strategy, Azure, Data engineering, Data Structures, NoSQL
Posted 2024-11-07