AWS Data Engineer - Fully Remote - US Only
Posted about 2 months ago
Requirements:
- Minimum of 5 years of experience in data engineering.
- Proficiency in AWS services such as Step Functions, Lambda, Glue, S3, DynamoDB, and Redshift.
- Strong programming skills in Python with experience using PySpark and Pandas for large-scale data processing.
- Hands-on experience with distributed systems and scalable architectures.
- Knowledge of ETL/ELT processes for integrating diverse datasets.
- Familiarity with utilities-specific datasets is highly desirable.
- Strong analytical skills to work with unstructured datasets.
- Knowledge of data governance practices.
Responsibilities:
- Design and build scalable data pipelines using AWS services to process and transform large datasets from utility systems.
- Orchestrate workflows using AWS Step Functions.
- Implement ETL/ELT processes to clean, transform, and integrate data.
- Leverage distributed-systems experience to ensure pipeline reliability and performance at scale.
- Utilize AWS Lambda for serverless application development.
- Design data models for analytics tailored to utilities use cases.
- Continuously monitor and optimize data pipeline performance.