GCP Data Engineer

Posted 12 days ago

💎 Seniority level: Senior, 4+ Years

📍 Location: India

🔍 Industry: Experience Management

🏢 Company: Experience.com 👥 101-250 💰 $14,575,000 Series A about 6 years ago | Customer Service, Consumer, Information Services, Consulting, SaaS, Analytics, Quality Assurance, Information Technology, Software

⏳ Experience: 4+ Years

🪄 Skills: Python, SQL, ElasticSearch, ETL, GCP, MongoDB, Airflow

Requirements:
  • 4+ years of experience with PySpark and SQL for building scalable ETL pipelines.
  • Strong proficiency in Python programming.
  • Knowledge of GCP Data Analytics ecosystem (BigQuery, PySpark, SQL, etc.).
  • Experience with Airflow/Composer for workflow orchestration.
  • Experience with in-memory applications, database design, and data integration.
  • Strong analytical thinking and problem-solving abilities.

Responsibilities:
  • Design, build, and maintain scalable and robust ETL/ELT pipelines using PySpark and SQL.
  • Work on data extraction, transformation, and loading processes from multiple sources into data warehouses such as BigQuery.
  • Leverage GCP data analytics tools (BigQuery, DataProc, Cloud Functions, etc.) to process and analyze data.
  • Benchmark, tune, and optimize data workflows to ensure efficiency and reliability.
  • Collaborate with engineering and analytics teams to develop data integration solutions that meet business needs.
  • Ensure data accuracy and quality through robust in-memory processing and sound database design.
  • Implement monitoring and alerting for pipelines and workflows to ensure data consistency and issue resolution.