Data Platform Engineer

Posted 6 months ago

πŸ’Ž Seniority level: Senior, 5+ years

πŸ“ Location: Ukraine

πŸ” Industry: Transportation

🏒 Company: Lyft πŸ‘₯ 5001-10000 πŸ’° $400,000,000 Post-IPO Equity about 4 years ago πŸ«‚ Last layoff almost 2 years ago · Ride Sharing, Transportation, Apps, Mobile Apps, Software

πŸ—£οΈ Languages: English

⏳ Experience: 5+ years

πŸͺ„ Skills: AWS, Python, SQL, ETL, Hadoop, Java, Kafka, Kubernetes, Airflow, Data engineering, Spark, Collaboration, Documentation, Troubleshooting

Requirements:
  • 5+ years of experience in software/data engineering, data architecture, or a related field.
  • Strong programming skills in at least one language: Java, Scala, Python, or Go.
  • Strong SQL and data modeling skills.
  • Hands-on experience with big data frameworks from the Apache ecosystem, such as Hadoop, Hive, Spark, Airflow, and Iceberg.
  • Proficiency in AWS cloud services.
  • Strong understanding of distributed systems, large-scale data processing, and data storage/retrieval.
  • Experience with data governance, security, and compliance will be a plus.
  • Familiarity with CI/CD and DevOps practices will be a plus.
  • Excellent problem-solving skills.
  • Ability to work independently or as part of a team.
  • Strong communication and collaboration skills.

Responsibilities:
  • Design, build, and maintain scalable and reliable data storage solutions to support diverse data processing needs.
  • Optimize and scale the platform to accommodate increasing data volumes and user requests.
  • Improve data storage, retrieval, query performance, and overall system performance.
  • Collaborate with data scientists, analysts, and other stakeholders to understand requirements and deliver tailored solutions.
  • Work with engineering teams to ensure data pipelines, analytics tools, ETL processes, and other systems are properly integrated with the Lyft Data Platform.
  • Troubleshoot and resolve data platform issues in a timely manner.
  • Participate in on-call rotations.
  • Develop and maintain monitoring and alerting systems to ensure platform availability and reliability.
  • Participate in code reviews, design discussions, and other collaborative team activities to maintain high-quality standards.
  • Continuously evaluate new technologies and tools to enhance the data platform.
  • Contribute to platform documentation, knowledge sharing, and best practice development.
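For a sense of what the monitoring-and-alerting responsibility above often boils down to in practice, here is a minimal sketch of a table-freshness check. The table names, SLAs, and function are illustrative assumptions, not Lyft's actual tooling:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-table freshness SLAs (illustrative only).
SLAS = {
    "rides_daily": timedelta(hours=24),
    "events_hourly": timedelta(hours=2),
}

def stale_tables(last_loaded, now=None):
    """Return the tables whose most recent successful load breaches its SLA."""
    now = now or datetime.now(timezone.utc)
    return sorted(
        name
        for name, loaded_at in last_loaded.items()
        if now - loaded_at > SLAS.get(name, timedelta(hours=24))
    )

# Example: one table is within SLA, one has gone stale.
now = datetime(2024, 1, 2, tzinfo=timezone.utc)
loads = {
    "rides_daily": now - timedelta(hours=30),   # breaches the 24h SLA
    "events_hourly": now - timedelta(hours=1),  # within the 2h SLA
}
print(stale_tables(loads, now))  # ['rides_daily']
```

A real system would feed a check like this into an alerting channel (e.g. a pager or chat webhook) rather than printing, but the shape of the logic is the same.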

Related Jobs

πŸ“ EMEA region

🧭 Full-Time

πŸ” IoT

🏒 Company: Canonical - Jobs

Requirements:
  • You have knowledge and experience of telemetry and connectivity systems and platforms, including data streaming technologies (MQTT, Kafka, RabbitMQ, etc.), observability (OpenTelemetry), industrial/engineering data exchange protocols (OPC-UA, Modbus), and the application of data governance/IAM models to such systems.
  • You are proficient in the design and implementation of back-end web services, messaging/data pipelines, and REST APIs using Python and/or Golang.
  • You are familiar with Ubuntu as a development and deployment platform.
Responsibilities:
  • Collaborate remotely with a globally distributed team.
  • Architect scalable service APIs to provide streaming data services to other teams and products using Python and Golang.
  • Develop data governance, management, and auditing systems within our telemetry platform.
  • Work with our infrastructure team to develop both a cloud-based SaaS offering and a containerised on-prem solution.
  • Design and implement new features and enhancements, from spec through production and ongoing operations at scale.
  • Review code and technical designs produced by other engineers.
  • Discuss ideas and collaborate on finding optimal solutions.
  • Work remotely, with 2 to 4 weeks of global travel a year for internal and external events.
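The service-API bullets above can be sketched in miniature with the Python standard library. The `/telemetry/latest` endpoint, sensor names, and payload are invented for illustration; this is a toy stand-in for the kind of streaming-data REST API the posting describes, not Canonical's actual service:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical in-memory "latest readings" store (illustrative data).
READINGS = [{"sensor": "temp-01", "value": 21.5}]

class TelemetryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the latest readings as JSON on one invented endpoint.
        if self.path == "/telemetry/latest":
            body = json.dumps(READINGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

# Bind to an OS-assigned port and serve from a background thread.
server = HTTPServer(("127.0.0.1", 0), TelemetryHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

with urlopen(f"http://127.0.0.1:{port}/telemetry/latest") as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data)  # [{'sensor': 'temp-01', 'value': 21.5}]
```

A production service would of course sit behind a proper framework and serve live data from the telemetry pipeline; the sketch only shows the request/response shape.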

Backend Development, Docker, Python, Software Development, Cloud Computing, Cybersecurity, IoT, Kafka, Kubernetes, RabbitMQ, Data engineering, REST API, CI/CD, Linux, Microservices, Data analytics, SaaS

Posted 12 days ago

πŸ“ EMEA

🧭 Full-Time

πŸ” Software / IoT

🏒 Company: Canonical πŸ‘₯ 1001-5000 πŸ’° $12,800,000 Crowdfunding over 11 years ago · Internet of Things, Open Source, Cloud Computing, Linux, Software

Requirements:
  • Passion for technology and for collaborating with diverse, talented people.
  • Curiosity, flexibility, articulate communication, and accountability.
  • Strong soft skills, with a passionate, enterprising, and self-motivated attitude.
  • Broad technology base with an emphasis on backend code and infrastructure.
  • Good understanding of cybersecurity challenges in IoT connectivity and data streaming.
  • Knowledge of telemetry and connectivity systems, including data streaming technologies (e.g., MQTT, Kafka, RabbitMQ) and observability tools (e.g., OpenTelemetry).
  • Proficiency in designing and implementing backend web services, messaging/data pipelines, and REST APIs using Python and/or Golang.
  • Familiarity with Ubuntu as a development and deployment platform.
  • Bachelor's degree or equivalent in Computer Science, STEM, or a similar field.
Responsibilities:
  • Collaborate remotely with a globally distributed team.
  • Architect scalable service APIs to provide streaming data services to other teams and products using Python and Golang.
  • Develop data governance, management, and auditing systems within our telemetry platform.
  • Work with the infrastructure team on the cloud-based SaaS offering and containerised solutions.
  • Design and implement new features and enhancements, from spec through production and ongoing operations at scale.
  • Review code and technical designs produced by other engineers.
  • Discuss ideas and collaborate on finding optimal solutions.
  • Travel 2 to 4 weeks a year for internal and external events.

Backend Development, Python, Software Development, Cybersecurity, IoT, Go, REST API, Communication Skills, Collaboration, C (Programming language)

Posted 7 months ago