Trafi

👥 101-250 · 💰 Series B (over 4 years ago) · 💼 Private Company
Tags: Car Sharing, Ride Sharing, Transportation, Public Transportation, Apps, Information Technology, Mobile, Travel, Software

Trafi is revolutionizing urban mobility with its technology platform. We empower cities to create fully connected, multimodal transportation systems with seamless journey planning, booking, and payment. Our platform integrates public transport, micromobility, and other services into a single, user-friendly experience, helping people navigate their cities more smoothly.

As a leader in the Mobility-as-a-Service (MaaS) space, Trafi collaborates with global companies such as Google, Apple, and the Volkswagen Group, along with cities across the globe. We're known for our innovative approach to data analysis, route planning, and data management software. Our engineering team uses a modern cloud-native business intelligence stack and embraces engineering best practices, including thorough documentation, rigorous testing, and cost optimization. We are transitioning to a Data Mesh architecture, and our tech stack leverages technologies such as AWS, SQL, and Python.

We offer a remote-first work environment with flexible working arrangements. Our culture is defined by collaboration, continuous learning, and a strong commitment to employee growth. With a dedicated learning budget, health benefits, and share-allocation options, we provide a supportive environment for both personal and professional development. If you're a Senior Data Engineer looking to make a real impact, explore our openings in Vilnius.

Jobs at this company:

Apply
🔥 Senior Data Engineer
Posted 5 months ago

💸 5,700 - 7,400 EUR per month

🔍 Software Development

  • Experience designing and operating data processing solutions across the full lifecycle, from collection and ingestion to storage, transformation, and reporting
  • Hands-on experience with public cloud infrastructure, preferably AWS
  • Strong SQL experience, ideally with modern data warehouses such as BigQuery, Snowflake, Redshift, or Firebolt
  • Good command of at least one general-purpose programming language, ideally Python
  • Knowledge of Unix shell scripting, Docker, and the Git version control system
  • Design and maintain our data pipelines, employing engineering best practices: documentation, testing, cost optimisation, and version control
  • Work in the Data Platform team to translate business and analytical needs into accurate, reusable data models in our data warehouse
  • Identify gaps in data collection, diagnose and fix data discrepancies, and maintain model code to meet business requirements and keep ETL logic consistent
  • Participate in the transition towards a Data Mesh architecture by building related tooling and infrastructure and consulting for domain teams
  • Stay up to date with the latest technologies and trends to improve our existing data analytics and data platform stack
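To give a flavour of the responsibilities above, here is a minimal, hypothetical sketch of the kind of documented, testable pipeline transformation step the role involves. The record schema (`trip_id`, `distance_km`, `started_at`) and the function itself are illustrative assumptions, not Trafi's actual code:

```python
from datetime import datetime, timezone

def clean_trip_records(records):
    """Drop malformed trip rows and normalize timestamps to UTC ISO-8601.

    Hypothetical example: a row is kept only if it has a positive
    numeric 'distance_km' and a numeric 'started_at' (epoch seconds).
    """
    cleaned = []
    for row in records:
        distance = row.get("distance_km")
        started = row.get("started_at")
        if not isinstance(distance, (int, float)) or distance <= 0:
            continue  # discard rows with missing or non-positive distance
        if not isinstance(started, (int, float)):
            continue  # discard rows without a usable epoch timestamp
        cleaned.append({
            "trip_id": row.get("trip_id"),
            "distance_km": float(distance),
            "started_at": datetime.fromtimestamp(
                started, tz=timezone.utc
            ).isoformat(),
        })
    return cleaned
```

Writing such steps as small, pure functions with docstrings makes them straightforward to version-control and cover with tests, which is the spirit of the best practices listed above.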