Remote Jobs in Germany

Python
3,345 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply
🔥 Orbital Analyst, UK
Posted about 11 hours ago

📍 United Kingdom

🧭 Full-Time

🔍 Space Surveillance and Tracking

🏢 Company: Slingshot Aerospace 👥 101-250 💰 $30,000,000 Debt Financing 9 months ago | Aerospace, Analytics, Simulation, Software

  • Bachelor's-level degree in Aerospace / Mechanical Engineering, Electrical Engineering, Computer Science, Applied Mathematics, Physics, Astronomy, related field, or equivalent experience
  • 3+ years of relevant work experience
  • Demonstrated expertise in orbital dynamics and satellite operations
  • Effective written and verbal communication skills
  • Experience reading and writing code in Python or other scripting languages
  • Ability to travel up to 5% of the time
  • Provide analytical and orbital analysis support to our customers and support related training events
  • Collaborate with cross-functional teams to conduct feasibility assessments related to customer space object tracking needs
  • Engage with our customers and other stakeholders to shape requirements and ensure successful outcomes for their mission-critical needs
  • Exercise Slingshot’s software using real-time and historical data, and perform quality assurance on its outputs (e.g. calibration activities)
  • Support ongoing and future transitions of Slingshot’s products and services to the SGSN, space operations centers, and other technology testbeds
  • Develop software and shell scripts to help automate, monitor, and enhance data processing related to space operations (a small Python sketch follows this list)
  • Contribute your expertise and ideas to help shape our products and strategies
  • Prepare technical reports and briefing materials
  • Perform other duties as assigned (to be less than 10% of the responsibilities listed above)
  • Execute all position responsibilities in alignment with Slingshot’s core values, mission, and purpose
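
As a rough illustration of the Python scripting this role involves (see the bullet above on developing software and shell scripts), here is a minimal sketch that derives an orbital period from Kepler's third law. The 500 km altitude and the use of Earth's gravitational parameter are illustrative assumptions, not details from the posting.

```python
import math

MU_EARTH = 398_600.4418  # Earth's standard gravitational parameter, km^3/s^2


def orbital_period_s(semi_major_axis_km: float) -> float:
    """Kepler's third law: T = 2 * pi * sqrt(a^3 / mu), in seconds."""
    return 2.0 * math.pi * math.sqrt(semi_major_axis_km ** 3 / MU_EARTH)


# Illustrative example: a roughly 500 km circular LEO orbit (Earth radius ~6378 km)
print(orbital_period_s(6878.0) / 60.0)  # about 95 minutes
```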

Python, SQL, Algorithms, Communication Skills, Analytical Skills, Scripting

Posted about 11 hours ago
Apply
🔥 Senior Data Engineer
Posted about 11 hours ago

📍 Worldwide

🧭 Full-Time

🔍 Software Development

🏢 Company: Kit 👥 11-50 💰 over 1 year ago | Education, Financial Services, Apps

  • Strong command of SQL, including DDL and DML.
  • Proficient in Python
  • Strong understanding of DBMS internals, including an appreciation for platform-specific nuances.
  • A willingness to work with Redshift and deeply understand its nuances.
  • Familiarity with our key tools (Redshift, Segment, dbt, GitHub)
  • 8+ years in data, with at least 3 years specializing in Data Engineering
  • Proven track record managing and optimizing OLAP clusters
  • Experience refactoring problematic data pipelines without disrupting business operations
  • History of implementing data quality frameworks and validation processes
  • Dive into our Redshift warehouse, dbt models, and workflows (a small connection sketch follows this list).
  • Evaluate the CRM data lifecycle, including source extraction, warehouse ingestion, transformation, and reverse ETL.
  • Refine and start implementing your design for source extraction and warehouse ingestion.
  • Complete the implementation of the CRM source extraction/ingestion project and use the learnings to refine your approach in preparation for other, similar initiatives, including, but not limited to, web traffic events and product usage logs.
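
As a minimal sketch of the SQL DDL/DML and Redshift work described above, the following connects with psycopg2 (Redshift speaks the PostgreSQL wire protocol) and runs one DDL and one DML statement. The cluster endpoint, credentials, schema, and table are hypothetical.

```python
import psycopg2

# Hypothetical Redshift connection details
conn = psycopg2.connect(
    host="example-cluster.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="...",
)

with conn, conn.cursor() as cur:
    # DDL: create a staging table for CRM contacts
    cur.execute("""
        CREATE TABLE IF NOT EXISTS staging.crm_contacts (
            contact_id BIGINT,
            email      VARCHAR(256),
            updated_at TIMESTAMP
        )
    """)
    # DML: insert a single row
    cur.execute(
        "INSERT INTO staging.crm_contacts (contact_id, email, updated_at) VALUES (%s, %s, %s)",
        (1, "jane@example.com", "2024-01-01 00:00:00"),
    )
```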

Python, SQL, ETL, Git, Data engineering, RDBMS, Data modeling, Data management

Posted about 11 hours ago
Apply

📍 Pan India, IN

🧭 Full-Time

🔍 Software Development

🏢 Company: Nexthire

  • Minimum 4 years of hands-on experience in full stack development.
  • Strong proficiency in Python, particularly Django and Pandas.
  • Solid experience with React for front-end development.
  • Excellent communication and interpersonal skills.
  • Ability to work independently and manage tasks in a remote environment.
  • Educational Qualification: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Design, develop, and maintain web applications using Python (Django, Pandas) and React (a minimal view sketch follows this list).
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Write clean, scalable, and efficient code.
  • Debug and troubleshoot software issues in a timely manner.
  • Participate in code reviews and maintain coding standards.
  • Communicate effectively with team members and stakeholders.
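
A minimal sketch of the Django-plus-Pandas pattern referenced above: a view that aggregates a data file with Pandas and returns JSON that a React front end could fetch. The orders.csv file, its columns, and the view name are hypothetical, and URL routing is omitted.

```python
import pandas as pd
from django.http import JsonResponse


def order_summary(request):
    """Aggregate a hypothetical orders file and return per-status totals as JSON."""
    df = pd.read_csv("orders.csv")                  # hypothetical data source
    totals = df.groupby("status")["amount"].sum()   # total order amount per status
    return JsonResponse(totals.to_dict())
```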

Python, SQL, Django, Full Stack Development, Pandas, React, RESTful APIs

Posted about 11 hours ago
Apply

📍 Poland

💸 22,900 - 29,900 PLN per month

🔍 Threat Intelligence

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 2+ years of experience as a Data Engineer, working with large-scale distributed systems.
  • Proven expertise in Lakehouse architecture and Apache Hudi in production environments.
  • Experience with Airflow, Kafka, or streaming data pipelines.
  • Strong programming skills in Python and PySpark.
  • Comfortable working in a cloud-based environment (preferably AWS).
  • Design, build, and manage scalable data pipelines using Python, SQL, and PySpark.
  • Develop and maintain lakehouse architectures, with hands-on use of Apache Hudi for data versioning, upserts, and compaction (see the PySpark sketch after this list).
  • Implement efficient ETL/ELT processes for both batch and real-time data ingestion.
  • Optimize data storage and query performance across large datasets (partitioning, indexing, compaction).
  • Ensure data quality, governance, and lineage, integrating validation and monitoring into pipelines.
  • Work with cloud-native services (preferably AWS – S3, Athena, EMR) to support modern data workflows.
  • Collaborate closely with data scientists, analysts, and platform engineers to deliver reliable data infrastructure
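
A minimal sketch of the kind of PySpark batch upsert into an Apache Hudi table described above. The Hudi option keys follow the standard Spark datasource configuration and may differ across Hudi versions; the S3 paths, table name, and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("threat-intel-ingest").getOrCreate()

# Hypothetical raw event feed landed in S3
events = spark.read.json("s3://example-bucket/raw/threat-events/")

hudi_options = {
    "hoodie.table.name": "threat_events",
    "hoodie.datasource.write.recordkey.field": "event_id",      # de-duplication key
    "hoodie.datasource.write.precombine.field": "observed_at",  # newest record wins
    "hoodie.datasource.write.partitionpath.field": "event_date",
    "hoodie.datasource.write.operation": "upsert",
}

(events.write.format("hudi")
       .options(**hudi_options)
       .mode("append")
       .save("s3://example-bucket/lakehouse/threat_events/"))
```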

AWS, Python, SQL, Cloud Computing, ETL, Kafka, Airflow, Data engineering, CI/CD, Terraform

Posted about 11 hours ago
Apply
🔥 Senior Machine Learning Engineer
Posted about 12 hours ago

📍 United Kingdom, Latvia, Spain, Germany, Denmark, Poland, Portugal, Ireland

🔍 Software Development

🏢 Company: Lokalise 👥 101-250 💰 $50,000,000 Series B over 3 years ago | Information Services, Developer APIs, SaaS, Information Technology, Collaboration, Translation Service, Software, Cloud Infrastructure

  • 4+ years of experience building and operating backend systems in production
  • Strong proficiency with Python, FastAPI, and Pydantic
  • Solid understanding of microservice architecture, scalable distributed systems, and observability, with familiarity using tools like OpenTelemetry, Grafana, or Datadog to monitor both general system health and LLM-specific metrics such as latency, token usage, and model performance
  • Hands-on experience with prompt engineering in production
  • Familiarity with CI/CD, containerisation, and cloud deployment
  • Strong sense of ownership
  • Build and own LLM-powered backend services using FastAPI, Pydantic, etc., ensuring they are scalable, observable, and easy to extend (a minimal service sketch follows this list)
  • Design infrastructure that enables rapid experimentation with LLMs, including A/B testing, feature flagging, and usage analytics
  • Integrate and maintain LLM observability tooling (e.g., LiteLLM, Langfuse) to monitor quality, cost, and performance of model calls
  • Collaborate with Data and ML Scientists to productionize workflows, share feedback, and continuously improve experimentation speed
  • Ensure systems are reliable and deployable with strong CI/CD practices, including instrumentation and alerting
  • Contribute to team culture through pairing, mentoring, and sharing learnings to help others grow
  • Stay connected to the product by understanding our users' needs, localization workflows, and broader industry trends
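
A minimal sketch of an LLM-powered FastAPI service with Pydantic request/response models, as referenced above. The call_llm helper is a hypothetical stand-in for whichever model client or LiteLLM routing the team actually uses, and the endpoint shape is illustrative.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class TranslateRequest(BaseModel):
    text: str
    target_language: str


class TranslateResponse(BaseModel):
    translation: str
    model: str


def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for the real model call (e.g. routed through LiteLLM)."""
    return f"[translated] {prompt}"


@app.post("/translate", response_model=TranslateResponse)
def translate(req: TranslateRequest) -> TranslateResponse:
    prompt = f"Translate into {req.target_language}: {req.text}"
    return TranslateResponse(translation=call_llm(prompt), model="placeholder-model")
```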

AWS, Backend Development, Docker, Python, SQL, Cloud Computing, Machine Learning, API testing, FastAPI, Grafana, REST API, CI/CD, Microservices

Posted about 12 hours ago
Apply

📍 United States of America

🧭 Full-Time

💸 190,000 - 192,000 USD per year

🔍 Software Development

🏢 Company: External_Career_Site_Barclays

  • Experience with Java, Spring Boot, Vert.x, Kubernetes/Docker, React, and Python.
  • Design and develop exotics analytics price and strategic trade booking platform for Structured Notes, and contribute to the overall system design, architecture and development on risk platform, booking tools for exotic products and price streaming platforms.
  • Enhance systems to facilitate Regulatory requirements (FRTB IM/SSA, CCAR, Libor cessation).
  • Manage stakeholders by working directly with front office trading, operations and other business users to collect requirements and provide regular status updates and support.
  • Reduce technical debt by introducing and adopting best practices in software development and collaborating with the team to ensure consistent adoption of best practices and migrating off legacy technical stacks.
  • Compute cost optimizations by working on cost reduction initiatives to reduce grid compute costs by moving to efficient valuation models and distributing across HPC/AWS clusters.
  • Lead the migration of stochastic valuation models, moving away from Black-Scholes to better capture volatility risk exposure (a reference Black-Scholes sketch follows this list).
  • Develop strategic UI interfaces using HTML5/React for trade booking/risk management platform.
  • Integrate with OpenFin platform, defining data taxonomies for Macro/Rates business and seamlessly interact with other applications (RFQ activity, internal margins, Credit exposures, Axes etc.) in the space.
  • Enhance the server-side stack by moving to a microservices architecture using Vert.x and Spring Boot, with service discovery via ZooKeeper, asynchronous messaging on Confluent Kafka, and deployments using Kubernetes/Docker containers.
  • Work with Front Office traders to provide risk/trade/static/market data via APIs to support their development on Python Notebooks.
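
For reference, a minimal sketch of the Black-Scholes price for a European call, which the migration bullet above refers to moving away from; the parameter values are purely illustrative.

```python
from math import log, sqrt, exp, erf


def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))


def bs_call(spot: float, strike: float, rate: float, vol: float, tenor: float) -> float:
    """Black-Scholes price of a European call option."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * tenor) / (vol * sqrt(tenor))
    d2 = d1 - vol * sqrt(tenor)
    return spot * norm_cdf(d1) - strike * exp(-rate * tenor) * norm_cdf(d2)


print(bs_call(spot=100.0, strike=105.0, rate=0.03, vol=0.2, tenor=1.0))  # roughly 7.1
```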

AWS, Backend Development, Docker, Python, Software Development, SQL, Apache Tomcat, Bash, Data Analysis, Express.js, Frontend Development, HTML, CSS, Java, Javascript, JUnit, Kubernetes, Machine Learning, Oracle, Oracle RDBMS, React.js, Software Architecture, Spring, Spring Boot, Spring MVC, UI Design, Algorithms, API testing, Data Structures, React, CI/CD, Agile methodologies, RESTful APIs, Microservices, JSON, Risk Management, Debugging

Posted about 12 hours ago
Apply
🔥 Senior Data Governance Engineer
Posted about 12 hours ago

📍 Brazil

🔍 Data Governance

🏢 Company: TELUS Digital Brazil

  • At least 3 years of experience in Data Governance, Metadata Management, Data Cataloging, or Data Engineering.
  • Have actively participated in the design, implementation, and management of data governance frameworks and data catalogs.
  • Experience working with Collibra and a strong understanding of the Collibra Operating Model, workflows, domains, and policies.
  • Experience working with APIs.
  • Experience with a low-code/no-code ETL tool, such as Informatica
  • Experience working with databases and data modeling projects, as well as practical experience utilizing SQL
  • Effective English communication - able to explain technical and non-technical concepts to different audiences
  • Experience with a general-purpose programming language such as Python or Scala
  • Ability to work well in teams and interact effectively with others
  • Ability to work independently and manage multiple tasks simultaneously while meeting deadlines
  • Conduct detailed assessments of the customer’s data governance framework and current-state Collibra implementation
  • Translate business needs into functional specifications for Collibra use cases
  • Serve as a trusted advisor to the customer’s data governance leadership
  • Lead requirement-gathering sessions and workshops with business users and technical teams
  • Collaborate with cross-functional teams to ensure data quality and support data-driven decision-making to strive for greater functionality in our data systems
  • Collaborate with project managers and product owners to assist in prioritizing, estimating, and planning development tasks
  • Provide constructive feedback and share expertise with fellow team members, fostering mutual growth and learning
  • Demonstrate a commitment to accessibility and ensure that your work considers and positively impacts others

Python, SQL, ETL, Data engineering, RESTful APIs, Data visualization, Data modeling, Data analytics, Data management

Posted about 12 hours ago
Apply
🔥 Health Data Analytics Engineer
Posted about 12 hours ago

📍 United States

🧭 Full-Time

💸 135,000 - 170,000 USD per year

🔍 Healthcare

🏢 Company: SmarterDx 👥 101-250 💰 $50,000,000 Series B about 1 year ago | Artificial Intelligence (AI), Hospital, Information Technology, Health Care

  • 3+ years of analytics engineering experience in the healthcare industry, involving clinical and/or billing/claims data.
  • You are very well-versed in SQL and ETL processes; significant experience with dbt is a must
  • You have experience in a general purpose programming language (Python, Java, Scala, Ruby, etc.)
  • You have strong experience in data modeling and its implementation in production data pipelines.
  • You are comfortable with the essentials of data orchestration
  • Designing, developing, and maintaining dbt data models that support our healthcare analytics products.
  • Integrating and transforming customer data to conform to our data specifications and standards.
  • Collaborating with cross-functional teams to translate data and business requirements into effective data models.
  • Configuring and improving data pipelines that integrate and connect the data models.
  • Conducting QA and testing on data models to ensure data accuracy, reliability, and performance (see the sketch after this list).
  • Applying industry standards and best practices around data modeling, testing, and data pipelining.
  • Participating in a rota with other engineers to help investigate and resolve data-related issues.
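
A minimal sketch of the kind of QA check on a model's output mentioned above, written with Pandas; the file name and columns are hypothetical.

```python
import pandas as pd

# Hypothetical output of a dbt claims model, exported for validation
claims = pd.read_parquet("claims_model_output.parquet")

# Basic accuracy and completeness checks
assert claims["claim_id"].is_unique, "duplicate claim_id values"
assert claims["billed_amount"].ge(0).all(), "negative billed amounts found"
assert claims["discharge_date"].notna().all(), "missing discharge dates"
```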

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Snowflake, Data engineering, Data modeling, Data analytics

Posted about 12 hours ago
Apply
🔥 Staff, Software Engineer
Posted about 13 hours ago

📍 United States

🧭 Full-Time

💸 140,000 - 180,000 USD per year

🔍 Software Development

🏢 Company: Seeq 👥 101-250 💰 $50,000,000 Series D 10 months ago | Industrial, Internet of Things, Analytics, Commercial, Software

  • 10 years of professional experience in software development
  • Experience with cloud platforms and technologies, like AWS, Azure, or GCP
  • Demonstrate a strong understanding of software design principles, patterns, and best practices
  • Lead the design and development of complex software systems
  • Mentor and coach junior engineers, guiding them in best practices and helping them grow their skills
  • Be a technical leader, driving innovation and spearheading the implementation of new technologies and frameworks

AWS, Backend Development, Leadership, Python, Software Development, Agile, Cloud Computing, Design Patterns, Java, Javascript, SCRUM, TypeScript, C#, Algorithms, Data Structures, REST API, React, Communication Skills, Collaboration, CI/CD, Problem Solving, Mentoring, Attention to detail, Software Engineering, Data analytics, Debugging

Posted about 13 hours ago
Apply
🔥 Staff, Machine Learning Engineer
Posted about 13 hours ago

📍 Canada

🧭 Full-Time

💸 172,800 - 216,000 CAD per year

🔍 Software Development

🏢 Company: Twilio 👥 5001-10000 💰 $378,215,525 Post-IPO Equity almost 4 years ago 🫂 Last layoff over 1 year ago | Messaging, SMS, Mobile Apps, Enterprise Software, Software

  • 5+ years of applied ML engineering experience
  • Develop and Deploy AI Models: Build and deploy machine learning models, leveraging NLP techniques, recommendation systems, and GenAI-powered applications, to production environments, ensuring they meet the diverse needs of Twilio's verticals and customer base.
  • Collaborate Across Teams: Work closely with product, program, analytics, and engineering teams to implement and refine machine learning, statistical, and forecasting models that drive business outcomes.
  • Utilize Advanced Technical Stack: Leverage our technical stack, including Python, SQL, R, AWS (Sagemaker, Lambda, S3, Kendra), MySQL, Airtable, and libraries such as Pandas, NumPy, SciKit-Learn, XGBoost, Matplotlib, and Keras, to develop robust and scalable AI/ML solutions (a small example follows this list).
  • Integrate Enterprise Data Sources: Effectively utilize enterprise data sources like Salesforce and Zendesk to inform model development and enhance predictive accuracy.
  • Harness the Power of LLMs: Apply knowledge of Large Language Models (LLMs) such as OpenAI's GPT models, Claude, Gemini, Llama, Whisper, and Groq to develop innovative GenAI use cases and solutions
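
A minimal sketch of the Pandas / scikit-learn / XGBoost slice of the stack listed above; the CSV path, feature columns, and target are hypothetical.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

# Hypothetical feature table with a binary "escalated" target
df = pd.read_csv("support_tickets.csv")
X, y = df.drop(columns=["escalated"]), df["escalated"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```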

AWS, Python, SQL, Keras, Machine Learning, MySQL, Numpy, Airflow, Data science, Pandas

Posted about 13 hours ago
Apply
Showing 10 out of 3,345

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.