Remote Jobs in Poland

Remote work is becoming increasingly popular, especially for those who speak foreign languages. If you're looking for a Polish-language remote job you can do from home, or want to join an international company, Remoote.app will help you find the right opportunities. Here, you can find online jobs in Poland with flexible schedules, competitive salaries, and great career growth potential!


Airflow
132 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ Brazil, the U.S., and Canada

🧭 Full-Time

πŸ” Payments

  • Bachelor’s or Master’s degree in CS/Engineering/Data-Science or other technical disciplines.
  • Solid experience in DS/ML engineering.
  • Proficiency in programming languages such as Python, Scala, or Java.
  • Hands-on experience in implementing batch and real-time streaming pipelines, using SQL and NoSQL database solutions
  • Familiarity with monitoring tools for data pipelines, streaming systems, and model performance.
  • Experience in AWS cloud services (Sagemaker, EC2, EMR, ECS/EKS, RDS, etc.).
  • Experience with CI/CD pipelines, infrastructure-as-code tools (e.g., Terraform, CloudFormation), and MLOps platforms like MLflow.
  • Experience with Machine Learning modeling, notably tree-based and boosting models supervised learning for imbalanced target scenarios.
  • Experience with Online Inference, APIs, and services that respond under tight time constraints.
  • Proficiency in English.
  • Design the data-architecture flow for the efficient implementation of real-time model endpoints and/or batch solutions.
  • Engineer domain-specific features that can enhance model performance and robustness.
  • Build pipelines to deploy machine learning models in production with a focus on scalability and efficiency, and participate in and enforce the release management process for models and rules.
  • Implement systems to monitor model performance, endpoints/feature health, and other business metrics; Create model-retraining pipelines to boost performance, based on monitoring metrics; Model recalibration.
  • Design and implement scalable architectures to support real-time/batch solutions; Optimize algorithms and workflows for latency, throughput, and resource efficiency; Ensure systems adhere to company standards for reliability and security.
  • Conduct research and prototypes to explore novel approaches in ML engineering for addressing emerging risk/fraud patterns.
  • Partner with fraud analysts, risk managers, and product teams to translate business requirements into ML solutions.

AWS, Backend Development, Docker, Python, SQL, Amazon RDS, AWS EKS, Frontend Development, Java, Kafka, Kubernetes, Machine Learning, MLFlow, Airflow, Algorithms, Data engineering, Data science, REST API, NoSQL, Pandas, Spark, CI/CD, Terraform, Scala, Data modeling, English communication

Posted about 2 hours ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 183,600 - 216,000 USD per year

πŸ” Software Development

  • 6+ years of experience in a data engineering role building products, ideally in a fast-paced environment
  • Good foundations in Python and SQL.
  • Experience with Spark, PySpark, DBT, Snowflake and Airflow
  • Knowledge of visualization tools, such as Metabase, Jupyter Notebooks (Python)
  • Collaborate on the design and improvements of the data infrastructure
  • Partner with product and engineering to advocate best practices and build supporting systems and infrastructure for the various data needs
  • Create data pipelines that stitch together various data sources in order to produce valuable business insights
  • Create real-time data pipelines in collaboration with the Data Science team

Python, SQL, Snowflake, Airflow, Data engineering, Spark, Data visualization, Data modeling

Posted 3 days ago
Apply

πŸ“ LATAM, Canada

🏒 Company: OfferFit πŸ‘₯ 51-100 πŸ’° $25,000,000 Series B, over 1 year ago Β· Artificial Intelligence (AI), Machine Learning, Marketing Automation

  • Exceptional coder: you write clean, object-oriented code; you care about good design and terse, testable APIs
  • Tinkerer: you regularly explore and learn new technologies and methods
  • Entrepreneurial: you proactively identify opportunities and risks, work around obstacles, and always seek creative ways to improve processes and outcomes
  • Structured and organized: you can structure a plan, align stakeholders, and see it through to execution
  • Clear communicator: you are able to express yourself clearly and persuasively, both in writing and verbally
  • Collaborate with customer analytics/BI teams and OfferFit colleagues on use case setups (design, data integration, pipeline and ML model configuration, etc.) and ongoing value delivery and optimization
  • Extend product capabilities by improving architecture and developing reusable data pipelines, APIs, and components
  • Work closely with the Applied Science team to improve self-learning (reinforcement learning) algorithms
  • Participate in determining OfferFit’s product strategy and roadmap

Python, SQL, GCP, Kubernetes, Airflow, FastAPI, Pandas, Tensorflow, CI/CD, RESTful APIs, Terraform, Software Engineering

Posted 4 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 183,600 - 216,000 USD per year

πŸ” Mental Healthcare

🏒 Company: Headway πŸ‘₯ 201-500 πŸ’° $125,000,000 Series C, over 1 year ago Β· Mental Health Care

  • 6+ years of experience in a data engineering role building products, ideally in a fast-paced environment
  • Good foundations in Python and SQL.
  • Experience with Spark, PySpark, DBT, Snowflake and Airflow
  • Knowledge of visualization tools, such as Metabase, Jupyter Notebooks (Python)
  • A knack for simplifying data, expressing information in charts and tables
  • Collaborate on the design and improvements of the data infrastructure
  • Partner with product and engineering to advocate best practices and build supporting systems and infrastructure for the various data needs
  • Create data pipelines that stitch together various data sources in order to produce valuable business insights
  • Create real-time data pipelines in collaboration with the Data Science team

Python, SQL, ETL, Snowflake, Airflow, Data engineering, RDBMS, Spark, RESTful APIs, Data visualization, Data modeling

Posted 4 days ago
Apply

πŸ“ Poland, Ukraine, Cyprus

🧭 Full-Time

πŸ” Software Development

🏒 Company: Competera πŸ‘₯ 51-100 πŸ’° $3,000,000 Seed, about 1 year ago Β· Artificial Intelligence (AI), Big Data, E-Commerce, Retail, Machine Learning, Analytics, Retail Technology, Information Technology, Enterprise Software, Software

  • 5+ years of experience in data engineer role.
  • Strong knowledge of SQL, Spark, Python, Airflow, binary file formats.
  • Contribute to the development of the new data platform.
  • Collaborate with platform and ML teams to create ETL pipelines that efficiently deliver clean and trustworthy data.
  • Engage in architectural decisions regarding the current and future state of the data platform.
  • Design and optimize data models based on business and engineering needs.

Python, SQL, ETL, Kafka, Airflow, Spark, Data modeling

Posted 4 days ago
Apply

πŸ“ United States, Canada, Mexico, Europe

🧭 Full-Time

πŸ” Software Development

  • 5+ years of experience as a Data Engineer
  • Snowflake architecture expertise for multi-tenant B2B applications
  • Performance optimization for customer-facing data models and analytics.
  • Advanced SQL skills for complex query optimization
  • Proficiency in Python, Scala, or Go for data pipeline development
  • Experience analyzing source data structures and recommending improvements
  • Ability to collaborate with engineering teams on data design
  • Experience with ETL/ELT pipelines (Airflow, dbt)
  • Integration experience with Power BI, Tableau, and Sigma
  • Mentoring skills for report creation using BI tools
  • Data quality management for customer-facing products
  • Experience with GitHub/source control and CI/CD pipelines (GitHub Actions or Jenkins)
  • Understanding of multi-tenant data security and governance
  • Develop and enhance AI workflows in support of the various QAD applications.
  • Complete delivery work committed during the sprint to achieve business goals.
  • Help the business maintain a competitive edge by leveraging the latest AI technologies.
  • Provide subject matter expertise during incidents to resolve customer issues quickly.
  • Participate in forums to explore interests outside of the sprint work and contribute ideas to continuously improve the system.
  • Commit to the team to help the team and the wider business achieve our goals.
  • Write testable and maintainable code.

Python, SQL, ETL, Jenkins, Snowflake, Tableau, Airflow, API testing, Data engineering, Go, CI/CD, RESTful APIs, Scala, Data modeling

Posted 5 days ago
Apply

πŸ“ India

🧭 Full-Time

πŸ” Cybersecurity

🏒 Company: crowdstrikecareers

  • 3+ years experience developing and deploying machine learning solutions to production. Familiarity with typical machine learning algorithms from an engineering perspective (how they are built and used, not necessarily the theory); familiarity with supervised / unsupervised approaches: how, why, and when and labeled data is created and used
  • 3+ years experience with ML Platform tools like Jupyter Notebooks, NVidia Workbench, MLFlow, Ray, Vertex AI etc.
  • Experience building data platform product(s) or features with (one of) Apache Spark, Flink or comparable tools in GCP. Experience with Iceberg is highly desirable.
  • Proficiency in distributed computing and orchestration technologies (Kubernetes, Airflow, etc.)
  • Production experience with infrastructure-as-code tools such as Terraform, FluxCD
  • Expert level experience with Python; Java/Scala exposure is recommended. Ability to write Python interfaces to provide standardized and simplified interfaces for data scientists to utilize internal Crowdstrike tools
  • Expert level experience with CI/CD frameworks such as GitHub Actions
  • Expert level experience with containerization frameworks
  • Strong analytical and problem solving skills, capable of working in a dynamic environment
  • Exceptional interpersonal and communication skills. Work with stakeholders across multiple teams and synthesize their needs into software interfaces and processes.
  • Help design, build, and facilitate adoption of a modern Data+ML platform
  • Modularize complex ML code into standardized and repeatable components
  • Establish and facilitate adoption of repeatable patterns for model development, deployment, and monitoring
  • Build a platform that scales to thousands of users and offers self-service capability to build ML experimentation pipelines
  • Leverage workflow orchestration tools to deploy efficient and scalable execution of complex data and ML pipelines
  • Review code changes from data scientists and champion software development best practices
  • Leverage cloud services like Kubernetes, blob storage, and queues in our cloud first environment

Python, Data Analysis, GCP, Kubernetes, Machine Learning, MLFlow, Airflow, Data engineering, Data science, Spark, CI/CD, RESTful APIs, Terraform

Posted 5 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 115,000 - 129,000 USD per year

πŸ” Education

🏒 Company: amplify_careers

  • BS in Computer Science, Data Science, or equivalent experience.
  • 3+ years of professional software development or data engineering experience
  • Strong computer, data, and analytics engineering fundamentals.
  • Proven fluency in SQL and its use in code-based ETL frameworks preferably dbt
  • Understanding of ETL/ELT pipelines, analytical data modeling, aggregations, and metrics
  • Strong understanding of analytical modeling architectures, including the Kimball dimensional data model design
  • Ability to clearly communicate and present technical concepts to a broad audience both verbally and in written form
  • Build well-tested and documented ELT data pipelines for both full and incremental dbt models to funnel into a fact and dimensional data mart.
  • Work closely with sales on logistics pipeline forecasting and sales pipeline tracking to help focus our sales teams in the right areas for the biggest impact.
  • Align with finance on making sure we have well audited data inline with established financial best practices.
  • Engineer novel datasets which express a student's progress and performance through an adaptive learning experience which allows for flexible comparison across students and deep analysis of individual students.
  • Work with the data science team to measure the impact of design changes on an administrator reporting application.
  • Contribute to leading industry data standards, such as Caliper Analytics, EdFi, or xAPI
  • Craft slowly changing dimensional models that take into account the nuances of K-12 education such as School Year changes and students moving schools or classes.

AWS, PostgreSQL, Python, SQL, Business Intelligence, Data Analysis, ETL, Snowflake, Tableau, Airflow, Data engineering, Analytical Skills, CI/CD, Data visualization, Data modeling, Data analytics

Posted 7 days ago
Apply
πŸ”₯ Data Engineer

πŸ“ United States

🧭 Full-Time

πŸ’Έ 153,000 - 216,000 USD per year

πŸ” Software Development

  • 3+ years of experience in a data engineering role building products, ideally in a fast-paced environment
  • Good foundations in Python and SQL.
  • Experience with Spark, PySpark, DBT, Snowflake and Airflow
  • Knowledge of visualization tools, such as Metabase, Jupyter Notebooks (Python)
  • Collaborate on the design and improvements of the data infrastructure
  • Partner with product and engineering to advocate best practices and build supporting systems and infrastructure for the various data needs
  • Create data pipelines that stitch together various data sources in order to produce valuable business insights
  • Create real-time data pipelines in collaboration with the Data Science team

Python, SQL, ETL, Snowflake, Airflow, Data engineering, Spark, RESTful APIs, Data visualization

Posted 10 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 175,000 - 220,000 USDC per year

πŸ” Software Development

🏒 Company: Orca πŸ‘₯ 11-50 πŸ’° $18,000,000 Series A, over 3 years ago Β· Cryptocurrency, Blockchain, Online Portals, Information Technology

  • A strong track record of working on high-performance, scalable systems with expertise in release engineering, infrastructure, and operations.
  • Extensive experience with AWS services (e.g., ECS, copilot, Cloudwatch) and the ability to troubleshoot and optimize cloud-based systems.
  • Hands-on experience with tools like GitHub Action for reliable and efficient deployment workflows.
  • Familiarity with tools like Datadog to build actionable monitoring and alerting systems.
  • Proficiency in infrastructure-as-code tools like Terraform, and containerization tools like Docker. Experience with orchestrators like Kubernetes or Airflow is a plus.
  • Comfortable working independently in an async environment while collaborating effectively with a team. You understand trade-offs and advocate for pragmatic solutions.
  • Familiarity with Decentralized Finance (DeFi) concepts, AMMs, and the Solana ecosystem is a plus but not required.
  • Design, manage, and optimize AWS infrastructure with a focus on scalability, reliability, and cost efficiency.
  • Triage and resolve critical infrastructure issues proactively.
  • Build and refine CI/CD processes using modern tools, ensuring seamless, secure, and efficient deployments.
  • Develop robust monitoring, logging, and alerting systems using tools like Datadog or Grafana to improve visibility and system performance.
  • Architect systems that handle growth effortlessly, minimize downtime, and maintain high performance.
  • Implement effective alerting mechanisms to prioritize and address critical issues proactively.
  • Optimize and document infrastructure processes, leveraging tools like Terraform, Docker, and Airflow to create scalable and maintainable systems.
  • Partner with engineering teams to design and refine infrastructure that powers features like real-time monitoring, automated transaction execution, and analytics.

AWS, Docker, PostgreSQL, Kubernetes, Airflow, Grafana, Rust, CI/CD, Linux, DevOps, Terraform

Posted 11 days ago
Apply
Shown 10 out of 132

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Key Features of Remote Work in Poland

Poland has been actively developing its online job market. Many companies are adapting to flexible work models and are open to hiring specialists who are not tied to a physical office. This applies to both international corporations and local businesses looking for employees with Polish language skills.

Professionals in IT, marketing, customer support, finance, and translation are in high demand. Thanks to its flexibility, remote work in Poland offers comfortable working conditions and competitive salaries.

For job seekers who want to work remotely for Polish companies but live abroad, remote work provides an opportunity to collaborate with local and international employers without the need to relocate.

Who Is Remote Work Suitable For?

We’ve gathered hundreds of up-to-date offers from Polish employers and international companies looking for Polish-speaking professionals. Remote work is available for various categories of professionals:

  1. For residents of Poland who need a flexible schedule and the ability to work from home.
  2. For expats – foreigners who have moved to Poland and speak the language.
  3. For specialists from other countries who want to work with Polish companies.
  4. For beginners looking to gain experience and build their portfolio.
  5. For experienced professionals seeking a high-paying position with career growth opportunities.

Regardless of your experience and location, Polish-language remote jobs open up new opportunities. Remoote.app will help you find a suitable position that matches your skill level and career ambitions.

Which Specialists Are Most in Demand?

The most in-demand remote jobs for Polish speakers include:

  • Technical specialists β€” development, testing, and support of IT products.
  • Customer service and sales managers β€” communication with Polish clients, business correspondence management, and handling sales processes.
  • Content marketers and SEO specialists β€” creating advertising content and promoting websites.
  • Finance professionals and accountants β€” bookkeeping, tax consulting, and financial analysis.
  • Interpreters and editors β€” adapting content to and from Polish.
  • Project management specialists β€” coordinating processes, monitoring deadlines, and ensuring quality execution.
  • HR managers β€” recruiting, onboarding, and managing Polish and international teams.
  • Analysts β€” data processing, market analysis, and evaluating business strategy effectiveness.

Our platform offers work opportunities for specialists of all levels β€” from beginners to experts. Beginners can gain their first experience in international companies and develop new skills. Mid-level professionals will find job openings with career growth potential and opportunities to expand their professional competencies. Experienced professionals can apply for high-paying positions with managerial roles and strategic tasks.

Employment Options

Remoote.app offers various formats of employment:

  • Full-time β€” stable work with a fixed schedule and a long-term contract.
  • Part-time β€” an opportunity to combine work with studies or other projects.
  • Contract-based β€” short-term assignments or collaboration for the duration of a project.
  • Temporary work β€” positions with a specific timeframe, such as seasonal projects, employee replacements, or urgent tasks.
  • Internships β€” a chance for beginners to gain experience in an international company.

This variety of work formats allows each candidate to choose the optimal employment option based on their goals, schedule, and experience level. Whether you are searching for a stable career, a temporary project, or an opportunity to gain your first professional experience, our platform will help you find the right job for any request.

Advantages of Finding Remote Work through Remoote.app

We have created a convenient tool for quickly finding remote jobs with Polish language skills:

  • AI-Powered Job Processing 

Our platform uses artificial intelligence algorithms to analyze thousands of job listings. The system highlights key job characteristics, saving you from reading long descriptions.

  • Advanced Filters 

You can customize your search based on skills, employment type, and experience level. This ensures you receive only the most relevant vacancies.

  • Up-to-date Database

Job listings are updated several times a day. We automatically remove outdated vacancies, leaving only those that are open for applications.

  • Personalized Notifications

Receive relevant job offers in Poland directly to your email or Telegram. This way, you won’t miss any exciting positions.

  • Resume Builder

Our service will help you create a professional resume tailored specifically for your skills, even if you have no experience writing CVs.

  • Flexible Pricing

You can apply for up to 5 jobs per day for free. If you need more opportunities, convenient subscriptions are available for a week, month, or year.

  • Data Security

We use state-of-the-art encryption technologies to ensure that your personal data remains secure.

With Remoote.app, finding online jobs in Poland becomes simple and convenient. Register now and start searching for remote work from home today!


