Remote Jobs in Poland

Remote work is becoming increasingly popular, especially for those who speak foreign languages. If you're looking for a remote job that uses your Polish language skills, or you want to join an international company, Remoote.app will help you find the right opportunities. Here you can find online jobs in Poland with flexible schedules, competitive salaries, and strong career growth potential!

ETL
709 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

📍 Poland

💸 22,900 - 29,900 PLN per month

🔍 Threat Intelligence

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 2+ years of experience as a Data Engineer, working with large-scale distributed systems.
  • Proven expertise in Lakehouse architecture and Apache Hudi in production environments.
  • Experience with Airflow, Kafka, or streaming data pipelines.
  • Strong programming skills in Python and PySpark.
  • Comfortable working in a cloud-based environment (preferably AWS).
  • Design, build, and manage scalable data pipelines using Python, SQL, and PySpark.
  • Develop and maintain lakehouse architectures, with hands-on use of Apache Hudi for data versioning, upserts, and compaction (see the sketch after this list).
  • Implement efficient ETL/ELT processes for both batch and real-time data ingestion.
  • Optimize data storage and query performance across large datasets (partitioning, indexing, compaction).
  • Ensure data quality, governance, and lineage, integrating validation and monitoring into pipelines.
  • Work with cloud-native services (preferably AWS – S3, Athena, EMR) to support modern data workflows.
  • Collaborate closely with data scientists, analysts, and platform engineers to deliver reliable data infrastructure
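
This listing leans heavily on Apache Hudi's upsert and compaction mechanics. As a rough, hypothetical sketch of that workflow (the table name, keys, and S3 path are invented, and it assumes a Spark session with the Hudi Spark bundle on the classpath):

```python
# Minimal Hudi upsert sketch from PySpark; names and paths are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hudi-upsert-sketch")
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .getOrCreate()
)

# A small batch of updated records; in production this might arrive via Kafka.
batch = spark.createDataFrame(
    [("evt-1", "2024-01-01 10:00:00", "malware"),
     ("evt-2", "2024-01-01 10:05:00", "phishing")],
    ["event_id", "ts", "threat_type"],
)

hudi_options = {
    "hoodie.table.name": "threat_events",                   # hypothetical table
    "hoodie.datasource.write.recordkey.field": "event_id",  # key used to match upserts
    "hoodie.datasource.write.precombine.field": "ts",       # newest record wins on conflict
    "hoodie.datasource.write.operation": "upsert",
}

# Each write creates a new commit; Hudi's compaction and cleaning services
# then manage file sizes and old versions behind the scenes.
(batch.write.format("hudi")
      .options(**hudi_options)
      .mode("append")
      .save("s3://example-bucket/lake/threat_events"))
```

Choosing the record key and precombine field well is most of the design work in a pipeline like this, since they drive deduplication and conflict resolution.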

AWS, Python, SQL, Cloud Computing, ETL, Kafka, Airflow, Data engineering, CI/CD, Terraform

Posted 31 minutes ago
Apply

📍 Brazil

🔍 Data Governance

🏢 Company: TELUS Digital Brazil

  • At least 3 years of experience in Data Governance, Metadata Management, Data Cataloging, or Data Engineering.
  • Have actively participated in the design, implementation, and management of data governance frameworks and data catalogs.
  • Experience working with Collibra and a strong understanding of the Collibra Operating Model, workflows, domains, and policies.
  • Experience working with APIs (see the sketch after this list).
  • Experience with a low-code/no-code ETL tool, such as Informatica
  • Experience working with databases and data modeling projects, as well as practical experience utilizing SQL
  • Effective English communication - able to explain technical and non-technical concepts to different audiences
  • Experience with a general-purpose programming language such as Python or Scala
  • Ability to work well in teams and interact effectively with others
  • Ability to work independently and manage multiple tasks simultaneously while meeting deadlines
  • Conduct detailed assessments of the customer’s data governance framework and current-state Collibra implementation
  • Translate business needs into functional specifications for Collibra use cases
  • Serve as a trusted advisor to the customer’s data governance leadership
  • Lead requirement-gathering sessions and workshops with business users and technical teams
  • Collaborate with cross-functional teams to ensure data quality, support data-driven decision-making, and improve the functionality of our data systems
  • Collaborate with project managers and product owners to assist in prioritizing, estimating, and planning development tasks
  • Provide constructive feedback and share expertise with fellow team members, fostering mutual growth and learning
  • Demonstrate a commitment to accessibility and ensure that your work considers and positively impacts others
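
Since the role pairs Collibra with API work, here is a minimal, hypothetical sketch of listing assets through Collibra's REST API using Python's requests; the instance URL and credentials are placeholders, and the endpoint path follows Collibra's v2 Core API convention, so verify it against your version's documentation.

```python
# Hypothetical sketch: search assets in a Collibra instance via its REST API.
import requests

BASE_URL = "https://your-instance.collibra.com"    # placeholder instance
session = requests.Session()
session.auth = ("svc_governance", "app-password")  # placeholder credentials

resp = session.get(
    f"{BASE_URL}/rest/2.0/assets",
    params={"name": "customer", "nameMatchMode": "ANYWHERE", "limit": 50},
    timeout=30,
)
resp.raise_for_status()

# The v2 API returns a paged envelope with a "results" array.
for asset in resp.json().get("results", []):
    print(asset["id"], asset["name"])
```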

Python, SQL, ETL, Data engineering, RESTful APIs, Data visualization, Data modeling, Data analytics, Data management

Posted about 1 hour ago
Apply

📍 United States

🧭 Full-Time

💸 135000.0 - 170000.0 USD per year

🔍 Healthcare

🏢 Company: SmarterDx 👥 101-250 💰 $50,000,000 Series B about 1 year ago · Artificial Intelligence (AI) · Hospital · Information Technology · Health Care

  • 3+ years of analytics engineering experience in the healthcare industry, involving clinical and/or billing/claims data.
  • You are very well-versed in SQL and ETL processes; significant experience with dbt is a must
  • You have experience in a general purpose programming language (Python, Java, Scala, Ruby, etc.)
  • You have strong experience in data modeling and its implementation in production data pipelines.
  • You are comfortable with the essentials of data orchestration
  • Designing, developing, and maintaining dbt data models that support our healthcare analytics products (see the sketch after this list).
  • Integrating and transforming customer data to conform to our data specifications and standards.
  • Collaborating with cross-functional teams to translate data and business requirements into effective data models.
  • Configuring and improving data pipelines that integrate and connect the data models.
  • Conducting QA and testing on data models to ensure data accuracy, reliability, and performance.
  • Applying industry standards and best practices around data modeling, testing, and data pipelining.
  • Participating in an on-call rotation with other engineers to help investigate and resolve data-related issues.
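
For a sense of the dbt-centered workflow this role describes, below is a small, hedged sketch that builds and tests a model programmatically. It assumes dbt-core 1.5+ (which exposes dbtRunner), a configured dbt project and profile, and an invented model name, stg_claims.

```python
# Sketch: build a dbt model and its downstream dependents, then run their tests.
# Requires dbt-core >= 1.5 and a working dbt project/profile.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# "stg_claims+" selects the (hypothetical) staging model plus everything downstream.
for command in (["run", "--select", "stg_claims+"],
                ["test", "--select", "stg_claims+"]):
    res: dbtRunnerResult = dbt.invoke(command)
    if not res.success:
        raise SystemExit(f"dbt {command[0]} failed")
```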

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Snowflake, Data engineering, Data modeling, Data analytics

Posted about 1 hour ago
Apply

📍 United States

💸 175,000 - 200,000 USD per year

🔍 Manufacturing

  • 10+ years of experience in data, analytics, and solution architecture roles.
  • Proven success in a pre-sales or client-facing architecture role.
  • Deep understanding of Microsoft Azure data services, including: Azure Synapse, Data Factory, Data Lake, Databricks/Spark, Fabric, etc.
  • Experience aligning modern data architectures to business strategies in the manufacturing or industrial sector.
  • Experience scoping, pricing, and estimating delivery efforts in early-stage client engagements.
  • Clear and confident communicator who can build trust with technical and business stakeholders alike.
  • Lead strategic discovery sessions with executives and business stakeholders to understand business goals, challenges, and technical landscapes.
  • Define solution architectures and delivery roadmaps that align with client objectives and Microsoft best practices.
  • Scope, estimate, and present proposals for complex Data, AI, and Azure-based projects.
  • Architect end-to-end data and analytics solutions using Microsoft Azure (Data Lake, Synapse, ADF, Power BI, Microsoft Fabric, Azure AI Services, etc.); see the sketch after this list.
  • Oversee solution quality from ideation through handoff to delivery teams.
  • Support and guide sales consultants to align solutions, advance deals, and close business effectively.
  • Collaborate with Delivery to ensure project success and customer satisfaction.
  • Contribute to thought leadership, IP development, and reusable solution frameworks.
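
As a hedged illustration of the Azure lakehouse patterns this role architects, the sketch below reads a hypothetical Delta table from ADLS Gen2 and writes a curated aggregate for downstream Power BI models; the storage account, containers, and columns are all invented, and it assumes a Spark session (Databricks or Synapse Spark) already authenticated to the lake.

```python
# Hypothetical bronze-to-silver step in an Azure lakehouse.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("azure-lakehouse-sketch").getOrCreate()

# Read raw telemetry from the (invented) bronze container in ADLS Gen2.
raw = spark.read.format("delta").load(
    "abfss://bronze@examplelake.dfs.core.windows.net/plant_telemetry"
)

# Aggregate to a daily grain per plant for reporting.
daily = (
    raw.groupBy(F.to_date("event_ts").alias("day"), "plant_id")
       .agg(F.avg("sensor_value").alias("avg_sensor_value"))
)

daily.write.format("delta").mode("overwrite").save(
    "abfss://silver@examplelake.dfs.core.windows.net/plant_telemetry_daily"
)
```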

SQL, Cloud Computing, ETL, Machine Learning, Microsoft Azure, Microsoft Power BI, Azure, Data engineering, RESTful APIs, Data modeling, Data analytics, Data management

Posted about 3 hours ago
Apply
🔥 Senior Data Engineer
Posted about 4 hours ago

📍 United States of America

🏢 Company: IDEXX

  • Bachelor’s degree in Computer Science, Computer Engineering, Information Systems, Information Systems Engineering, or a related field and 5 years of experience, or a Master’s degree in one of those fields and 3 years of related professional experience.
  • Advanced SQL knowledge and experience working with relational databases, including Snowflake, Oracle, Redshift.
  • Experience with AWS or Azure cloud platforms
  • Experience with data pipeline and workflow scheduling tools: Apache Airflow, Informatica (see the sketch after this list).
  • Experience with ETL/ELT tools and data processing techniques
  • Experience in database design, development, and modeling
  • 3 years of related professional experience with object-oriented languages: Python, Java, and Scala
  • Design and implement scalable, reliable distributed data processing frameworks and analytical infrastructure
  • Design metadata and schemas for assigned projects based on a logical model
  • Create scripts for physical data layout
  • Write scripts to load test data
  • Validate schema design
  • Develop and implement node cluster models for unstructured data storage and metadata
  • Design advanced Structured Query Language (SQL), data definition language (DDL), and Python scripts
  • Define, design, and implement data management, storage, backup and recovery solutions
  • Design automated software deployment functionality
  • Monitor structural performance and utilization, identifying problems and implementing solutions
  • Lead the creation of standards, best practices and new processes for operational integration of new technology solutions
  • Ensure environments are compliant with defined standards and operational procedures
  • Implement measures to ensure data accuracy and accessibility, constantly monitoring and refining the performance of data management systems
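
To make the Airflow-plus-Snowflake requirement concrete, here is a minimal, hypothetical DAG that runs a COPY INTO load each day. The connection id, schema, and stage names are invented, and it assumes Airflow 2.x with the apache-airflow-providers-snowflake package installed (recent provider releases favor SQLExecuteQueryOperator over SnowflakeOperator).

```python
# Hypothetical daily load of staged Parquet files into a Snowflake table.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="load_lab_results",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = SnowflakeOperator(
        task_id="copy_into_lab_results",
        snowflake_conn_id="snowflake_default",  # placeholder connection
        sql="""
            COPY INTO analytics.lab_results
            FROM @analytics.stage_lab_results
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
        """,
    )
```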

AWS, Python, SQL, Apache Airflow, Cloud Computing, ETL, Java, Oracle, Snowflake, Azure, Data engineering, Scala, Data modeling, Data management

Posted about 4 hours ago
Apply

📍 United States

🧭 Full-Time

💸 138,000 - 150,000 USD per year

🔍 Education

🏢 Company: Gradient Learning

  • 5+ years of experience working in the K-12 space
  • 3+ years of experience in data product development
  • 3+ years of experience translating complex data into educator-friendly visualizations using Tableau
  • 3+ years of people management experience
  • 3+ years of experience with Snowflake or comparable cloud-based data warehousing platforms (strongly preferred)
  • Experience using AI or machine learning to enhance data analysis or deliver scalable, educator-facing insights (strongly preferred)
  • Familiarity with LTI standards, LMS platforms, and education data interoperability; direct experience with Canvas LMS (strongly preferred)
  • Knowledge of data privacy, security, and protection standards, particularly as they relate to PII and educational data (FERPA, COPPA, etc.) (preferred)
  • Design, Refine, and Lead the Data & Insights Product Strategy
  • Oversee Data & Insights Product Development and Delivery
  • Strengthen Data Infrastructure in Partnership with Information Systems
  • Lead the Data & Insights Product Delivery Team

SQL, Data Analysis, ETL, Machine Learning, People Management, Product Management, Snowflake, User Experience Design, Cross-functional Team Leadership, Tableau, Product Development, Data engineering, Communication Skills, Agile methodologies, RESTful APIs, Data visualization, Stakeholder management, Strategic thinking, Data modeling, Data analytics, Data management

Posted about 5 hours ago
Apply
🔥 Senior Analytics Engineer
Posted about 6 hours ago

📍 USA

🏢 Company: Engine

  • 5+ years of industry experience as an Analytics Engineer in high-growth environments.
  • Strong expertise using SQL, Snowflake, Airflow, and BI tools such as Looker.
  • A Bachelor's degree in Computer Science, Information Technology, Engineering, or a related technical field, or equivalent practical experience
  • Develop and implement tools and strategies to improve data quality, reliability, and governance at Engine (see the sketch after this list).
  • Collaborate with engineering, analytics, and business stakeholders to ensure high quality data empowers every business decision to drive measurable business impact.
  • Enhance data infrastructure and analytics capabilities by working closely with our data infrastructure and analyst teams.
  • Design and build our data pipelines to support long-term business growth without compromising our day-to-day execution speed.
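
As a rough sketch of the data-quality tooling this role would build, the snippet below runs a couple of invented SQL assertions against Snowflake with the official Python connector; the account, credentials, and table names are placeholders.

```python
# Hypothetical data-quality assertions run against Snowflake.
import snowflake.connector

# (check name, SQL returning a single number, predicate the number must satisfy)
CHECKS = [
    ("yesterday_has_bookings",
     "SELECT COUNT(*) FROM analytics.bookings "
     "WHERE booked_at >= DATEADD(day, -1, CURRENT_DATE)",
     lambda n: n > 0),
    ("no_null_booking_ids",
     "SELECT COUNT(*) FROM analytics.bookings WHERE booking_id IS NULL",
     lambda n: n == 0),
]

conn = snowflake.connector.connect(
    account="example-account", user="dq_bot", password="change-me",
    warehouse="ANALYTICS_WH", database="ANALYTICS",
)
try:
    cur = conn.cursor()
    failures = []
    for name, sql, passes in CHECKS:
        (value,) = cur.execute(sql).fetchone()
        if not passes(value):
            failures.append(f"{name}: got {value}")
    if failures:
        raise SystemExit("data-quality failures:\n" + "\n".join(failures))
finally:
    conn.close()
```

Checks like these typically run on a schedule (e.g., from Airflow) so failures surface before stakeholders see bad numbers.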

AWS, Docker, SQL, ETL, Git, Snowflake, Airflow

Posted about 6 hours ago
Apply
🔥 Data Analyst - Finance
Posted about 8 hours ago

📍 USA

🧭 Full-Time

🔍 Fintech

🏢 Company: Comun

  • Expert-level SQL knowledge with demonstrated ability to optimize complex queries (non-negotiable)
  • 3+ years of practical experience in data engineering or analytics roles with financial data
  • 3+ years of experience in a similar role at a fintech or financial services company in the payments or lending space
  • Solid understanding of finance concepts and principles
  • Proven track record building data pipelines and ETL processes
  • Experience implementing cost modeling and optimization analytics
  • Problem-solving mindset with strong analytical skills
  • Excellent communication skills to explain complex technical and financial concepts
  • Design and implement scalable data pipelines to establish and maintain a solid fund flow process
  • Automate financial reconciliation processes and generate actionable reports (see the sketch after this list)
  • Develop and maintain revenue and cost models to identify growth opportunities and provide insights for strategic decision-making
  • Build analytical tools to identify and quantify cost optimization opportunities across the organization
  • Monitor vendor performance metrics and evaluate new vendor opportunities
  • Implement data solutions to detect financial anomalies and uncover efficiency opportunities that drive business value
  • Perform cohort-level performance analysis to develop a deeper understanding of customer unit economics
  • Collaborate with finance, data, growth, product, and engineering teams to develop robust financial data architecture
  • Contribute to our mission of financial inclusion by enabling data-informed product and pricing decisions
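
One hedged example of the reconciliation work described above: a full-outer-join query that surfaces settlement dates where an internal ledger and a processor's settlement feed disagree. The schema and table names are invented; the SQL itself is standard and should run on PostgreSQL or Snowflake via any DB-API connection.

```python
# Hypothetical fund-flow reconciliation: compare ledger vs. processor totals by day.
RECONCILIATION_SQL = """
WITH ledger AS (
    SELECT settlement_date, SUM(amount_cents) AS ledger_cents
    FROM finance.ledger_entries GROUP BY settlement_date
), processor AS (
    SELECT settlement_date, SUM(amount_cents) AS processor_cents
    FROM finance.processor_settlements GROUP BY settlement_date
)
SELECT COALESCE(l.settlement_date, p.settlement_date) AS settlement_date,
       COALESCE(l.ledger_cents, 0)                    AS ledger_cents,
       COALESCE(p.processor_cents, 0)                 AS processor_cents,
       COALESCE(l.ledger_cents, 0) - COALESCE(p.processor_cents, 0) AS diff_cents
FROM ledger l
FULL OUTER JOIN processor p ON l.settlement_date = p.settlement_date
WHERE COALESCE(l.ledger_cents, 0) <> COALESCE(p.processor_cents, 0)
ORDER BY 1
"""

def report_breaks(conn):
    """Return the settlement dates where ledger and processor totals disagree."""
    cur = conn.cursor()
    cur.execute(RECONCILIATION_SQL)
    return cur.fetchall()
```

Working in integer cents sidesteps the floating-point rounding issues that plague financial reconciliation.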

AWS, PostgreSQL, Python, SQL, Data Analysis, ETL, Snowflake, Data engineering, FastAPI, Financial analysis, Data modeling, Finance

Posted about 8 hours ago
Apply
🔥 Data Engineer (Contract)
Posted about 9 hours ago

📍 LatAm

🧭 Contract

🏢 Company: Able · Rental · Property Management · Real Estate

  • 10+ years of data engineering experience with enterprise-scale systems
  • Expertise in Apache Spark and Delta Lake, including ACID transactions, time travel, Z-ordering, and compaction
  • Deep knowledge of Databricks (Jobs, Clusters, Workspaces, Delta Live Tables, Unity Catalog)
  • Experience building scalable ETL/ELT pipelines using tools like Airflow, Glue, Dataflow, or ADF
  • Advanced SQL for data modeling and transformation
  • Strong programming skills in Python (or Scala)
  • Hands-on experience with data formats such as Parquet, Avro, and JSON
  • Familiarity with schema evolution, versioning, and backfilling strategies
  • Working knowledge of at least one major cloud platform: AWS (S3, Athena, Redshift, Glue Catalog, Step Functions), GCP (BigQuery, Cloud Storage, Dataflow, Pub/Sub), or Azure (Synapse, Data Factory, Azure Databricks)
  • Experience designing data architectures with real-time or streaming data (Kafka, Kinesis)
  • Consulting or client-facing experience with strong communication and leadership skills
  • Experience with data mesh architectures and domain-driven data design
  • Knowledge of metadata management, data cataloging, and lineage tracking tools
  • Shape large-scale data architecture vision and roadmap across client engagements
  • Establish governance, security frameworks, and regulatory compliance standards
  • Lead strategy around platform selection, integration, and scaling
  • Guide organizations in adopting data lakehouse and federated data models
  • Lead technical discovery sessions to understand client needs
  • Translate complex architectures into clear, actionable value for stakeholders
  • Build trusted advisor relationships and guide strategic decisions
  • Align architecture recommendations with business growth and goals
  • Design and implement modern data lakehouse architectures with Delta Lake and Databricks (see the sketch after this list)
  • Build and manage ETL/ELT pipelines at scale using Spark (PySpark preferred)
  • Leverage Delta Live Tables, Unity Catalog, and schema evolution features
  • Optimize storage and queries on cloud object storage (e.g., AWS S3, Azure Data Lake)
  • Integrate with cloud-native services like AWS Glue, GCP Dataflow, and Azure Synapse Analytics
  • Implement data quality monitoring, lineage tracking, and schema versioning
  • Build scalable pipelines with tools like Apache Airflow, Step Functions, and Cloud Composer
  • Develop cost-optimized, scalable, and compliant data solutions
  • Design POCs and pilots to validate technical approaches
  • Translate business requirements into production-ready data systems
  • Define and track success metrics for platform and pipeline initiatives
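
As a sketch of the Delta Lake work this engagement centers on, the snippet below performs an upsert with DeltaTable.merge and then compacts the table. The paths and key are hypothetical; it assumes a Spark session configured with delta-spark, and the OPTIMIZE ... ZORDER statement assumes Databricks (or a Delta build that supports it).

```python
# Hypothetical Delta Lake upsert plus compaction on a silver table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-merge-sketch").getOrCreate()

# New batch of listing records landed by an upstream job (invented path).
updates = spark.read.parquet("s3://example-landing/listings/2024-06-01/")

target = DeltaTable.forPath(spark, "s3://example-lake/silver/listings")

(
    target.alias("t")
    .merge(updates.alias("u"), "t.listing_id = u.listing_id")
    .whenMatchedUpdateAll()      # ACID update of existing rows
    .whenNotMatchedInsertAll()   # insert genuinely new rows
    .execute()
)

# Periodic compaction keeps small files under control and co-locates keys.
spark.sql("OPTIMIZE delta.`s3://example-lake/silver/listings` ZORDER BY (listing_id)")
```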

AWS, Python, SQL, Cloud Computing, ETL, GCP, Kafka, Airflow, Azure, Data engineering, Scala, Data modeling

Posted about 9 hours ago
Apply
🔥 Data Engineer (m/f/d)
Posted about 10 hours ago

📍 Germany

🧭 Full-Time

🏢 Company: Roadsurfer 👥 501-1000 💰 $5,330,478 almost 4 years ago · Leisure · Rental · Tourism · Recreational Vehicles

  • Experience with Segment, Braze, or similar CDP/CEP platforms
  • Basic knowledge of data transformation tools
  • Familiarity with data governance practices, such as data ownership, naming conventions, and data lineage
  • Experience implementing data privacy measures such as consent tracking and anonymization
  • Familiarity with data quality metrics and monitoring techniques
  • Understanding of data privacy regulations (GDPR, CCPA)
  • Good communication skills, with the ability to work with cross-functional teams and stakeholders
  • Ensure reliability through automated tests, versioned models, and data lineage
  • Assist in implementing data governance policies to ensure data consistency, quality, and integrity across the CDP and CEP platforms
  • Support the automation of data validation and quality checks, including schema validation and data integrity monitoring (see the sketch after this list)
  • Help define and track data quality metrics and provide regular insights on data cleanliness and health
  • Assist in ensuring compliance with data privacy regulations (e.g., GDPR, CCPA), including implementing consent tracking and anonymization measures
  • Work with cross-functional teams to standardize data definitions, naming conventions, and ownership practices
  • Help maintain data cleanliness through automated data cleanup processes and identify areas for improvement
  • Support the analytics team by ensuring data is structured correctly for reporting and analysis
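
To illustrate the validation and privacy measures listed above, here is a small, hypothetical Python sketch that checks an incoming CDP event against a JSON Schema, drops non-consented events, and pseudonymizes the email address; the schema, salt handling, and field names are invented.

```python
# Hypothetical CDP event gate: schema validation, consent check, pseudonymization.
import hashlib

from jsonschema import ValidationError, validate  # pip install jsonschema

EVENT_SCHEMA = {
    "type": "object",
    "required": ["event", "user_id", "consent_marketing"],
    "properties": {
        "event": {"type": "string"},
        "user_id": {"type": "string"},
        "email": {"type": "string"},
        "consent_marketing": {"type": "boolean"},
    },
}

SALT = b"rotate-me"  # placeholder; load from a secret store in practice

def clean_event(event):
    """Validate the event, enforce consent, and hash PII before forwarding."""
    try:
        validate(event, EVENT_SCHEMA)
    except ValidationError as err:
        print(f"rejected event: {err.message}")
        return None
    if not event["consent_marketing"]:
        return None  # GDPR/CCPA: no consent, no downstream processing
    if "email" in event:
        email = event.pop("email")
        event["email_hash"] = hashlib.sha256(SALT + email.encode()).hexdigest()
    return event

print(clean_event({"event": "booking_started", "user_id": "u1",
                   "email": "a@example.com", "consent_marketing": True}))
```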

SQL, Apache Airflow, ETL, Data engineering, Postgres, RESTful APIs, Compliance, JSON, Data visualization, Data modeling, Data analytics, Data management

Posted about 10 hours ago
Apply
Shown 10 out of 709

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Key Features of Remote Work in Poland

Poland has been actively developing its online job market. Many companies are adapting to flexible work models and are open to hiring specialists who are not tied to a physical office. This applies to both international corporations and local businesses looking for employees with Polish language skills.

Professionals in IT, marketing, customer support, finance, and translation are in high demand. Thanks to its flexibility, remote work in Poland offers comfortable working conditions and competitive salaries.

For job seekers living abroad who want to work for Polish companies, remote work makes it possible to collaborate with local and international employers without relocating.

Who is Remote Work Suitable for?

We’ve gathered hundreds of up-to-date offers from Polish employers and international companies looking for Polish-speaking professionals. Remote work is available for various categories of professionals:

  1. For residents of Poland who need a flexible schedule and the ability to work from home.
  2. For expats – foreigners who have moved to Poland and speak the language.
  3. For specialists from other countries who want to work with Polish companies.
  4. For beginners looking to gain experience and build their portfolio.
  5. For experienced professionals seeking a high-paying position with career growth opportunities.

Regardless of your experience and location, remote jobs with the Polish language open up new opportunities. Remoote.app will help you find a suitable position that matches your skill level and career ambitions.

Which Specialists are Most in Demand?

The most in-demand remote jobs for Polish speakers include:

  • Technical specialists — development, testing, and support of IT products.
  • Customer service and sales managers — communication with Polish clients, business correspondence management, and handling sales processes.
  • Content marketers and SEO specialists — creating advertising content and promoting websites.
  • Finance professionals and accountants — bookkeeping, tax consulting, and financial analysis.
  • Interpreters and editors — adapting content to and from Polish.
  • Project management specialists — coordinating processes, monitoring deadlines, and ensuring quality execution.
  • HR managers — recruiting, onboarding, and managing Polish and international teams.
  • Analysts — data processing, market analysis, and evaluating business strategy effectiveness.

Our platform offers work opportunities for specialists of all levels — from beginners to experts. Beginners can gain their first experience in international companies and develop new skills. Mid-level professionals will find job openings with career growth potential and opportunities to expand their professional competencies. Experienced professionals can apply for high-paying positions with managerial roles and strategic tasks.

Employment Options

Remoote.app offers various formats of employment:

  • Full-time — stable work with a fixed schedule and a long-term contract.
  • Part-time — an opportunity to combine work with studies or other projects.
  • Contract-based — short-term assignments or collaboration for the duration of a project.
  • Temporary work — positions with a specific timeframe, such as seasonal projects, employee replacements, or urgent tasks.
  • Internships — a chance for beginners to gain experience in an international company.

This variety of work formats allows each candidate to choose the optimal employment option based on their goals, schedule, and experience level. Whether you are searching for a stable career, a temporary project, or an opportunity to gain your first professional experience, our platform will help you find the right job.

Advantages of Finding Remote Work through Remoote.app

We have created a convenient tool for quickly finding remote jobs with Polish language skills:

  • AI-Powered Job Processing 

Our platform uses artificial intelligence algorithms to analyze thousands of job listings. The system highlights key job characteristics, saving you from reading long descriptions.

  • Advanced Filters 

You can customize your search based on skills, employment type, and experience level. This ensures you receive only the most relevant vacancies.

  • Up-to-date Database

Job listings are updated several times a day. We automatically remove outdated vacancies, leaving only those that are open for applications.

  • Personalized Notifications

Receive relevant job offers in Poland directly to your email or Telegram. This way, you won’t miss any exciting positions.

  • Resume Builder

Our service will help you create a professional resume tailored specifically for your skills, even if you have no experience writing CVs.

  • Flexible Pricing

You can apply for up to 5 jobs per day for free. If you need more opportunities, convenient subscriptions are available for a week, month, or year.

  • Data Security

We use state-of-the-art encryption technologies to ensure that your personal data remains secure.

With Remoote.app, finding online jobs in Poland becomes simple and convenient. Register now and start searching for remote work from home today!


