Data engineering Jobs

Find remote positions requiring data engineering skills. Browse opportunities where you can apply your expertise and grow your career.

351 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ United States, Latin America, India

🧭 Full-Time

πŸ” Data services, Cloud services, AI services

  • 5+ years of B2B sales experience in technology services, with cloud and data services experience being a plus.
  • Experience co-selling with technology partners, especially Snowflake, Azure, and AWS.
  • Strong verbal, written and presentation skills.
  • Ability to craft proposals, negotiate contracts and deliverables.
  • Enthusiasm for working in a fast-paced, high-growth environment.
  • Ability to work cross-functionally and include others in the sales process.
  • Natural curiosity about customers and their business.
  • Proven record of sourcing, negotiating, and closing $250K+ deals.

  • Exceed quarterly/annual sales targets in all practice areas (Data Engineering, Managed Services, Machine Learning, and Analytics).
  • Co-Sell with Snowflake, AWS, and other ecosystem partners to sell transformative data and AI engineering services and solutions into enterprise companies.
  • Drive and execute lead generation activities to grow market share and awareness.
  • Partner with managing directors and other team members to scope and present solutions.
  • Take accountability for the phData brand in your territory.
  • Navigate complex organizational structures to identify key stakeholders.
  • Accurately forecast, update deals, and manage sales pipeline.
  • Complete training for continuous improvement.

AWS, SQL, Cloud Computing, ETL, GCP, Machine Learning, Snowflake, Azure, Data engineering, REST API, Negotiation, Lead Generation, Data analytics

Posted about 16 hours ago
Apply

πŸ“ California, Colorado, Illinois, Florida, Georgia, Massachusetts, Nevada, New Jersey, New York, North Carolina, Oregon, Pennsylvania, South Carolina, Texas, Virginia, Washington

🧭 Full-Time

πŸ’Έ 190,000 - 210,000 USD per year

πŸ” Technology

🏒 Company: Crunchbase πŸ‘₯ 101-250 πŸ’° $50,000,000 Series D over 2 years ago πŸ«‚ Last layoff over 1 year ago. Database, Information Services, Business Intelligence, Lead Generation, Marketing Automation, Software

  • Excellent communication skills to simplify complex issues.
  • Proven track record in dimensional data modeling and ETL pipeline development.
  • Proficiency in Python.
  • Familiarity with data warehousing technologies and cloud platforms.
  • Experience implementing NLP and ML algorithms in production environments.
  • Strong problem-solving skills.
  • Demonstrated leadership skills to guide and mentor team members.
  • Proficient with Google Office Suite or related software.

  • Architect and design new dimensional data models aligning with business objectives.
  • Build, monitor, and maintain analytics and production data ETL pipelines.
  • Develop tools for engineers and the Product team to query and analyze datasets.
  • Enable implementation of NLP and ML algorithms at scale.
  • Contribute to architecture design considering business requirements.
  • Mentor junior engineers and foster professional growth.
  • Collaborate with various teams to integrate solutions.
  • Evaluate and recommend tools to enhance data capabilities.
  • Communicate complex technical concepts to non-technical stakeholders.

Python, SQL, ETL, Machine Learning, Data engineering, Data science

Posted 3 days ago
Apply

πŸ“ India

🧭 Full-Time

πŸ” Healthcare

  • Have 8+ years of related experience with a BS, 6+ years with an MS, or 3+ years with a PhD.
  • Demonstrated significant track record of delivery.
  • At least 4 years of experience dealing with regulated data in a startup environment.
  • Deep experience with AWS technologies such as Redshift, S3, MSK, and Lake Formation.
  • Proficiency in Kafka and distributed systems.
  • Expertise in programming languages like Clojure or Python for data applications.
  • Experience in converting data science projects into usable product modules.
  • Familiarity with systems engineering and orchestration tools like Terraform.
  • Notable open-source contributions to relevant communities.

  • Support the development of next-generation, privacy-aware data architecture using technologies like Kafka and AWS.
  • Prototype new algorithms for data products.
  • Build and scale data ingestion and ETL processes.
  • Develop familiarity with regulatory frameworks such as HIPAA and GDPR.
  • Guide development of technical solutions for challenging problems.
  • Drive best practices for high-performance, large-scale systems.
  • Present proposals at architectural design reviews.
  • Deliver large features and improvements independently.
  • Focus on ethical data practices and business needs.

AWS, Python, ETL, Kafka, Data engineering, Terraform

Posted 3 days ago
Apply

πŸ“ Canada

🧭 Full-Time

πŸ” Threat intelligence and investigations

🏒 Company: LifeRaft πŸ‘₯ 51-100 πŸ’° about 4 years ago. Open Source, Security, Information Technology

  • Bachelor’s degree in Computer Science, Information Systems, or a related field; advanced degrees preferred.
  • 15+ years of progressive experience in software engineering, with a focus on managing and leading teams.
  • Proven experience in a leadership role within a data engineering or related function.
  • Experience with PHP (or Python or Ruby), Javascript, and React.
  • In-depth knowledge of software architecture and system scalability.
  • Demonstrated success in building and leading high-performance software engineering teams.
  • Strong people management and mentorship skills.
  • Experience in developing and executing strategic plans for data engineering initiatives.
  • Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP).
  • Excellent communication skills, able to convey complex technical concepts to non-technical stakeholders.

  • Collaborate with the CTO and executive leadership to establish the overarching technical vision, engineering processes, and best practices.
  • Build, inspire, and manage a high-performing team of software engineers, ensuring robust professional development and mentorship.
  • Direct the design and development of scalable and secure backend architectures, ensuring robust data pipelines and infrastructure.
  • Define and execute strategies for data ingestion, processing, and storage to ensure accuracy, scalability, and reliability.
  • Oversee UI/UX practices and continuously refine front-end architecture.
  • Ensure high availability and performance of our SaaS offering using leading cloud platforms.
  • Partner with data science, analytics, and business intelligence teams.
  • Evaluate and manage relationships with third-party vendors and service providers.
  • Implement processes that drive ongoing optimization of engineering workflows.

AWS, PHP, Python, Cloud Computing, GCP, Javascript, Software Architecture, UI Design, Azure, Data engineering, React, Microservices, Compliance

Posted 3 days ago
Apply

πŸ“ US

🧭 Full-Time

πŸ’Έ 110,000 - 125,000 USD per year

πŸ” Beauty industry

  • Bachelor’s degree in data engineering or a relevant discipline.
  • 5+ years of hands-on experience as a data engineer managing data pipelines.
  • Advanced proficiency in modern ETL/ELT stacks, with expertise in Fivetran, DBT, and Snowflake.
  • Understanding of data analytics and tools, including Metabase and Power BI.
  • Expert-level Python and SQL skills with deep experience in DBT transformations.
  • Strong understanding of cloud-native data architectures and modern data warehousing principles.
  • Familiarity with data security, governance, and compliance standards.
  • Adept at designing and delivering interactive dashboards.

  • Building and managing robust ETL/ELT pipelines using Fivetran, DBT, and Snowflake.
  • Developing and optimizing data models and analytical reports.
  • Collaborating with stakeholders to create data pipelines that align with BI needs.
  • Designing and developing SQL-based solutions to transform data.
  • Building the reporting infrastructure from ambiguous requirements.
  • Continuously improving customer experience and providing technical leadership.
  • Evangelizing best practices and technologies.

Python, SQL, Business Intelligence, ETL, Snowflake, Data engineering, Data modeling

Posted 3 days ago
Apply

πŸ“ Slovakia, Czechia

🧭 Full-Time

πŸ” Cybersecurity

🏒 Company: SentinelOne πŸ‘₯ 1001-5000 πŸ’° Post-IPO Equity over 3 years ago πŸ«‚ Last layoff over 1 year ago. Artificial Intelligence (AI), Security, Cyber Security, Network Security

  • At least a Bachelor’s degree and 5+ years of related experience, or equivalent.
  • Expertise in building core platform technology for unstructured, structured, and semi-structured data processing and search, preferably in a cybersecurity company.
  • Familiarity with cloud services, batch/stream data processing, distributed search, and data storage formats.
  • Familiarity with open-source frameworks like OCSF and OTel.
  • Strong interpersonal and communication skills.
  • Ability to work effectively with local and remote teams in different time zones.
  • Quantitative and business analysis skills.
  • Experience in evangelizing products to customers and stakeholders.

  • Develop and execute the federated search roadmap, including defining the priority of data lakes.
  • Gather requirements for federated search while collaborating with internal teams and customers.
  • Collaborate with engineering teams to define technical requirements and ensure scalable solutions.
  • Create documentation for customers regarding the value of federated search.
  • Implement analytics tools to track usage and optimize features based on data.
  • Stay abreast of cybersecurity trends to keep the product innovative.

AWS, SQL, Cloud Computing, Cybersecurity, ElasticSearch, Machine Learning, Apache Kafka, Data engineering, Data science, REST API

Posted 3 days ago
Apply

πŸ“ US, Canada, Mexico

🧭 Full-Time

πŸ’Έ 175,000 USD per year

πŸ” Digital tools for hourly employees

🏒 Company: TeamSense πŸ‘₯ 11-50 πŸ’° Seed 11 months ago. Information Services, Information Technology, Software

  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related technical field.
  • 7+ years of professional experience in software engineering, including 5+ years in data engineering.
  • Proven expertise in building and managing scalable data platforms.
  • Proficiency in Python.
  • Strong knowledge of SQL and data modeling.
  • Experience with data migration and database systems like PostgreSQL and MongoDB.
  • Exceptional problem-solving skills in optimizing data systems.

  • Contribute to the design, development, and maintenance of a scalable and reliable data platform.
  • Analyze the current database and warehouse.
  • Design and develop scalable ETL/ELT pipelines to support data migration.
  • Build and maintain robust, scalable, and high-performing data platforms, including data lakes and/or warehouses.
  • Lead by example and implement data engineering best practices and design patterns.
  • Be the subject matter expert and guide design reviews for new features impacting data.

PostgreSQL, Python, SQL, ETL, MongoDB, Data engineering, Data modeling

Posted 3 days ago
Apply

πŸ“ Ukraine

  • 4+ years of experience in software/data engineering, data architecture, or a related field.
  • Strong programming skills in at least one language: Java, Scala, Python, or Go.
  • Experience with SQL and data modeling.
  • Hands-on experience with Apache Big Data frameworks such as Hadoop, Hive, Spark, Airflow, etc.
  • Proficiency in AWS cloud services.
  • Strong understanding of distributed systems, large-scale data processing, and data storage/retrieval.
  • Experience with data governance, security, and compliance is a plus.
  • Familiarity with CI/CD and DevOps practices is a plus.
  • Excellent communication and problem-solving skills.

  • Design, build, and maintain scalable and reliable data storage solutions.
  • Optimize and scale the platform for increasing data volumes and user requests.
  • Improve data storage, retrieval, query performance, and overall system performance.
  • Collaborate with data scientists, analysts, and stakeholders for tailored solutions.
  • Ensure proper integration of data pipelines, analytics tools, and ETL processes.
  • Troubleshoot and resolve platform issues in a timely manner.
  • Develop monitoring and alerting systems to ensure platform reliability.
  • Participate in code reviews and design discussions.
  • Evaluate new technologies to enhance the data platform.

AWS, Python, SQL, Apache Airflow, Apache Hadoop, Kafka, Kubernetes, Data engineering, Scala, Data modeling

Posted 3 days ago
Apply

πŸ“ Canada

🧭 Full-Time

πŸ’Έ 166,000 - 197,000 CAD per year

πŸ” Technology

🏒 Company: Syndio πŸ‘₯ 101-250 πŸ’° $50,000,000 Series C over 3 years ago. Human Resources, Analytics, Software

  • Compassionate mentor with the ability to raise the performance of teammates.
  • Proven collaboration skills across product and engineering teams.
  • Excellent communication skills for interfacing with technical and non-technical teams.
  • Strong problem-solving and critical thinking abilities.
  • Experience in enterprise technology applications architecture, design, development, and maintenance.
  • Focus on performance, reliability, and security.

  • Manage a team of highly skilled backend developers.
  • Provide technical leadership to your team and make responsible technical decisions.
  • Share expertise in Google Cloud Platform Data toolsets.
  • Contribute to data-related best practices and process improvements.
  • Collaborate with various departments to ensure project requirements are met.
  • Create a culture of diversity and belonging while leading hiring processes.
  • Document, plan, and drive execution of team decisions.

Backend Development, Leadership, Python, Software Development, GCP, Kubernetes, Data engineering

Posted 4 days ago
Apply

πŸ“ France, Europe

🧭 Full-Time

πŸ” Fintech

  • 4-5 years of experience in Data Engineering, ideally in scale-up environments.
  • At least 2 years of experience managing Data Engineers.
  • Proficient in Python and cloud technologies, ideally AWS.
  • Desire for a hands-on management role (around 50% management).
  • Ability to collaborate with cross-functional stakeholders.
  • Strong problem-solving and communication skills.
  • Fluency in English; French is a plus.

  • As a Data Engineering Manager, lead a team of data engineers to design and implement data solutions.
  • Collaborate with data scientists and analysts to develop tailored analytics and machine learning solutions.
  • Work with Product Managers to ensure high-impact data engineering work.
  • Grow the team and establish the right culture and processes.
  • Ensure data quality and integrity in collaboration with software engineers.

AWS, Leadership, Project Management, Python, ETL, Data engineering

Posted 4 days ago
Apply
Showing 10 of 351 jobs.