Remote JavaScript Jobs

Data engineering
498 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

📍 Canada

💸 98,400 - 137,800 CAD per year

🔍 Software Development

  • A degree in Computer Science or Engineering and 5-8 years of experience developing and maintaining software (or an equivalent combination of education and work experience), plus a track record of substantial contributions to software projects with high business impact
  • Experience writing clean code that performs well at scale; ideally experienced with languages like Python, Scala, Java, Go, and shell script
  • Passionate interest in data engineering and infrastructure; ingestion, storage and compute in relational, NoSQL, and serverless architectures
  • Experience with various types of data stores, query engines and frameworks, e.g. PostgreSQL, MySQL, S3, Redshift/Spectrum, Presto/Athena, Spark
  • Experience working with message queues such as Kafka and Kinesis
  • Experience developing data pipelines and integrations for high volume, velocity and variety of data
  • Experience with data warehousing and data modeling best practices
  • Work within a cross-functional team (including analysts, product managers, and other developers) to deliver data products and services to our internal stakeholders
  • Conduct directed research and technical analysis of new candidate technologies that fill a development team’s business or technical need
  • Provide technical advice, act as a role model for your teammates, flawlessly execute complicated plans, and navigate many levels of the organization
  • Contribute enhancements to development, build, deployment, and monitoring processes with an emphasis on security, reliability and performance
  • Implement our technical roadmap as we scale our services and build new data products
  • Participate in code reviews, attend regular team meetings, and apply software development best practices
  • Take ownership of your work, and work autonomously when necessary
  • Recognize opportunities to improve efficiency in our data systems and processes, increase data quality, and enable consistent and reliable results
  • Participate in the design and implementation of our next generation data platform to empower Hootsuite with data
  • Participate in the development of the technical hiring process and interview scripts with an aim of attracting and hiring the best developers

AWS, PostgreSQL, Python, Software Development, SQL, Agile, Apache Airflow, Cloud Computing, Data Analysis, Data Mining, ETL, Java, Kafka, MySQL, Software Architecture, Algorithms, API testing, Data engineering, Data Structures, Go, Serverless, Spark, CI/CD, RESTful APIs, Microservices, Scala, Data visualization, Data modeling, Data management

Posted about 1 hour ago
Apply

📍 United States

🧭 Full-Time

🔍 Software Development

🏢 Company: Anrok (👥 51-100, 💰 $30,000,000 Series B, 11 months ago): SaaS, Software, Tax Preparation

  • Product-minded software engineer with 5+ years of experience
  • Experience talking to users and synthesizing needs
  • Deep understanding of data platforms and building for scale
  • Design, build, and scale systems that let customers reliably sync their Anrok financial data to their data warehouses (e.g. Snowflake, BigQuery), allowing them to reconcile invoicing, payment, and bank account data.
  • Interface directly with customers to ensure the formats and mechanisms are tailored to the domain, but still flexible enough to accommodate different accounting practices.
  • Increase the flexibility and scale of Anrok's internal data infrastructure, allowing us to more effectively leverage data in our decision making.

AWS, Backend Development, PostgreSQL, Python, SQL, Cloud Computing, Data Analysis, ETL, Git, Snowflake, Algorithms, Data engineering, Data Structures, REST API, CI/CD, Microservices, Data visualization, Data modeling, NodeJS, Software Engineering, Data analytics, Data management, Debugging, Customer Success

Posted about 2 hours ago
Apply
🔥 Lead Partner Sales Engineer
Posted about 3 hours ago

📍 United States

🧭 Full-Time

💸 166,372.19 - 207,965.24 USD per year

🔍 Software Development

  • Deep understanding of GSI and Reseller pre-sales motions, including how solution architects, pre-sales technical specialists and technical teams participate in discovery workshops, opportunity qualification, proof-of-concepts, and technical validation sessions. This includes understanding how technology choices impact project economics and delivery timelines.
  • Experience with GSI and Reseller sales methodologies, particularly how they identify and qualify opportunities through both client-direct and vendor-partner channels and how they manage complex, multi-stakeholder, multi-vendor, multi-tech option sales cycles.
  • Knowledge of how GSIs and Resellers approach technical enablement and skills development, including their internal training programs and certification paths, and how they maintain technical excellence across delivery teams while managing utilization targets.
  • Cloud data platforms and data lakes: Snowflake, Databricks, Redshift, BigQuery, Synapse, S3, ADLS, OneLake, Hadoop - Hands-on experience with modern cloud data platforms and data lake architectures
  • Data workloads including data warehousing, data lakes, AI, ML, Gen AI, data applications, data engineering
  • Open table formats: Iceberg, Delta Lake, Hudi
  • Data movement technologies: Fivetran, HVR, Matillion, Airbyte, Informatica, Talend, Datastage, ADF, AWS Glue. Hands-on experience with traditional (or more modern) ETL/ELT solutions
  • Databases: Oracle, SQL Server, PostgreSQL, MySQL, MongoDB, CosmosDB, SAP HANA.
  • Applications: Salesforce, Google Analytics, ServiceNow, Hubspot, etc.
  • ERP solutions: SAP, Oracle, Workday, etc.
  • Data transformation solutions: dbt Core, dbt Cloud, Coalesce, Informatica
  • Gen AI: approaches, concepts, and the Gen AI technology ecosystem for hyperscalers, cloud data platforms, and solution-specific ISVs
  • REST APIs: Experience programmatically interacting with REST APIs
  • Data Infrastructure, Data Security, Data Governance and Application Development: strong understanding of technical architectures and approaches
  • Programming languages: Python, SQL
  • Create, develop, build, and take ownership of ongoing relationships with GSI and Reseller pre-sales technical team members, lead architects, and other technical recommenders to create multiple Fivetran technical evangelists and champions within each GSI and Reseller partner.
  • Interact daily with Fivetran’s top GSI and Reseller partners while tracking, measuring, and driving three primary KPIs: partners enabled, GTM activities supported, and technical content created.
  • Represent Fivetran as the partner technical liaison plus product, data and solution expert. Collaborate with the strategic GSI and Reseller Partner Sales Managers to enable new and existing GSI and Reseller partners to understand Fivetran's value, differentiation, messaging, and technical/business value.
  • Assist Fivetran Partner Sales Managers in communicating and demonstrating Fivetran’s value prop to existing and prospective partners to get them excited about partnering with Fivetran
  • Get hands-on with all aspects of Fivetran and the broader data ecosystem technologies listed above in building out live demos, hands-on labs, solution prototypes, and Fivetran/GSI/Reseller joint solutions.
  • Communicate customer requirements and competitor solutions. Provide a strong technical response to partner requests. Engage the customer SE team and communicate client requirements for individual client opportunities.

PostgreSQL, Python, SQL, Business Intelligence, Cloud Computing, ETL, Hadoop, MongoDB, MySQL, Oracle, Salesforce, SAP, Snowflake, Google Analytics, Data engineering, Analytical Skills, Mentoring, Presentation skills, Excellent communication skills, Relationship building, Strong communication skills

Posted about 3 hours ago
Apply

📍 United States

💸 92,000 - 180,000 USD per year

🔍 Software Development

  • Strong experience leading/managing a team of software engineers to develop applications
  • Ability to work closely with other leaders to scale our technology and the Engineering organization
  • Ability to work closely with key internal stakeholders to develop and manage the engineering roadmap
  • Ability to drive technical projects and planning
  • Strong communication skills; able to disseminate information across the organization and have strategic influence
  • A minimum of 7 years’ experience developing production-grade systems
  • A minimum of 5 years’ experience managing developers and DevOps teams
  • Define and enforce IT policies related to cloud security, software development and DevOps best practices to ensure scalable and secure systems
  • Provide strategic support and challenge decisions through thought-provoking questions.
  • Understand data governance; create and maintain policies and procedures while implementing data management systems and ensuring compliance with regulations.
  • Simplify complex problems to make them manageable for the organization
  • Drive organizational change effectively
  • Foster strong cross-functional communication and collaboration
  • Lead software engineering teams in cloud-native environments, particularly Microsoft Azure
  • Drive the adoption of cloud best practices, microservices, and scalable architecture
  • Establish coding standards, best practices, and governance policies to ensure maintainable, scalable, and secure software solutions.
  • Cloud Optimization – Ensure efficient use of Azure services while optimizing performance, scalability, and cost.
  • Effectively manage remote engineers
  • Provide technical and organizational leadership across the engineering team
  • Coach/mentor team members to encourage professional and career growth

Leadership, Software Development, SQL, Cloud Computing, ETL, Machine Learning, Microsoft Azure, People Management, Cross-functional Team Leadership, Algorithms, Azure, Data engineering, Data science, Communication Skills, Analytical Skills, CI/CD, DevOps, Microservices, Data visualization, Stakeholder management, Data modeling, Data management

Posted about 4 hours ago
Apply

📍 United Kingdom

🔍 Software Development

  • Experience developing and productionizing machine learning models, including their supported data pipeline.
  • Experience with machine learning using packages such as TensorFlow, PyTorch, Spark MLlib, XGBoost, Sklearn, etc.
  • Strong coding skills in Python or an equivalent language (Java, C++).
  • Solid understanding of engineering and infrastructure best practices.
  • Conduct end-to-end analyses: from wrangling data via SQL or Python, to statistical modeling, to hypothesizing and presenting business ideas.
  • Work with large and complex datasets.
  • Support the development and deployment of projects involving machine learned models for offline, batch-based data products as well as models deployed to online, real-time services.
  • Work in the content and contributor intelligence team on text and visual understanding, along with fine tuning transformer models to derive embeddings for multiple input types
  • Productionize and automate model pipelines within Python services.
  • Drive and advocate adoption of best practices in MLOps (Machine Learning Operations).

Python, SQL, Apache Airflow, Data Analysis, Image Processing, Kubeflow, Machine Learning, NLTK, Numpy, PyTorch, C++, Algorithms, Data engineering, Data Structures, REST API, Pandas, Spark, Tensorflow, Data modeling

Posted about 4 hours ago
Apply

📍 Canada

🧭 Full-Time

💸 135,000 - 150,000 CAD per year

🔍 Software Development

🏢 Company: Datatonic

  • 4+ years of experience in the DevOps field or as a Platform Engineer
  • Proficiency in Python, Java, or Go programming languages
  • In-depth knowledge of CI/CD tools and processes
  • Experience with Google Cloud technologies such as Cloud Run, Cloud Functions, Cloud Scheduler, Workflow, and Cloud Composer for automating deployments and managing infrastructure.
  • Proficiency with technologies such as Terraform and Kubernetes
  • Design, implement, and automate scalable and resilient CI/CD & DevOps pipelines on Google Cloud Platform, tailored to client-specific AI/ML and data engineering needs.
  • Develop and maintain IaC pipelines using tools like Terraform to automate infrastructure provisioning and deployment on GCP, improving efficiency and consistency.
  • Define and embed best practices into our internal processes to enhance the quality and consistency of our work
  • Introduce innovative ideas and approaches to enhance Datatonic’s capabilities and methodologies
  • Collaborate with sales teams to provide technical expertise during client engagements and help scope solutions for requirements
  • Contribute to Datatonic’s internal knowledge base, including technical collateral and thought leadership materials
  • Work in a dynamic, agile environment alongside data scientists, machine learning experts, data analysts, architects and data engineers
  • Collaborate closely with partners such as Google to leverage their technologies effectively
  • Guide team members, fostering a culture of growth and innovation
  • Provide expert advice to customers on DevOps best practices
  • Establish and enhance DevOps functions within clients’ organisations
  • Engage with customers and project teams throughout the project development lifecycle to ensure seamless delivery

Leadership, Python, Agile, GCP, Kubernetes, Machine Learning, Data engineering, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Customer service, RESTful APIs, Mentoring, Linux, DevOps, Terraform, Excellent communication skills, Teamwork, Scripting

Posted about 5 hours ago
Apply

📍 United States of America

🧭 Full-Time

🔍 Biotechnology, Pharmaceutical, Healthcare

🏢 Company: clarivate_careers

  • PhD (or equivalent work experience) in Bioinformatics, Computational or Systems Biology, Statistics, Machine Learning, Computer Science (the latter two as applied to solving problems in biology or medicine), or a related field.
  • A minimum of five years of relevant work experience in bioinformatics or computational biology in a professional setting, preferably in a biotechnology, pharmaceutical, or healthcare setting.
  • Experience working in a customer-oriented consulting environment: an ability to earn customer trust through effective communication, efficiency, integrity, and deep technical expertise.
  • Proficiency in statistical programming languages (R), Unix/Linux shell with a strong understanding of statistics and biological data analysis.
  • Conduct data analysis and synthesis to support project objectives using established methods and advanced analytical tools to generate actionable insights.
  • Assist in designing and executing research methodologies to address business challenges and provide recommendations.
  • Collaborate with consulting teams in problem-solving and the development of innovative solutions tailored to customer needs and strategic goals.
  • Contribute to the development of client presentations and reports by organizing data, visualizing findings, and ensuring accuracy, relevance and clarity in all communications.
  • Engage in customer interactions and meetings, with oversight, to understand customer needs, gather feedback, and ensure alignment with project objectives.
  • Support proposal development by contributing to specific elements of the proposal.
  • Collaborate with senior consultants on project execution, coordinating tasks, managing timelines, and ensuring the successful delivery of high-quality consulting services.

AWS, Docker, Python, SQL, Apache Airflow, Apache Hadoop, Cloud Computing, Data Analysis, Data Mining, ETL, Flask, Image Processing, Kafka, Kubernetes, Machine Learning, Numpy, PyTorch, Algorithms, Data engineering, Data science, Data Structures, RDBMS, NoSQL, Pandas, Tensorflow, CI/CD, RESTful APIs, Terraform, JSON, Data visualization, Ansible, Data modeling, Data analytics, Data management

Posted about 6 hours ago
Apply

📍 United States

💸 64,000 - 120,000 USD per year

  • Strong PL/SQL, SQL development skills
  • Proficient in multiple languages used in data engineering such as Python, Java
  • Minimum 3-5 years of experience in Data engineering working with Oracle and MS SQL
  • Experience with data warehousing concepts and technologies including cloud-based services (e.g. Snowflake)
  • Experience with cloud platforms like Azure and knowledge of infrastructure
  • Experience with data orchestration tools (e.g. Azure Data Factory, DataBricks workflows)
  • Understanding of data privacy regulations and best practices
  • Experience working with remote teams
  • Experience working on a team with a CI/CD process
  • Familiarity using tools like Git, Jira
  • Bachelor's degree in Computer Science or Computer Engineering
  • Design, implement and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA.
  • Identify and resolve performance bottlenecks in pipelines due to data, queries and processing workflows to ensure efficient and timely data delivery.
  • Implement data quality checks and validation processes to ensure accuracy, completeness, and consistency of data delivery.
  • Work with Data Architect and implement best practices for data governance, quality and security.
  • Collaborate with cross-functional teams to identify and address data needs.
  • Ensure technology solutions support the needs of the customer and/or organization.
  • Define and document technical requirements.

Python, SQL, ETL, Git, Java, Oracle, Snowflake, Azure, Data engineering, CI/CD, RESTful APIs

Posted about 16 hours ago
Apply
🔥 Senior Growth Engineer
Posted about 17 hours ago

📍 United States

🧭 Full-Time

💸 140,000 - 160,000 USD per year

🔍 Email Security

🏢 Company: Valimail

  • 4+ years of experience with Python, SQL, or similar languages, with hands-on experience in automation and data flow management in platforms like Snowflake, Segment, Zapier, and Planhat.
  • 2+ years in an operations role supporting Sales, Marketing, Customer Success, or Finance, with a demonstrated ability to align data operations with strategic business goals.
  • Required proficiency in Snowflake, Segment (or other CDP solutions), and BI tools (e.g., Sigma, PowerBI).
  • Familiarity with Zapier, Planhat (or similar tools such as Gainsight or ChurnZero), Salesforce, and Atlassian is a plus.
  • Oversee and optimize data flow across core platforms, including Snowflake, Sigma, Segment, Salesforce, and Planhat, ensuring seamless integration and reliable data access for cross-functional teams.
  • Build strong, collaborative relationships with Marketing, Sales, Finance, Product, and Engineering, facilitating data-driven insights and project alignment across departments.
  • Analyze customer data to derive insights that inform strategic decision-making, delivering actionable reports that support growth objectives.
  • Spearhead automation and optimization projects, implementing solutions that improve data flow and system efficiency.
  • Lead key cross-functional initiatives, ensuring smooth execution and alignment across multiple departments.
  • Provide valuable insights and recommendations to leadership, aligning data-driven findings with broader business objectives.
  • Tackle a wide range of tasks, from technical troubleshooting to strategic planning and cross-departmental collaboration.

Project Management, Python, SQL, Business Intelligence, Data Analysis, ETL, Salesforce, Snowflake, Operations Management, API testing, Data engineering, REST API, Communication Skills, Analytical Skills, CI/CD, Problem Solving, DevOps, Cross-functional collaboration, Data visualization, Strategic thinking, Data modeling, Data management

Posted about 17 hours ago
Apply

📍 Germany

🔍 AI and data analytics consulting

🏢 Company: Unit8 SA

  • MSc level in the field of Computer Science, Machine Learning, Applied Statistics, Mathematics, or equivalent work experience.
  • Proficient software engineer who has experience in applying a blend of software engineering, machine learning, and statistical methods to solve real-world business problems
  • Proficient in one of the following languages: Python, Scala, Java.
  • Experience with cloud technologies is a strong plus.
  • Work with our customers to understand their challenges, design and implement solutions.
  • Closely collaborate with other data scientists, software engineers and business stakeholders.
  • Evaluate, compare and present results to technical and non-technical audience.
  • Contribute to the implementation and engineering of systems at different scales: from small proof-of-concepts to larger end-to-end data systems.
  • Implement best practices in CI/CD

AWS, Docker, Python, SQL, Cloud Computing, Data Analysis, ETL, Java, Kubernetes, Machine Learning, Algorithms, Data engineering, Data science, Spark, CI/CD, RESTful APIs, Scala, Software Engineering

Posted about 20 hours ago
Apply
Showing 10 of 498 jobs

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.