ETL Jobs

Find remote positions requiring ETL skills. Browse opportunities where you can apply your expertise and grow your career.

500 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

πŸ“ US

🧭 Full-Time

πŸ’Έ 206,700 - 289,400 USD per year

πŸ” Social media / Online community

  • MS or PhD in a quantitative discipline: engineering, statistics, operations research, computer science, informatics, applied mathematics, economics, etc.
  • 7+ years of experience with large-scale ETL systems, building clean, maintainable, object-oriented code (Python preferred).
  • Strong programming proficiency in Python, SQL, Spark, Scala.
  • Experience with data modeling, ETL concepts, and manipulating large structured and unstructured data.
  • Experience with data workflows (e.g., Airflow) and data visualization tools (e.g., Looker, Tableau).
  • Deep understanding of technical and functional designs for relational and MPP databases.
  • Proven track record of collaboration and excellent communication skills.
  • Experience in mentoring junior data scientists and analytics engineers.

  • Act as the analytics engineering lead within the Ads DS team and contribute to data science data-quality and automation initiatives.
  • Ensure high-quality data through ETLs, reporting dashboards, and data aggregations for business tracking and ML model development.
  • Develop and maintain robust data pipelines and workflows for data ingestion, processing, and transformation (see the sketch after this list).
  • Create user-friendly tools for internal use across Data Science and cross-functional teams.
  • Lead efforts to build a data-driven culture by enabling data self-service.
  • Provide mentorship and coaching to data analysts and act as a thought partner for data teams.
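
As a rough illustration of the Airflow-based pipelines this role calls for, here is a minimal DAG sketch; the DAG id, task names, and placeholder logic are hypothetical, not from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_ad_events(**context):
    """Pull raw ad events for the execution date (placeholder logic)."""
    ...

def transform_and_aggregate(**context):
    """Clean events and build the aggregates the dashboards read."""
    ...

def run_quality_checks(**context):
    """Fail the run if row counts or null rates look wrong."""
    ...

# `schedule` is the Airflow 2.4+ spelling; older versions use schedule_interval.
with DAG(
    dag_id="ads_daily_aggregates",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_ad_events)
    transform = PythonOperator(task_id="transform", python_callable=transform_and_aggregate)
    check = PythonOperator(task_id="quality_checks", python_callable=run_quality_checks)

    extract >> transform >> check
```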

Leadership, Python, SQL, Data Analysis, ETL, Tableau, Strategy, Airflow, Data Engineering, Data Science, Spark, Communication Skills, Collaboration, Mentoring, Coaching

Posted 2024-11-21
Apply

πŸ“ Mexico, Gibraltar, Colombia, USA, Brazil, Argentina

🧭 Full-Time

πŸ” Cryptocurrency

🏒 Company: Bitso

  • 4+ years of professional experience working with analytics, ETLs, and data systems as an individual contributor.
  • 3+ years of experience in engineering management at tech companies.
  • Expertise in defining and implementing data architectures, including ETL/ELT pipelines, data lakes, data warehouses, and real-time data processing systems.
  • Expertise with cloud platforms (AWS preferred), data engineering tools (Databricks, Spark, Kafka), and SQL/NoSQL databases.
  • Expertise translating business requirements into technical solutions and data architecture.
  • Expertise with orchestration tools (e.g., AWS Step Functions, Databricks Workflows, or Dagster).
  • Proven experience building data migration services or implementing change data capture (CDC) processes (see the sketch after this list).
  • Experience with CI/CD tools (GitHub Actions).
  • Experience with CDP platforms and handling behavioral data (e.g., Segment, Amplitude, Avo).
  • Experience with infrastructure-as-code technologies (e.g., Terraform) and serverless approaches for data engineering tasks.
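
As a loose illustration of the CDC-style ingestion this posting references, here is a minimal PySpark Structured Streaming sketch that lands Kafka change events in a data lake; the broker, topic, schema, and paths are all hypothetical, and the Kafka source assumes the spark-sql-kafka connector is on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("cdc_ingest").getOrCreate()

# Schema for the change events; field names are illustrative.
schema = StructType([
    StructField("op", StringType()),        # insert / update / delete
    StructField("table", StringType()),
    StructField("payload", StringType()),
    StructField("emitted_at", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "cdc.events")                 # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Land the raw change stream in the lake; downstream jobs merge it into tables.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://data-lake/raw/cdc_events/")         # placeholder path
    .option("checkpointLocation", "s3://data-lake/_chk/cdc")  # placeholder path
    .start()
)
```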

  • Lead the Data Engineering team and the Data Governance lead in their daily tasks, providing technical expertise and mentoring.
  • Prioritize workload, set clear goals, and drive accountability to ensure the team delivers exceptional data products in a timely manner.
  • Mentor and coach the entire Data Engineering division, fostering professional development and a culture of innovation.
  • Partner with Data Science divisions to drive data products that solve business problems.
  • Engage with stakeholders to define roadmaps according to Bitso’s priorities.
  • Recruit and retain top talent.
  • Define and drive Bitso’s data strategy in partnership with the SVP of Data Science.

AWS, Leadership, SQL, Business Intelligence, ETL, Kafka, Strategy, Data Engineering, Data Science, Serverless, NoSQL, Spark, Collaboration, CI/CD, Mentoring, DevOps, Terraform

Posted 2024-11-21
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 100,000 - 130,000 USD per year

πŸ” Data analytics, subscription economy

🏒 Company: Antenna

  • Expert in SQL, Excel, and business intelligence tools like Looker or Redash, with 3+ years of experience.
  • Strong problem-solving skills and attention to detail.
  • 4+ years of experience in data-driven products, with strategic thinking ability.
  • Passionate about using critical thinking and data analysis to solve customer challenges.
  • Skilled in organizing, analyzing, and vetting large datasets without losing accuracy.
  • Experience in navigating complex ETL systems and collaborating with technical teams.
  • Excellent written and verbal communication skills to translate nuanced requests.
  • Self-starter able to prioritize initiatives and meet deadlines in a fast-paced environment.
  • Bonus: Experience with process automation using Python or R for data manipulation.

  • Partner with our commercial team to understand how stakeholders use data for business decisions and create high-quality, frictionless data solutions.
  • Ensure timely product updates in alignment with business teams.
  • Guarantee reliable and scalable data fulfillment for solutions while optimizing speed and accuracy.
  • Analyze large datasets to identify data quality issues and collaborate with stakeholders and clients.
  • Develop QA efficiencies to ensure accurate data delivery (see the sketch after this list).
  • Create documentation processes to enhance stakeholder understanding.
  • Support the launch of new data solutions with systems to ensure reliability and accessibility.
  • Lead, manage, or mentor Data Analysts in Data Operations activities.
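
A small sketch of the kind of automated QA pass mentioned above, using pandas; the column names, file, and checks are hypothetical:

```python
import pandas as pd

def qa_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality failures (empty list = pass)."""
    failures = []
    if df["subscriber_id"].isna().any():  # hypothetical column
        failures.append("null subscriber_id values")
    if df.duplicated(subset=["subscriber_id", "period"]).any():
        failures.append("duplicate subscriber/period rows")
    if (df["mrr_usd"] < 0).any():  # hypothetical column
        failures.append("negative revenue values")
    return failures

df = pd.read_csv("deliverable.csv")  # placeholder file
problems = qa_checks(df)
if problems:
    raise ValueError("QA failed: " + "; ".join(problems))
```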

SQL, Business Intelligence, Data Analysis, ETL, QA, Communication Skills, Attention to Detail, Documentation

Posted 2024-11-21
Apply

πŸ“ United States

πŸ” Data and technology

  • 5+ years experience in security engineering or site reliability engineering.
  • Excellent Terraform skills required.
  • Experience working with and developing CI/CD pipelines for Infrastructure as Code required.
  • Knowledge of programming/scripting fundamentals (python/golang) required.
  • Expertise in performing ETL onboarding for diverse log feed technologies required.
  • Experience supporting Splunk platform administration, new content dashboards, applications, and use cases.
  • Hands-on experience developing REST APIs to capture data from external sources.
  • Experience with Agile methodologies.
  • Understanding of multiple log formats and source data for SIEM Analysis.
  • Solid background with Windows and Linux platforms (security or system administration).
  • Experience with technical concepts including networking and a range of cyber attacks.

  • Understand data feeds of multiple security tools and logs that feed the SIEM & UEBA technologies.
  • Identify capabilities and quality of these feeds and recommend improvements.
  • Create new content use cases based on threat intelligence, analyst feedback, available log data, and previous incidents.
  • Perform daily activities of the content life cycle including creating, testing, tuning, and maintaining associated documentation.
  • Remediate vulnerabilities across different application environments.
  • Work with other security teams and product SMEs to identify capability gaps.
  • Develop parsers and field extractions to support content development (see the sketch after this list).
  • Develop custom scripts to enhance default SIEM functionality.
  • Participate in root cause analysis on security incidents and provide recommendations for new data sources and enrichment.
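
A minimal sketch of the parser and field-extraction work referenced above: pulling structured fields out of a raw log line with a regular expression. The log format and field names are invented for illustration; real feeds vary by product:

```python
import re

# Hypothetical syslog-style firewall line.
LINE = "2024-11-21T10:42:07Z fw01 DENY src=10.0.0.5 dst=203.0.113.9 dport=445"

PATTERN = re.compile(
    r"(?P<ts>\S+)\s+(?P<host>\S+)\s+(?P<action>ALLOW|DENY)\s+"
    r"src=(?P<src>\S+)\s+dst=(?P<dst>\S+)\s+dport=(?P<dport>\d+)"
)

def extract_fields(line: str) -> dict | None:
    """Parse one raw log line into the fields the SIEM content expects."""
    m = PATTERN.match(line)
    return m.groupdict() if m else None

print(extract_fields(LINE))
# {'ts': '2024-11-21T10:42:07Z', 'host': 'fw01', 'action': 'DENY', ...}
```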

Python, Agile, ETL, Golang, REST API, CI/CD, Linux, Terraform, Documentation

Posted 2024-11-21
Apply

πŸ“ US

πŸ” Consumer insights

  • Strong PL/SQL and SQL development skills.
  • Proficient in multiple data engineering languages such as Python and Java.
  • Minimum 3-5 years of experience in Data engineering with Oracle and MS SQL.
  • Experience with data warehousing concepts and cloud-based technologies like Snowflake.
  • Experience with cloud platforms such as Azure.
  • Knowledge of data orchestration tools like Azure Data Factory and Databricks workflows.
  • Understanding of data privacy regulations and best practices.
  • Experience working with remote teams.
  • Familiarity with tools like Git and Jira.
  • Bachelor's degree in Computer Science or Computer Engineering.

  • Design, implement and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA (see the sketch after this list).
  • Identify and resolve performance bottlenecks in pipelines to ensure efficient data delivery.
  • Implement data quality checks and validation processes.
  • Work with Data Architect to implement best practices for data governance, quality, and security.
  • Collaborate with cross-functional teams to address data needs.
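
A small sketch of unit-testing a transformation before release to QA, as the list above mentions; transform_orders and its columns are hypothetical, and the test is written to run under pytest:

```python
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Normalize column names, drop rows without an order id, coerce amounts."""
    out = raw.rename(columns=str.lower).dropna(subset=["order_id"])
    return out.assign(amount=out["amount"].astype(float))

def test_transform_orders_drops_missing_ids():
    raw = pd.DataFrame({"ORDER_ID": [1, None], "AMOUNT": ["9.99", "1.00"]})
    out = transform_orders(raw)
    assert len(out) == 1
    assert out["amount"].iloc[0] == 9.99
```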

Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data Engineering, CI/CD

Posted 2024-11-21
Apply

πŸ“ Colombia

🧭 Full-Time

πŸ” Fintech

  • Experience working on an agile team.
  • Strong communication and collaboration skills.
  • Minimum 4 years creating distributed, data-intensive, and highly scalable applications with Java 8/11, Spring Boot, and AWS.
  • Experience with REST APIs/microservices and concepts like domain-driven design.
  • Proven experience with AWS services like SQS, SNS, DynamoDB, and Lambda (see the sketch after this list).
  • Strong SQL knowledge and experience with data transformation processes.
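
The posting is Java/Spring Boot-centric, but as a language-neutral sketch of the SQS consume-and-delete pattern it references, here is a short boto3 loop; the queue URL, region, and handler are placeholders:

```python
import boto3

# Placeholder queue URL and region; handle() stands in for business logic.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/orders"
sqs = boto3.client("sqs", region_name="us-east-1")

def handle(body: str) -> None:
    """Hypothetical message processing."""
    print("processing:", body)

# Long-poll the queue, process each message, then delete it on success.
while True:
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for msg in resp.get("Messages", []):
        handle(msg["Body"])
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```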

  • Leading and delivering development tasks through all SDLC phases.
  • Completing research and proof of concepts that can be converted into MVPs.
  • Collaborating with other teams at Caseware, including CloudOps, DevOps, and Product.
  • Participating in code reviews and understanding existing architecture.
  • Taking ownership of modules in the DA architecture and providing technical guidance.

AWS, SQL, Agile, DynamoDB, ETL, Hadoop, Java, JavaScript, Angular, REST API, Spark, Collaboration, Mentoring, Microservices

Posted 2024-11-21
Apply

πŸ“ US

πŸ’Έ 100,000 - 120,000 USD per year

πŸ” Healthcare software solutions

  • Bachelor’s degree in Computer Science, Information Technology, Healthcare Informatics, or equivalent.
  • Minimum of 3-5 years of experience in healthcare implementation support.
  • Strong focus on expressions, flow building, SQL, and data manipulation.
  • Experience with ETL and scripting or complex expressions using XPath, XQuery, JSONPath, or jq (see the sketch after this list).
  • Ability to work with structured data formats, such as JSON, XML, or X12 EDI standards.
  • Experience with database query languages (e.g., SQL, NoSQL, etc.).
  • Knowledge of data manipulation techniques and tools.
  • Proven troubleshooting skills in microservice/cloud architecture.
  • Familiarity with healthcare data standards like HL7 or ICD-10.
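
A brief sketch of the expression-based data extraction this role describes, using only Python's standard library; the XML and JSON payloads are invented, and real messages would follow HL7 or X12 structures:

```python
import json
import xml.etree.ElementTree as ET

# Invented payloads for illustration.
xml_doc = "<patient><id>123</id><coverage plan='PPO'>active</coverage></patient>"
json_doc = '{"claim": {"lines": [{"code": "99213", "amount": 125.0}]}}'

# ElementTree supports a small XPath subset, enough for simple extractions.
root = ET.fromstring(xml_doc)
plan = root.find("./coverage").get("plan")  # -> 'PPO'

# Plain dict access stands in for JSONPath here; libraries such as
# jsonpath-ng provide full JSONPath expressions.
claim = json.loads(json_doc)
first_code = claim["claim"]["lines"][0]["code"]  # -> '99213'

print(plan, first_code)
```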

  • Create and manage expressions to manipulate data and automate workflows.
  • Design workflows to streamline healthcare processes.
  • Write and optimize database queries for data analysis.
  • Ensure data accuracy and integrity between systems.
  • Provide technical support and troubleshoot issues.
  • Assist in the implementation of healthcare software solutions.
  • Maintain documentation for expressions, workflows, and queries.
  • Collaborate with IT professionals for integration.
  • Identify process improvement opportunities.
  • Ensure workflows comply with healthcare regulations.

SQL, ETL, NoSQL, Collaboration, Documentation, Compliance

Posted 2024-11-20
Apply
πŸ”₯ Data Engineer

πŸ“ Argentina, Spain, England, United Kingdom, Lisbon, Portugal

🧭 Full-Time

πŸ” Web3

🏒 Company: Reown

  • 5+ years working in the analytics stack within a fast-paced environment.
  • 3+ years of production experience with SQL templating engines like dbt.
  • Experience with distributed query engines (BigQuery, Athena, Spark), data warehouses, and BI tools.
  • Strong understanding of software engineering principles, coding standards, design patterns, version control (e.g., Git), testing methodologies, and CI/CD processes.
  • Experience with AWS/GCP/Azure services for deployment and management.
  • Familiarity with GitHub, CI/CD pipelines, GitHub Actions, and Terraform.
  • Ability to write Python scripts for ETL processes and data manipulation.
  • Proficient with libraries like pandas for analysis and transformation.
  • Experience handling various data formats (e.g., CSV, JSON, Parquet).
  • Strong problem-solving skills and communication abilities to discuss technical concepts.

  • Write complex SQL queries that extract and combine data from on-chain and off-chain logs for analytics.
  • Create dashboards and tools for team data discoverability and KR tracking.
  • Perform deep-dive analyses into specific topics for internal stakeholders.
  • Help design, implement, and evolve Reown's on-chain data infrastructure.
  • Build, maintain, and monitor end-to-end data pipelines for new datasets and features.
  • Write health-checks and alerts to ensure data correctness, consistency, and freshness (see the sketch after this list).
  • Meet with product managers and stakeholders to understand data needs and detect new product opportunities.
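
A minimal sketch of the freshness health-check described above; the table path, column name, two-hour threshold, and naive-UTC timestamp assumption are all hypothetical:

```python
from datetime import datetime, timedelta, timezone

import pandas as pd

# Placeholder path (reading s3:// paths requires s3fs) and column name;
# timestamps are assumed to be stored as naive UTC.
df = pd.read_parquet("s3://warehouse/onchain_events/latest.parquet")

latest = pd.to_datetime(df["block_time"]).max().tz_localize("UTC")
age = datetime.now(timezone.utc) - latest

# Alert if the newest row is older than the assumed 2-hour freshness budget.
if age > timedelta(hours=2):
    raise RuntimeError(f"onchain_events is stale: newest row is {age} old")
print(f"onchain_events is fresh: newest row is {age} old")
```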

AWS, Python, SQL, Data Analysis, Design Patterns, ETL, GCP, Git, Tableau, Azure, ClickHouse, Pandas, Spark, Communication Skills, CI/CD, Terraform, Written Communication

Posted 2024-11-20
Apply

πŸ“ United States of America

πŸ’Έ 80,000 - 135,000 USD per year

🏒 Company: VSP Vision Careers

  • Bachelor’s degree in computer science, data science, statistics, economics, or related functional area; or equivalent experience.
  • Excellent written and verbal communication skills.
  • 6 years of experience working with end users in the development of analytical capabilities.
  • 6 years of hands-on experience in data modeling, SQL-based database management systems, ETL/data pipeline design, and data visualization.
  • Expert-level SQL coding experience.

  • Work with business stakeholders to design data and analytics capabilities supporting business strategies.
  • Develop data models and structures for data-driven solutions.
  • Collaborate in an agile, multi-disciplinary team to deliver data solutions.
  • Research, promote, and develop data architecture best practices.

Python, SQL, Agile, Business Intelligence, ETL, Scrum, Snowflake, Data Science, Data Structures, Communication Skills

Posted 2024-11-20
Apply

πŸ“ United States of America

🧭 Full-Time

πŸ” Pest Control and related services

🏒 Company: Terminix

  • 3+ years of analytics experience required.
  • 1+ years of digital marketing experience preferred.
  • Knowledge of digital tools such as GA4, Google Ads, Google Search Console, Moz, BrightLocal, Power BI, Looker Studio.
  • Demonstrated experience in BI data architecture, ETL, APIs, and data warehousing concepts.
  • Strong verbal and written communication skills.
  • Bachelor's degree in Marketing Analytics, a quantitative discipline, or similar experience.
  • Experience using SQL/Python and/or R is preferred.

  • Monitor and audit branch local listings to improve online visibility.
  • Analyze branch reputation and generate actionable reports.
  • Stay updated with digital marketing trends and search engine algorithms.
  • Drive customer engagement through eCommerce strategies.
  • Streamline reporting metrics to identify customer behavior.
  • Interpret marketing reports to create actionable plans.

Python, SQL, ETL, Communication Skills, Collaboration, Written Communication

Posted 2024-11-20
Apply
Showing 10 of 500 jobs.