Senior Data Engineer

Posted over 1 year ago

📍 Location: Anywhere in the United States

💸 Salary: $170k to $195k

🗣️ Languages: English

Requirements:
  • Experience designing and implementing scalable solutions.
  • Ability to refactor and simplify existing processes.
  • Experience with relational and non-relational databases.
  • Proficiency in Python.
  • Experience handling and working with large datasets.
  • Experience with ETL pipelines and data warehousing best practices.
  • Values collaboration and feedback.
  • Excellent documentation and verbal communication skills.
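The ETL and data-warehousing skills listed above can be illustrated with a minimal extract-transform-load sketch in Python (standard library only; the CSV fields and the SQLite `orders` table are hypothetical examples, not taken from any listing):

```python
import csv
import io
import sqlite3

# Hypothetical source data standing in for the extract step.
RAW = "user_id,amount\n1,19.99\n2,5.00\n1,3.50\n"

def run_pipeline(raw_csv: str) -> list[tuple[int, float]]:
    # Extract: parse the raw CSV.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: cast types and drop records with a missing amount.
    clean = [(int(r["user_id"]), float(r["amount"])) for r in rows if r["amount"]]
    # Load: write into a "warehouse" table (in-memory SQLite here).
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (user_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", clean)
    # A downstream aggregate, as a warehouse query would produce it.
    return con.execute(
        "SELECT user_id, SUM(amount) FROM orders GROUP BY user_id ORDER BY user_id"
    ).fetchall()
```

Real pipelines would replace the in-memory pieces with an orchestrator and a warehouse, but the extract/transform/load separation is the same.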

Related Jobs

🔥 Senior Data Engineer
Posted about 3 hours ago

📍 United States, United Kingdom, Spain, Estonia

🔍 Identity verification

🏢 Company: Veriff · 👥 501-1000 · 💰 $100,000,000 Series C almost 3 years ago · 🫂 Last layoff over 1 year ago · Artificial Intelligence (AI) · Fraud Detection · Information Technology · Cyber Security · Identity Management

  • Expert-level knowledge of SQL, particularly with Redshift.
  • Strong experience in data modeling with an understanding of dimensional data modeling best practices.
  • Proficiency in data transformation frameworks like dbt.
  • Solid programming skills in languages used in data engineering, such as Python or R.
  • Familiarity with orchestration frameworks like Apache Airflow or Luigi.
  • Experience with data from diverse sources including RDBMS and APIs.

  • Collaborate with business stakeholders to design, document, and implement robust data models.
  • Build and optimize data pipelines to transform raw data into actionable insights.
  • Fine-tune query performance and ensure efficient use of data warehouse infrastructure.
  • Ensure data reliability and quality through rigorous testing and monitoring.
  • Assist in migrating from batch processing to real-time streaming systems.
  • Expand support for various use cases including business intelligence and analytics.
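The "data reliability and quality through rigorous testing and monitoring" responsibility above usually means batch-level validation gates. A minimal sketch (the rules and field names are illustrative, not Veriff's):

```python
# Hypothetical data-quality gate: returns a list of failure messages
# for a batch of rows; an empty list means the batch passes.
def check_batch(rows: list[dict]) -> list[str]:
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Uniqueness check on the primary key.
        if row.get("id") in seen_ids:
            failures.append(f"row {i}: duplicate id {row['id']}")
        seen_ids.add(row.get("id"))
        # Not-null and range check on a numeric field.
        if row.get("amount") is None or row["amount"] < 0:
            failures.append(f"row {i}: invalid amount {row.get('amount')}")
    return failures
```

In production, checks like these typically run as pipeline steps (e.g. dbt tests or Airflow tasks) that block downstream loads and page on failure.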

Python · SQL · Apache Airflow · ETL · Data engineering · JSON · Data modeling

🔥 Senior Data Engineer
Posted about 10 hours ago

📍 TX, MN, FL

💸 130,000 - 195,000 USD per year

🔍 Healthcare

🏢 Company: NeueHealth

  • Bachelor’s degree in Computer Science, Computer Engineering, Information Systems, or equivalent.
  • Around five years of experience in an enterprise data engineering role in an Azure environment.
  • Healthcare IT background preferred.
  • Experience coding in Scala and building batch and streaming data pipelines.
  • Experience with API design.
  • Extensive experience developing data solutions in Azure Cloud.
  • Experience with event sourcing and/or Big Data architectures.

  • Write traditional code and server-less functions mainly in Scala.
  • Build APIs, data microservices, and ETL pipelines for data sharing and analytics.
  • Develop and optimize processes for large language models and AI enhancements.
  • Support Data Ingestion frameworks deployed in Azure.
  • Participate in cultivating a culture of DevOps and Quality Assurance.
  • Act as tech lead and mentor junior engineers.
  • Continuously document code and team processes.

ETL · C# · Azure · Spark · CI/CD · DevOps · Microservices · Scala


📍 Poland

🔍 Financial services

🏢 Company: Capco · 👥 101-250 · Electric Vehicle · Product Design · Mechanical Engineering · Manufacturing

  • Strong cloud provider’s experience on GCP
  • Hands-on experience using Python; Scala and Java are nice to have
  • Experience in data and cloud technologies such as Hadoop, HIVE, Spark, PySpark, DataProc
  • Hands-on experience with schema design using semi-structured and structured data structures
  • Experience using messaging technologies – Kafka, Spark Streaming
  • Strong experience in SQL
  • Understanding of containerisation (Docker, Kubernetes)
  • Experience designing, building, and maintaining CI/CD pipelines
  • Enthusiasm to pick up new technologies as needed

  • Work alongside clients to interpret requirements and define industry-leading solutions
  • Design and develop robust, well-tested data pipelines
  • Demonstrate and help clients adhere to best practices in engineering and SDLC
  • Lead and mentor the team of junior and mid-level engineers
  • Contribute to security designs and have advanced knowledge of key security technologies
  • Support internal Capco capabilities by sharing insight, experience and credentials

Docker · Python · SQL · ETL · GCP · Git · Hadoop · Kafka · Kubernetes · Snowflake · Airflow · Spark · CI/CD

Posted about 10 hours ago
🔥 Senior Data Engineer
Posted about 11 hours ago

🔍 AI-first SaaS

  • 6+ years of experience in data engineering, with a focus on building large-scale data systems.
  • In-depth knowledge of ELT processes and Lakehouse architecture.
  • Expertise in Python and Scala for data pipeline development.
  • Proven experience with Apache Spark and Databricks for big data processing and analytics.
  • Hands-on experience with observability tools like Prometheus, Grafana, and ELK stack/OpenSearch.
  • Experience with real-time streaming systems such as Kafka and Flink.
  • Experience in working with large datasets and high-throughput data environments.
  • Familiarity with cloud platforms like Azure, GCP, or AWS.
  • Strong problem-solving skills and ability to work in fast-paced environments.
  • Excellent communication and collaboration skills.

  • Development and implementation of advanced data models to extract insights from complex datasets.
  • Collaborate with cross-functional teams to devise data-driven solutions for service optimization.
  • Conduct experiments and analyses to enhance service predictions and outcomes.
  • Present findings and actionable recommendations to stakeholders, translating complex data insights into clear, understandable concepts.
  • Mentor junior data scientists and foster a collaborative team environment.

📍 Ireland, United Kingdom

🔍 IT, Digital Transformation

🏢 Company: Tekenable · 👥 51-100 · Information Technology · Enterprise Software · Software

  • Experience with the Azure Intelligent Data Platform, including Data Lakes, Data Factory, Azure Synapse, Azure SQL, and Power BI.
  • Knowledge of Microsoft Fabric.
  • Proficiency in SQL and Python.
  • Understanding of data integration and ETL processes.
  • Ability to work with large datasets and optimize data systems for performance and scalability.
  • Experience working with JSON, CSV, XML, Open API, RESTful API integration and OData v4.0.
  • Strong knowledge of SQL and experience with relational databases.
  • Experience with big data technologies like Hadoop, Spark, or Kafka.
  • Familiarity with cloud platforms such as Azure.
  • Bachelor's degree in Computer Science, Engineering, or a related field.

  • Design, develop, and maintain scalable data pipelines.
  • Collaborate with data analysts to understand their requirements.
  • Implement data integration solutions to meet business needs.
  • Ensure data quality and integrity through testing and validation.
  • Optimize data systems for performance and scalability.

Python · SQL · ETL · Hadoop · Kafka · Azure · Spark · JSON

Posted 1 day ago
🔥 Senior Data Engineer
Posted 2 days ago

📍 Japan

🧭 Full-Time

🔍 Conversational commerce technology

  • 5+ years of experience in data engineering or related roles.
  • Strong analytical skills with the ability to manipulate and interpret complex datasets.
  • Proficiency in SQL and experience with databases such as MySQL and BigQuery.
  • Experience in both Python and Scala is preferred.
  • Knowledge of dbt is a must.
  • Experience with data platforms like Airflow, Dataproc, and Great Expectations.
  • Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams.
  • Demonstrated ability to work independently and manage multiple projects simultaneously.
  • English proficiency is a must. Conversational Japanese is nice to have.

  • Design, develop, and optimize data pipelines to ingest data into our data lakehouse.
  • Implement and manage ETL processes using Google Cloud Composer, Dataproc, and dbt.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
  • Implement data quality checks and monitoring.
  • Monitor and optimize the performance of BigQuery datasets and queries.
  • Document processes, designs, and architectural decisions.
  • Troubleshoot daily reporting pipelines.
  • Conduct code reviews and provide mentorship to junior engineers.

Python · SQL · Data engineering · Scala

🔥 Senior Data Engineer
Posted 3 days ago

📍 United States

🧭 Full-Time

🏢 Company: Avalore, LLC

  • Master’s or PhD in statistics, mathematics, computer science, or related field.
  • 8+ years of experience as a Data Engineer within the Intelligence Community (IC).
  • Outstanding communication skills, influencing abilities, and client focus.
  • Professional proficiency in English is required.
  • Current, active Top Secret security clearance.
  • Applicants must be currently authorized to work in the United States on a full-time basis.

  • Develops and documents data pipelines for ingest, transformation, and preparation of data for AI applications.
  • Designs scalable technologies such as streaming and transformation, joining disparate data sets for predictive analytics.
  • Develops API interfaces for accessibility.
  • Leads technical efforts and guides development teams.

🔍 Digital product consultancy

🏢 Company: Nerdery · 👥 101-250 · Internet · Consulting · Web Development · Innovation Management · Apps · Information Technology · Mobile

  • Bachelor's in Computer Science or equivalent experience required.
  • 6+ years of relevant experience.
  • In-depth knowledge of GCP data services including BigQuery and Dataflow.
  • Proficiency in Python and SQL.
  • Experience with data migration from AWS or Azure to GCP.
  • Understanding of data modeling, ETL processes, and data warehousing.
  • Familiarity with data pipeline orchestration tools.
  • Excellent problem-solving and analytical skills.
  • Ability to communicate with technical and non-technical stakeholders.

  • Produce well-designed data structures and performant queries using complex SQL.
  • Implement ELT processes and complex data aggregations with BigQuery SQL.
  • Support and optimize Looker dashboards based on BigQuery datasets.
  • Integrate BigQuery and Vertex AI for advanced analytics.
  • Design scalable and reliable data pipelines on GCP.
  • Manage Delta Live Tables for real-time data integration.
  • Design and configure Data Lakes in GCP using Google Cloud Storage.
  • Build infrastructure for ETL of data from various sources using SQL and GCP.
  • Write and maintain efficient Python scripts for data processing.
  • Collaborate with clients to address data security and compliance requirements.
Posted 5 days ago
🔥 Senior Data Engineer
Posted 5 days ago

📍 USA

🧭 Full-Time

💸 190,000 - 220,000 USD per year

🔍 B2B data / Data as a Service (DaaS)

🏢 Company: People Data Labs · 👥 101-250 · 💰 $45,000,000 Series B about 3 years ago · Database · Artificial Intelligence (AI) · Developer APIs · Machine Learning · Analytics · B2B · Software

  • 5-7+ years industry experience with strategic technical problem-solving.
  • Strong software development fundamentals.
  • Experience with Python.
  • Expertise in Apache Spark (Java, Scala, or Python-based).
  • Proficiency in SQL.
  • Experience building scalable data processing systems.
  • Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt).
  • Knowledge of modern data design and storage patterns.
  • Experience working in Databricks.
  • Familiarity with cloud computing services (e.g., AWS, GCP, Azure).
  • Experience in data warehousing technologies.
  • Understanding of modern data storage formats and tools.

  • Build infrastructure for ingestion, transformation, and loading of data using Spark, SQL, AWS, and Databricks.
  • Create an entity resolution framework for merging billions of entities into clean datasets.
  • Develop CI/CD pipelines and anomaly detection systems to enhance data quality.
  • Provide solutions to undefined data engineering problems.
  • Assist Engineering and Product teams with data-related technical issues.
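The "entity resolution framework" responsibility above can be sketched at toy scale: records sharing any identifying key are merged into one cluster via union-find. The field names and exact-key matching are illustrative only; production systems at billions of entities add blocking, fuzzy matching, and survivorship rules:

```python
# Toy entity-resolution sketch: merge records that share an email or
# phone value into clusters of record indices (hypothetical fields).
def resolve(records: list[dict]) -> list[set[int]]:
    parent = list(range(len(records)))

    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a: int, b: int) -> None:
        parent[find(a)] = find(b)

    # Index each key value to the first record that carried it;
    # later records with the same value get unioned in.
    seen: dict[tuple, int] = {}
    for i, rec in enumerate(records):
        for field in ("email", "phone"):
            value = rec.get(field)
            if value is None:
                continue
            key = (field, value)
            if key in seen:
                union(i, seen[key])
            else:
                seen[key] = i

    # Group record indices by their cluster root.
    clusters: dict[int, set[int]] = {}
    for i in range(len(records)):
        clusters.setdefault(find(i), set()).add(i)
    return list(clusters.values())
```

Union-find keeps the merge step near-linear in the number of key matches, which is why variants of it underpin large-scale resolution pipelines.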

AWS · Python · SQL · Kafka · Airflow · Data engineering · Pandas · CI/CD


📍 United Kingdom

🔍 Esports, gaming, tournaments, leagues, events

🏢 Company: ESL FACEIT Group · 👥 501-1000 · 🫂 Last layoff 10 months ago · Video Games · Gaming · Digital Entertainment · eSports

  • Experience shaping architecture for mature data platforms.
  • Hands-on building of resilient data pipelines (Airflow, Kafka, etc.) at scale.
  • CI/CD expertise (GitHub Actions, Jenkins) in data engineering.
  • Infrastructure management using IaC (Terraform).
  • Knowledge of data modeling in cloud warehouses (BigQuery, Snowflake).
  • Familiarity with database design principles.
  • Skills in operational procedures and data observability tools.

  • Serve as a leader in tech and understand customer needs.
  • Partner with stakeholders and promote data platform adoption.
  • Contribute to technical strategy and manage delivery.
  • Set high standards for documentation, testing, and code quality.
  • Drive efficiencies in code, infrastructure and data models.
  • Inspire and guide team members through code reviews and design sessions.

AWS · Leadership · Python · SQL · GCP · Jenkins · Kafka · Snowflake · Strategy · Airflow · Data engineering · Data Structures · Prometheus · CI/CD · DevOps · Terraform · Documentation · Data modeling

Posted 7 days ago

Related Articles

Remote Job Certifications and Courses to Boost Your Career

Posted 4 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

How to Balance Work and Life While Working Remotely

Posted 4 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Weekly Digest: Remote Jobs News and Trends (August 11 - August 18, 2024)

Posted 4 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

How to Onboard Remote Employees Successfully

Posted 4 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Remote Work Statistics and Insights for 2024

Posted 4 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.