Apply

Senior Data Engineer

Posted 2024-11-15


💎 Seniority level: Senior, 7+ years

📍 Location: United States

🔍 Industry: Innovation data

🏢 Company: Cypris

⏳ Experience: 7+ years

Requirements:
  • 7+ years of proven experience as a Data Engineer or in a similar role.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Proficiency in programming languages like Python, Java, or Scala.
  • Experience with cloud platforms such as GCP (preferred), AWS, or Azure.
  • Hands-on experience with big data technologies such as Hadoop, Spark, or similar frameworks.
  • Knowledge of data warehousing concepts and experience with tools like Redshift, BigQuery, or Snowflake.
  • Familiarity with ETL tools and processes.
  • Strong problem-solving skills and attention to detail.
  • The desire to contribute and grow at an early-stage startup.
Responsibilities:
  • Design, develop, and optimize robust data pipelines to process and transform large datasets from various sources.
  • Optimize data store performance, including index and query response times.
  • Implement and maintain ETL processes to ensure data accuracy and integrity.
  • Collaborate with cross-functional teams to understand data requirements and deliver effective data solutions.
  • Develop and maintain data warehouses and data lakes to support business intelligence and analytics.
  • Monitor and troubleshoot data pipeline performance and reliability, implementing improvements as needed.
  • Ensure data security and compliance with relevant regulations and standards.
  • Stay up-to-date with the latest technologies and best practices in data engineering and incorporate them into our processes.
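
The pipeline and ETL responsibilities above are described in tool-agnostic terms. Purely as a rough illustration of the kind of work meant, a minimal batch ETL job in plain Python could look like the sketch below; the file paths, column names, and validation rule are invented for the example and are not taken from the posting.

```python
import csv
import json
from pathlib import Path

# Hypothetical file locations -- placeholders, not from the posting.
SOURCE = Path("raw/events.csv")
TARGET = Path("curated/events.json")

def extract(path: Path) -> list[dict]:
    """Read raw rows from a CSV source."""
    with path.open(newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[dict]:
    """Keep well-formed rows and normalize types (example rule only)."""
    clean = []
    for row in rows:
        if not row.get("user_id"):          # drop rows missing a key field
            continue
        row["amount"] = float(row.get("amount", 0) or 0)
        clean.append(row)
    return clean

def load(rows: list[dict], path: Path) -> None:
    """Write curated rows as JSON for a downstream warehouse load."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(rows, indent=2))

if __name__ == "__main__":
    load(transform(extract(SOURCE)), TARGET)
```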
Apply

Related Jobs

Apply

🔍 GameTech

🏢 Company: Kaizen Gaming

  • Significant experience in software development using Python in a large-scale, high-performance production environment.
  • Significant hands-on experience with Spark and PySpark.
  • Significant experience with data modeling and handling structured and unstructured data.
  • Significant experience with Apache Kafka.
  • Experience with cloud environments.
  • Expert-level skills in distributed software system design.
  • Excellent problem solving and communication skills.

  • Design, develop and maintain scalable data processing pipelines in a distributed, data-intensive production environment.
  • Design, develop and maintain microservices in Python that serve these features on production.
  • Design, develop and maintain internal tools that enable CI/CD pipelines, experiment tracking and data versioning.
  • Guarantee data integrity by employing data quality processes.
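
Given the Spark/PySpark and data-quality requirements in this listing, the responsibilities above might translate into something like the following minimal sketch. The paths, column names, and filter rules are hypothetical, and a local PySpark installation is assumed.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch only -- input/output paths and column names are hypothetical.
spark = SparkSession.builder.appName("events-cleaning").getOrCreate()

raw = spark.read.json("s3a://example-bucket/raw/events/")   # placeholder path

# Basic data-quality filter: drop records missing an id or with negative amounts.
clean = (
    raw.filter(F.col("event_id").isNotNull())
       .filter(F.col("amount") >= 0)
       .dropDuplicates(["event_id"])
)

clean.write.mode("overwrite").parquet("s3a://example-bucket/curated/events/")
spark.stop()
```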
Posted 2024-11-21
Apply
🔥 Senior Data Engineer
Posted 2024-11-21

📍 Poland

🧭 Full-Time

🔍 Software development

🏢 Company: Sunscrapers sp. z o.o.

  • At least 5 years of professional experience as a data engineer.
  • Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar.
  • Excellent command of spoken and written English (at least C1 level).
  • Strong professional experience with Python and SQL.
  • Hands-on experience with DBT and Snowflake.
  • Experience in building data pipelines with Airflow or alternative solutions.
  • Strong understanding of data modeling techniques such as the Kimball star schema.
  • Great analytical skills and attention to detail.
  • Creative problem-solving skills.
  • Great customer service and troubleshooting skills.

  • Modeling datasets and schemas for consistency and easy access.
  • Designing and implementing data transformations and data marts.
  • Integrating third-party systems and external data sources into the data warehouse.
  • Building data flows for fetching, aggregating, and modeling data using batch pipelines.
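
Since the listing names Airflow alongside dbt and Snowflake, a batch pipeline of the shape described above could be sketched as a minimal Airflow DAG like the one below. The DAG id, task bodies, and daily schedule are assumptions made for the example, and the code is written against Airflow 2.x.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task bodies -- real tasks would call dbt, Snowflake, etc.
def fetch_source_data():
    print("fetching third-party data")

def build_data_marts():
    print("running transformations and building data marts")

with DAG(
    dag_id="example_batch_pipeline",      # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                    # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    fetch = PythonOperator(task_id="fetch_source_data", python_callable=fetch_source_data)
    marts = PythonOperator(task_id="build_data_marts", python_callable=build_data_marts)

    fetch >> marts
```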

Python, SQL, Snowflake, Airflow, Analytical Skills, Customer service, DevOps, Attention to detail

Apply
🔥 Senior Data Engineer
Posted 2024-11-21

📍 Belgium, Spain

🔍 Hospitality industry

🏢 Company: Lighthouse

  • 5+ years of professional experience using Python, Java, or Scala for data processing (Python preferred)
  • Experience with writing data processing pipelines and with cloud platforms like AWS, GCP, or Azure
  • Experience with data pipeline orchestration tools like Apache Airflow (preferred), Dagster or Prefect
  • Deep understanding of data warehousing strategies
  • Experience with transformation tools like dbt to manage data transformation in your data pipelines
  • Some experience in managing infrastructure with IaC tools like Terraform
  • Stay updated with industry trends, emerging technologies, and best practices in data engineering
  • Improve, manage, and teach standards for code maintainability and performance in code submitted and reviewed
  • Ship large features independently, generate architecture recommendations with the ability to implement them
  • Strong communicator who can describe complex topics simply to a variety of technical and non-technical stakeholders.

  • Design and develop scalable, reliable data pipelines using the Google Cloud stack.
  • Ingest, process, and store structured and unstructured data from various sources into our data-lakes and data warehouses.
  • Optimise data pipelines for cost, performance and scalability.
  • Implement and maintain data governance frameworks, ensuring data accuracy, consistency, and compliance.
  • Monitor and troubleshoot data pipeline issues, implementing proactive measures for reliability and performance.
  • Mentor and provide technical guidance to other engineers working with data.
  • Partner with Product, Engineering & Data Science teams to operationalise new solutions.
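
This listing is built around the Google Cloud stack. As one hedged example of "ingest, process, and store structured data", a batch load into BigQuery might look roughly like the sketch below; the project, dataset, table, and bucket names are placeholders, and the google-cloud-bigquery client library is assumed.

```python
from google.cloud import bigquery

# All identifiers below are placeholders, not taken from the posting.
client = bigquery.Client(project="example-project")
table_id = "example-project.analytics.raw_events"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,                     # schema inference, for the sketch only
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/events/*.json",
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for the load to finish

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```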

Python, Apache Airflow, GCP, Java, Kafka, Kubernetes, Airflow, Data engineering, Grafana, Prometheus, Spark, CI/CD, Terraform, Documentation, Compliance

Apply

🏢 Company: Jobgether

Posted 2024-11-20
Apply
🔥 Senior Data Engineer
Posted 2024-11-15

🧭 Full-Time

💸 95,000 - 230,000 USD per year

🔍 Blockchain and cryptocurrency

🏢 Company: 0x

  • Passion for the benefits of decentralization and the 0x mission.
  • 5+ years of experience as a Data Engineer.
  • 1+ years of experience with Ethereum.
  • Experience building and operating highly available data pipelines.
  • Experience with Apache Kafka or similar pub/sub systems.
  • Familiarity with programming (ideally in Python and/or TypeScript/Node/Go).

  • Collaborate with the Data Team, Product Managers, and Engineers to enhance data accessibility and usability for decision-making.
  • Develop and maintain efficient, scalable ETL pipelines for real-time and batch data processing.
  • Contribute to the development of 0x Data APIs for internal and external applications.
  • Maintain data observability processes and standards.
  • Identify and implement process improvements to enhance data management and analysis efficiency.
  • Mentor team members and foster a collaborative environment.
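
The role calls for Kafka (or similar pub/sub) experience and real-time pipelines; a minimal consumer loop for such a pipeline could look like the sketch below. The topic name, broker address, consumer group, and handler are invented for illustration, and the kafka-python client is assumed (one of several available clients).

```python
import json

from kafka import KafkaConsumer  # kafka-python client; an assumption, not named in the posting

# Topic and broker addresses are placeholders for illustration.
consumer = KafkaConsumer(
    "onchain-events",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="example-etl",
)

for message in consumer:
    event = message.value
    # A real pipeline would validate, enrich, and write to a warehouse here.
    print(f"partition={message.partition} offset={message.offset} event={event}")
```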
Apply

📍 Latin America and other parts of the world

🔍 Insurance

  • Experience in Data Engineering.
  • Proficient in Python or Scala.
  • Excellent communication skills.
  • Attention to detail and strong problem-solving abilities.
  • Ability to work in an Agile environment.

  • Responsible for development and maintenance of systems in enterprise data and analytics environments.
  • Optimize performance by building and supporting decision-making tools.
  • Collaborate closely with software engineering, AI/ML, Cybersecurity, and DevOps/SysOps teams.
  • Design data pipelines and databases, build infrastructure and alerting frameworks.
  • Process data from a variety of sources, ranging from SFTP transfers to APIs.
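
As a small, hedged illustration of the API-ingestion side of that last responsibility, a Python job that pulls records from a source API and lands them as a flat file might look like this; the endpoint, field names, and output path are invented for the example.

```python
import csv

import requests

# Endpoint and field names are hypothetical, for illustration only.
API_URL = "https://api.example.com/v1/claims"

def fetch_claims() -> list[dict]:
    """Pull a page of records from a source API."""
    resp = requests.get(API_URL, params={"page_size": 100}, timeout=30)
    resp.raise_for_status()
    return resp.json()["results"]

def write_csv(rows: list[dict], path: str = "claims.csv") -> None:
    """Land the records as a flat file for a downstream warehouse load."""
    if not rows:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=sorted(rows[0]))
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    write_csv(fetch_claims())
```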

Python, Software Development, Agile, Data engineering, DevOps, Attention to detail

Posted 2024-11-15
Apply
🔥 Senior Data Engineer
Posted 2024-11-15

📍 United States, Canada

🔍 Advanced analytics consulting

🏢 Company: Tiger Analytics

  • Bachelor’s degree in Computer Science or similar field.
  • 8+ years of experience in a Data Engineer role.
  • Experience with relational SQL databases such as MySQL and Postgres, as well as NoSQL databases.
  • Strong analytical skills and advanced SQL knowledge.
  • Development of ETL pipelines using Python & SQL.
  • Good experience with Customer Data Platforms (CDP).
  • Experience in SQL optimization and performance tuning.
  • Data modeling and building high-volume ETL pipelines.
  • Working experience with any cloud platform.
  • Experience with Google Tag Manager and Power BI is a plus.
  • Experience with object-oriented programming languages such as Python, Java, or Scala.
  • Experience extracting/querying/joining large data sets at scale.
  • Strong communication and organizational skills.

  • Designing, building, and maintaining scalable data pipelines on cloud infrastructure.
  • Working closely with cross-functional teams.
  • Supporting data analytics, machine learning, and business intelligence initiatives.
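
This posting emphasizes SQL optimization and performance tuning. The sketch below uses SQLite only so the example stays self-contained; it shows the basic idea of checking a query plan before and after adding an index. The table, columns, and data are toy values, not from the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Toy table standing in for a high-volume fact table.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index, the plan is a full table scan.
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# A covering index lets the same query use an index search instead.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, amount)")
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

conn.close()
```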
Apply

🔍 Healthcare, AI

NOT STATED

  • As a Senior Data Engineer, you'll play a pivotal role in building data infrastructure for AI and machine learning projects.
  • You will design, develop, and scale systems that analyze complex health data.
  • Your work will empower AI to understand member behavior and improve the care experience through data-driven insights.
Posted 2024-11-15
Apply

📍 Arizona, California, Connecticut, Colorado, Florida, Georgia, Hawaii, Illinois, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Hampshire, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, Texas, Utah, Vermont, Virginia, Washington, Washington D.C. and Wisconsin

🧭 Full-Time

💸 157,791 - 183,207 USD per year

🔍 Nonprofit, technology for political campaigns

🏢 Company: ActBlue

  • 3-5 years of experience in data engineering or related roles.
  • Experience building, deploying, and running Machine Learning models in a production environment.
  • Experience maintaining and deploying server-side web applications.
  • Good collaboration skills with remote teams and a team player mentality.
  • Eagerness to learn and to support teammates’ growth, along with an understanding of performance, scalability, and security.

  • Implement and deliver complex, high-impact data platform projects, managing them through their full lifecycle with minimal guidance.
  • Work closely with application developers, database administrators, and data scientists to create robust infrastructure for data-driven insights.
  • Identify and understand end-user data needs, design solutions, and build scalable data pipelines.
  • Create data frameworks and services for engineers and data scientists to ensure scalability and consistency.
  • Collaborate with data scientists to advance the production-level Machine Learning platform.
  • Cultivate strong relationships with stakeholders and engineering teams to inform technical decisions.
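
The listing pairs data pipelines with running Machine Learning models in production behind server-side web applications. As a hedged sketch of that last idea only, a tiny prediction endpoint might look like the example below; Flask is an assumption (the posting does not name a framework), and the scoring function is a stand-in for a real trained model.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a real trained model loaded from a registry or artifact store.
def predict_score(features: dict) -> float:
    """Toy scoring function used in place of an actual ML model."""
    return 0.5 + 0.1 * float(features.get("signal", 0))

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True) or {}
    return jsonify({"score": predict_score(payload)})

if __name__ == "__main__":
    app.run(port=8080)
```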

AWS, Python, Machine Learning, Data engineering, Terraform

Posted 2024-11-14
Apply

📍 US

🧭 Full-Time

🔍 Cloud integration technology

🏢 Company: Cleo (US)

  • 5-7+ years of experience in data engineering focusing on AI/ML models.
  • Hands-on expertise in data transformation and building data pipelines.
  • Leadership experience in mentoring data engineering teams.
  • Strong experience with cloud platforms and big data technologies.

  • Lead the design and build of scalable, reliable, and efficient data pipelines.
  • Set data infrastructure strategy for data warehouses and lakes.
  • Hands-on data transformation for AI/ML models.
  • Build data structures and manage metadata.
  • Implement data quality controls.
  • Collaborate with cross-functional teams to meet data requirements.
  • Optimize ETL processes for AI/ML.
  • Ensure data pipelines support model training needs.
  • Define data governance practices.
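
"Implement data quality controls" is left open-ended in the listing. A minimal, framework-free sketch of the idea is shown below; the thresholds and column names are invented, and in practice teams often reach for tooling such as dbt tests or Great Expectations instead.

```python
import pandas as pd

# Hypothetical batch of records; in practice this would come from a pipeline stage.
df = pd.DataFrame(
    {
        "record_id": [1, 2, 2, 4],
        "amount": [10.0, None, 5.0, -3.0],
    }
)

def run_quality_checks(frame: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data quality violations."""
    problems = []
    if frame["record_id"].duplicated().any():
        problems.append("duplicate record_id values found")
    null_rate = frame["amount"].isna().mean()
    if null_rate > 0.05:                      # example threshold only
        problems.append(f"amount null rate too high: {null_rate:.0%}")
    if (frame["amount"].dropna() < 0).any():
        problems.append("negative amounts present")
    return problems

issues = run_quality_checks(df)
print(issues or "all checks passed")
```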

Leadership, Artificial Intelligence, ETL, Machine Learning, Strategy, Data engineering, Data Structures, Mentoring

Posted 2024-11-14
Apply

Related Articles

Remote Job Certifications and Courses to Boost Your Career

August 22, 2024

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

How to Balance Work and Life While Working Remotely

August 19, 2024

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Weekly Digest: Remote Jobs News and Trends (August 11 - August 18, 2024)

August 18, 2024

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

How to Onboard Remote Employees Successfully

August 16, 2024

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Remote Work Statistics and Insights for 2024

August 13, 2024

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.