Senior Data Engineer

Posted 2 days ago

💎 Seniority level: Senior, 10+ YOE

🔍 Industry: Healthcare

🏢 Company: Prompt 👥 101-250 💰 $2,238,718 Seed over 4 years ago · Education, iOS, Service Industry, Universities, Software

🗣️ Languages: English

⏳ Experience: 10+ YOE

Requirements:
  • 10+ YOE working with cloud-based databases, data lakes, and data warehouses (S3, RDS, AWS Athena, AWS Redshift, etc.)
  • Proficiency in the data engineering tech stack, for example: Athena, Redshift, Glue, MySQL, Python, Spark, Kafka, SQL, AWS, Airflow, DBT, and containers and orchestration (Docker, Kubernetes)
  • Experience with and understanding of distributed systems, data architecture design, and big data technologies
  • Experience with AWS technologies (e.g., AWS Lambda, Redshift, RDS, S3, DMS, Glue, Kinesis, SNS)
  • Knowledge of data quality management, data governance, and data security best practices
  • Good knowledge of DevOps engineering using Continuous Integration/Delivery tools such as Jenkins, Terraform, and Kubernetes, with attention to automation, alerting, monitoring, security, and other declarative infrastructure
  • Experience managing multiple data pipelines for internal and external data products/pipelines
Responsibilities:
  • Creating robust and scalable architectures to manage the flow and storage of data across the organization.
  • Developing and managing Extract, Transform, Load (ETL) processes to ensure data is accurately integrated from various sources into data warehouses or lakes.
  • Constructing automated pipelines for data processing and transformation, ensuring smooth data flow and timely availability for analysis.
  • Administering databases and ensuring their performance, integrity, and security.
  • Implementing data validation, cleansing, and governance practices to maintain high-quality and reliable data.
  • Working closely with AI engineers, BI engineers, analysts, and business stakeholders to understand data requirements and support their analytical tasks.
  • Participating in code reviews and architecture discussions to exchange actionable feedback with peers.
  • Contributing to engineering best practices and mentoring junior team members.
  • Helping break down complex projects and requirements into sprints.
  • Monitoring and optimizing data systems for improved performance and efficiency.
  • Evaluating and integrating new tools and technologies to enhance data management capabilities.

Related Jobs

πŸ“ United States, Canada

🧭 Full-Time

πŸ” B2B SaaS

🏒 Company: Sanity

  • 4+ years of experience building data pipelines at scale
  • Deep expertise in SQL, Python, and Node.js/TypeScript
  • Production experience with Airflow and RudderStack
  • Track record of building reliable data infrastructure
  • Design, develop, and maintain scalable ETL/ELT pipelines
  • Collaborate to implement and scale product telemetry
  • Establish best practices for data ingestion and transformation
  • Monitor and optimize data pipeline performance

Node.js, Python, SQL, Apache Airflow, ETL, TypeScript

Posted 1 day ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ” E-commerce

  • Bachelor's or Master's degree in Computer Science or related field
  • 5+ years of experience in data engineering
  • Strong proficiency in SQL and database technologies
  • Experience with data pipeline orchestration tools
  • Proficiency in programming languages like Python and Scala
  • Hands-on experience with AWS cloud data services
  • Familiarity with big data frameworks like Apache Spark
  • Knowledge of data modeling and warehousing
  • Experience implementing CI/CD for data pipelines
  • Real-time data processing architectures experience
  • Design, develop, and maintain ETL/ELT pipelines
  • Optimize data architecture and storage solutions
  • Work with AWS for scalable data solutions
  • Ensure data quality, integrity, and security
  • Collaborate with cross-functional teams
  • Monitor and troubleshoot data workflows
  • Create APIs for analytical information

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Kafka, MySQL, Snowflake, CI/CD, Scala

Posted 1 day ago

πŸ“ Europe, APAC, Americas

🧭 Full-Time

πŸ” Software Development

🏒 Company: DockerπŸ‘₯ 251-500πŸ’° $105,000,000 Series C almost 3 years agoDeveloper ToolsDeveloper PlatformInformation TechnologySoftware

  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable pipelines
  • Proficiency with Snowflake or BigQuery
  • Experience with data governance and security controls
  • Experience creating ETL scripts using Python and SQL
  • Familiarity with a cloud ecosystem: AWS/Azure/Google Cloud
  • Experience with Tableau or Looker
  • Manage and develop ETL jobs, warehouse, and event collection tools
  • Build and manage the Central Data Model for reporting
  • Integrate emerging methodologies and technologies
  • Build data pipelines for ML and AI projects
  • Contribute to SOC2 compliance across the data platform
  • Document technical architecture

Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 2 days ago

🏢 Company: ProArch 👥 501-1000 💰 $25,000,000 about 3 years ago · CRM, Information Technology, Software

Posted 4 days ago

πŸ“ United States

🧭 Full-Time

πŸ” Health and Wellness Solutions

🏒 Company: Panasonic Well

  • 5+ years technology industry experience
  • Proficiency in building data pipelines in Python and/or Kotlin
  • Deep understanding of relational and non-relational database solutions
  • Experience with large-scale data pipeline construction
  • Familiarity with PCI, CCPA, GDPR compliance
  • Design, develop, and optimize automated data pipelines
  • Identify improvements for data reliability and quality
  • Own and evolve data architecture with a focus on privacy
  • Drive continuous improvement in data workflows
  • Collaborate with Data Scientists, AI Engineers, and Product Managers

Python, ETL, Kafka, Kotlin, Snowflake, Data engineering, Compliance, Data modeling

Posted 5 days ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ” Software Development

🏒 Company: BioRenderπŸ‘₯ 101-250πŸ’° $15,319,133 Series A almost 2 years agoLife ScienceGraphic DesignSoftware

  • 7+ years of relevant data engineering industry experience
  • Expertise working with data warehousing platforms (AWS Redshift or Snowflake preferred) and data lake/lakehouse architectures
  • Experience with data streaming platforms (AWS Kinesis/Firehose preferred)
  • Expertise with SQL and programming languages commonly used in data platforms (Python, Spark, etc.)
Responsibilities: not stated

AWS, Python, SQL, Apache Airflow, Snowflake, Data engineering, Spark, Data modeling

Posted 6 days ago

💸 $110,000 - $125,000 USD per year

🏢 Company: Enformion 👥 101-250 · Analytics, Information Technology

  • 5+ years of experience in languages such as Java, Scala, PySpark, Perl, shell scripting, and Python
  • Working knowledge of Hadoop ecosystem applications (MapReduce, YARN, Pig, HBase, Hive, Spark, and more)
  • Strong experience working with data pipelines in multi-terabyte data warehouses; experience dealing with performance and scalability issues
  • Strong SQL (MySQL, Hive, etc.) and NoSQL (MongoDB, HBase, etc.) skills, including writing complex queries and performance tuning
  • Knowledge of data modeling, partitioning, indexing, and architectural database design
  • Experience using source code and version control systems such as Git
  • Experience with continuous build and test processes using tools such as GitLab, SBT, and Postman
  • Implement and maintain the big data platform and infrastructure
  • Develop, optimize, and tune MySQL stored procedures, scripts, and indexes
  • Develop Hive schemas and scripts, Spark jobs using PySpark and Scala, and UDFs in Java
  • Design, develop, and maintain automated, complex, and efficient ETL processes to batch-match records across multiple large-scale datasets, including supporting documentation
  • Develop and maintain pipelines using Airflow or other tools to monitor, debug, and analyze data pipelines
  • Troubleshoot Hadoop cluster and query issues, evaluate query plans, and optimize schemas and queries
  • Strong interpersonal skills to resolve problems professionally, lead working groups, and negotiate consensus
Posted 6 days ago

πŸ“ Germany

🧭 Full-Time

πŸ” Insurtech

🏒 Company: Getsafe

  • 4+ years of experience in creating data pipelines using SQL/Python/Airflow
  • Experience designing Data Mart and Data Warehouse
  • Experience with cloud infrastructure, including Terraform
  • Analyze, design, develop, and deliver Data Warehouse solutions
  • Create ETL/ELT pipelines using Python and Airflow
  • Design, develop, maintain and support Data Warehouse & BI platform

Python, SQL, Apache Airflow, ETL, Terraform

Posted 6 days ago

🧭 Full-Time

πŸ” Fintech

  • 7+ years of experience with ETL, SQL, PowerBI, Tableau, or similar technologies
  • Strong understanding of data modeling, database design, and SQL
  • Experience working with Apache Kafka or MSK solution
  • Extensive experience delivering solutions on Snowflake or other cloud-based data warehouses, and a general understanding of data warehousing technologies and event-driven architectures
  • Proficiency in Python/R and familiarity with modern data engineering practices
  • Strong analytical and problem-solving skills with a focus on delivering high-quality solutions
  • Experience with machine learning (ML) and building natural language interfaces for business data
  • Proven track record in a fast-paced Agile development environment
  • Ability to work autonomously while effectively engaging with multiple business teams and stakeholders
  • Design, develop, and maintain scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources
  • Implement data ingestion frameworks to efficiently collect data from internal and external sources
  • Optimize data pipelines for performance, reliability, and scalability
  • Develop and deliver scalable, unit-tested data assets and products that empower analysts and drive business workflows
  • Evaluate and continuously improve existing data products and solutions for performance, scalability and security
  • Manage data quality, including software implementation for data correction, reconciliation, and validation of data workflows to ensure accuracy and integrity in the data warehouse
  • Collaborate with engineers, data scientists, and product managers to analyze edge cases and plan for architectural scalability
  • Lead the deployment and maintenance of multiple data solutions such as business dashboards and machine learning models
  • Champion best practices in data development, design, and architecture
  • Conduct comprehensive code reviews, providing mentorship and meaningful feedback to junior team members
  • Collaborate with other team members to create and maintain process documentation, data flows, and ETL diagrams for both new and existing data pipelines and processes
  • Monitor data pipelines for performance, reliability, and security issues
  • Implement logging, monitoring, and alerting systems to detect and respond to data-related issues proactively
  • Drive the team's Agile process, ensuring high standards of productivity and collaboration
Posted 8 days ago

πŸ“ UK

🧭 Full-Time

πŸ” Technology, Data Engineering

🏒 Company: Aker SystemsπŸ‘₯ 101-250πŸ’° over 4 years agoCloud Data ServicesBusiness IntelligenceAnalyticsSoftware

  • Data pipeline development using processing technologies
  • Experience in Public Cloud services, especially AWS
  • Configuring and tuning Relational and NoSQL databases
  • Programming with Python
  • Code, test, and document data pipelines
  • Conduct database design
  • Expand data platform capabilities
  • Perform data analysis and root cause analysis

AWS, Python, Data modeling

Posted 10 days ago

Related Articles

Posted 6 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 6 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 6 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 6 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Posted 6 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.