
Senior Data Engineer

Posted 3 days ago


πŸ’Ž Seniority level: Senior, 4+ years

πŸ“ Location: Europe, APAC, Americas

πŸ” Industry: Software Development

🏒 Company: Docker πŸ‘₯ 251-500 Β· πŸ’° $105,000,000 Series C almost 3 years ago Β· Developer Tools, Developer Platform, Information Technology, Software

πŸ—£οΈ Languages: English

⏳ Experience: 4+ years

πŸͺ„ Skills: Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Requirements:
  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable pipelines
  • Proficiency with Snowflake or BigQuery
  • Experience with data governance and security controls
  • Experience creating ETL scripts using Python and SQL
  • Familiarity with a cloud ecosystem: AWS/Azure/Google Cloud
  • Experience with Tableau or Looker
Responsibilities:
  • Manage and develop ETL jobs, warehouse, and event collection tools
  • Build and manage the Central Data Model for reporting
  • Integrate emerging methodologies and technologies
  • Build data pipelines for ML and AI projects
  • Contribute to SOC2 compliance across the data platform
  • Document technical architecture

Related Jobs

πŸ”₯ Senior Data Engineer
Posted about 3 hours ago

πŸ“ Germany, Austria, Italy, Spain, Portugal

🧭 Full-Time

πŸ” Digital Solutions for Financial and Real Estate Industries

🏒 Company: PriceHubbleπŸ‘₯ 101-250πŸ’° Non-equity Assistance about 3 years agoArtificial Intelligence (AI)PropTechBig DataMachine LearningAnalyticsReal Estate

  • 3+ years experience building and maintaining production data pipelines
  • Proficient in working with geospatial data (bonus)
  • Work with backend engineers and data scientists to turn raw data into trusted insights
  • Navigate cost-value trade-offs to deliver value to customers
  • Develop solutions that work in over 10 countries
  • Lead a project from concept to launch
  • Drive the team to deliver high-quality products, services, and processes
  • Improve the performance, data quality, and cost-efficiency of data pipelines
  • Maintain and monitor the data systems

PostgreSQL, Python, SQL, Apache Airflow, ETL, Data engineering


πŸ“ United States, Canada

🧭 Full-Time

πŸ” B2B SaaS

🏒 Company: Sanity

  • 4+ years of experience building data pipelines at scale
  • Deep expertise in SQL, Python, and Node.js/TypeScript
  • Production experience with Airflow and RudderStack
  • Track record of building reliable data infrastructure
  • Design, develop, and maintain scalable ETL/ELT pipelines
  • Collaborate to implement and scale product telemetry
  • Establish best practices for data ingestion and transformation
  • Monitor and optimize data pipeline performance

Node.js, Python, SQL, Apache Airflow, ETL, TypeScript

Posted 2 days ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ” E-commerce

  • Bachelor's or Master's degree in Computer Science or related field
  • 5+ years of experience in data engineering
  • Strong proficiency in SQL and database technologies
  • Experience with data pipeline orchestration tools
  • Proficiency in programming languages like Python and Scala
  • Hands-on experience with AWS cloud data services
  • Familiarity with big data frameworks like Apache Spark
  • Knowledge of data modeling and warehousing
  • Experience implementing CI/CD for data pipelines
  • Real-time data processing architectures experience
  • Design, develop, and maintain ETL/ELT pipelines
  • Optimize data architecture and storage solutions
  • Work with AWS for scalable data solutions
  • Ensure data quality, integrity, and security
  • Collaborate with cross-functional teams
  • Monitor and troubleshoot data workflows
  • Create APIs for analytical information

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Kafka, MySQL, Snowflake, CI/CD, Scala

Posted 2 days ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ” Software Development

🏒 Company: BioRenderπŸ‘₯ 101-250πŸ’° $15,319,133 Series A almost 2 years agoLife ScienceGraphic DesignSoftware

  • 7+ years of data engineering experience of relevant industry experience
  • Expertise working with Data Warehousing platforms (AWS RedShift or Snowflake preferred) and data lake / lakehouse architectures
  • Experience with Data Streaming platforms (AWS Kinesis / Firehose preferred)
  • Expertise with SQL and programming languages commonly used in data platforms (Python, Spark, etc)
  • Experience with data pipeline orchestration (e.g., Airflow) and data pipeline integrations (e.g. Airbyte, Stitch)
  • Building and maintaining the right architecture and tooling to support our data science, analytics, product, and machine learning initiatives.
  • Solve complex architectural problems
  • Translate deeply technical designs into business appropriate representations as well as analyze business needs and requirements ensuring implementation of data services directly correlates to the strategy and growth of the business

AWS, Python, SQL, Apache Airflow, Snowflake, Data engineering, Spark, Data modeling

Posted 6 days ago

πŸ“ Germany

🧭 Full-Time

πŸ” Insurtech

🏒 Company: Getsafe

  • 4+ years of experience in creating data pipelines using SQL/Python/Airflow
  • Experience designing Data Mart and Data Warehouse
  • Experience with cloud infrastructure, including Terraform
  • Analyze, design, develop, and deliver Data Warehouse solutions
  • Create ETL/ELT pipelines using Python and Airflow
  • Design, develop, maintain and support Data Warehouse & BI platform

Python, SQL, Apache Airflow, ETL, Terraform

Posted 7 days ago

πŸ“ UK

🧭 Full-Time

πŸ” Technology, Data Engineering

🏒 Company: Aker SystemsπŸ‘₯ 101-250πŸ’° over 4 years agoCloud Data ServicesBusiness IntelligenceAnalyticsSoftware

  • Data pipeline development using processing technologies
  • Experience in Public Cloud services, especially AWS
  • Configuring and tuning Relational and NoSQL databases
  • Programming with Python
  • Code, test, and document data pipelines
  • Conduct database design
  • Expand data platform capabilities
  • Perform data analysis and root cause analysis

AWS, Python, Data modeling

Posted 11 days ago

πŸ“ Poland

🧭 Full-Time

πŸ” Data Engineering

🏒 Company: Softeta

  • Advanced degree in computer science, mathematics, or related field
  • Proven work experience as a data engineer (5+ years)
  • Proficiency with AirFlow, DBT, or similar products
  • Strong knowledge of data structures and modeling
  • CI/CD pipeline and MLOPs experience advantageous
  • Experience with cloud data platforms, particularly GCP/BigQuery
  • Create and maintain pipeline architectures in AirFlow and DBT
  • Assemble large and/or complex datasets
  • Improve processes for scale, delivery, and automation
  • Maintain and improve data warehouse structure
  • Communicate technical details to stakeholders
  • Investigate and resolve anomalies in data

SQL, GCP, Airflow, Data engineering, CI/CD, Data modeling

Posted 11 days ago

πŸ“ LatAm

🧭 Full-Time

πŸ” B2B data and intelligence

🏒 Company: TruelogicπŸ‘₯ 101-250ConsultingWeb DevelopmentWeb DesignSoftware

  • 8+ years of experience as a Data/BI engineer.
  • Experience developing data pipelines with Airflow or equivalent code-based orchestration software.
  • Strong SQL abilities and hands-on experience with SQL and no-SQL DBs, performing analysis and performance optimizations.
  • Hands-on experience in Python or equivalent programming language
  • Experience with data warehouse solutions (like BigQuery/ Redshift/ Snowflake)
  • Experience with data modeling, data catalog concepts, data formats, data pipelines/ETL design, implementation, and maintenance.
  • Experience with AWS/GCP cloud services such as GCS/S3, Lambda/Cloud Function, EMR/Dataproc, Glue/Dataflow, and Athena.
  • Experience in Quality Checks
  • Experience in DBT
  • EFront Knowledge
  • Strong and Clear Communication Skills
  • Building, and continuously improving our data gathering, modeling, reporting capabilities, and self-service data platforms.
  • Working closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs.

AWS, Python, SQL, Cloud Computing, ETL, Snowflake, Airflow, Data engineering, Communication Skills, Data modeling

Posted 13 days ago

πŸ“ US, Canada

🧭 Full-Time

πŸ’Έ 95795.0 - 128800.0 USD per year

πŸ” Internet of Things (IoT)

  • BS degree in Computer Science, Statistics, Engineering, or a related quantitative discipline.
  • 6+ years experience in a data engineering and data science-focused role.
  • Proficiency in data manipulation and processing in SQL and Python.
  • Expertise building data pipelines with new API endpoints from their documentation.
  • Proficiency in building ETL pipelines to handle large volumes of data.
  • Demonstrated experience in designing data models at scale.
  • Build and maintain highly reliable computed tables, incorporating data from various sources, including unstructured and highly sensitive data.
  • Access, manipulate, and integrate external datasets with internal data.
  • Build analytical and statistical models to identify patterns, anomalies, and root causes.
  • Leverage SQL and Python to shape and aggregate data.
  • Incorporate generative AI tools into production data pipelines and automated workflows.
  • Collaborate closely with data scientists, data analysts, and Tableau developers to ship top quality analytic products.
  • Champion, role model, and embed Samsara’s cultural principles.

Python, SQL, ETL, Data engineering, Data science, Data modeling

Posted 20 days ago

πŸ“ Spain

πŸ’Έ 80000.0 - 110000.0 EUR per year

πŸ” Financial services

  • 5+ years of professional experience in Data Engineering or similar roles.
  • Proficiency in SQL and DBT for data transformations.
  • Fluency in Python or other modern programming languages.
  • Experience with infrastructure as code languages like Terraform.
  • Knowledge of data modeling, data warehouse technologies, and cloud infrastructures.
  • Experience with AWS or other cloud platforms like Azure or GCP.
  • Ability to provide constructive code reviews.
  • Strong communication and collaboration skills.
  • Work with engineering managers and tech leads to identify and plan projects based on team goals.
  • Collaborate with teams across engineering, analytics, and product to deliver technology for analytical use cases.
  • Write high-quality, understandable code.
  • Review teammates' work and offer feedback.
  • Serve as a technical mentor for other engineers.
  • Promote a respectful and supportive team environment.
  • Participate in on-call rotation.

AWS, Python, SQL, Terraform, Data modeling

Posted 21 days ago