
Senior Data Engineer, Remote

Posted 3 months ago

💎 Seniority level: Senior, 5+ years

๐Ÿ“ Location: ANY STATE, east coast time zone, NOT STATED

๐Ÿ” Industry: Data and technology

โณ Experience: 5+ years

🪄 Skills: Python, SQL, ETL, GCP, Kubeflow, Machine Learning, Algorithms, Data Engineering, Data Science, Data Structures, TensorFlow, Collaboration, Scala

Requirements:
  • 5+ years of experience making contributions in the form of code.
  • Experience with algorithms and data structures, and knowledge of when to apply them.
  • Experience with machine learning techniques to develop better predictive and clustering models.
  • Experience working with high-scale systems.
  • Experience creating powerful machine learning tools for experimentation and productionization at scale.
  • Experience in data engineering and warehousing: developing ingestion engines and ETL pipelines and organizing data for consumption.
Responsibilities:
  • Be a senior member of the team by contributing to the architecture, design, and implementation of EMS systems.
  • Mentor junior engineers and promote their growth.
  • Lead complex technical projects, managing their planning, execution, and success.
  • Collaborate with other engineering, product, and data science teams to ensure optimal product development.

Related Jobs


๐Ÿ“ United States

🧭 Full-Time

๐Ÿ” Software Development

  • Experience in data engineering and analytics
  • Familiarity with data structures and algorithms
  • Build the graph underlying Sayari's products
  • Collaborate with product and software engineering teams

AWS, GraphQL, PostgreSQL, SQL, ETL, Data Engineering

Posted about 2 months ago

๐Ÿ“ Paris, New York, San Francisco, Sydney, Madrid, London, Berlin

๐Ÿ” Communication technology

  • Passionate about data engineering.
  • Experience in designing and developing data infrastructure.
  • Technical skills to solve complex challenges.
  • Play a crucial role in designing, developing, and maintaining data infrastructure.
  • Collaborate with teams across the company to solve complex challenges.
  • Improve operational efficiency and lead the business towards its strategic goals.
  • Contribute to engineering efforts that enhance the customer journey.

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Data Engineering

Posted 3 months ago

๐Ÿ“ United States

🧭 Full-Time

💸 141,000 - 174,000 USD per year

๐Ÿ” Catering technology

๐Ÿข Company: ezCater, Inc

  • Strong experience with data warehousing, data lakes, and ELT processes across enterprise platforms like Snowflake, Redshift, or BigQuery.
  • Proficient in building performant data pipelines across disparate systems.
  • Familiarity with cloud platforms such as AWS, GCP, or Azure.
  • Expertise in SQL and experience with Python.
  • Ability to work both independently and collaboratively.
  • Willingness to adapt to a large and complex business landscape.
  • Experience with technologies such as Snowflake, dbt, Fivetran, Airflow, AWS, Sagemaker, MLFlow, Kubernetes, Docker, and Python for ETL and data science is advantageous.
  • Write and ship code mainly using dbt (SQL, Jinja).
  • Collaborate closely with analysts and stakeholders to refine requirements and debug data sets.
  • Design and develop high-performance data pipelines while adhering to software development lifecycle best practices.
  • Identify optimization opportunities within the existing data stack.
  • Utilize automation to enhance developer efficiency.
  • Monitor data systems for quality and availability while aiming to reduce costs.
  • Contribute to team processes and community, and mentor other Data Engineers.

AWS, Python, SQL, ETL, Snowflake, CI/CD

Posted 4 months ago