Apply

Senior Data Engineer

Posted 2024-11-24


πŸ’Ž Seniority level: Senior, at least 4 years

πŸ“ Location: United States, Latin America, India

πŸ” Industry: Modern data stack and cloud data services

πŸ—£οΈ Languages: English

⏳ Experience: At least 4 years

πŸͺ„ Skills: AWS, Python, Software Development, SQL, Elasticsearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, NoSQL, Spark, Communication Skills, Documentation

Requirements:
  • At least 4 years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Ability to develop end-to-end technical solutions in production environments.
  • Proficient in Java, Python, or Scala.
  • Experience with core cloud data platforms like Snowflake, AWS, Azure, Databricks, and GCP.
  • Strong SQL skills, capable of writing, debugging, and optimizing queries.
  • Client-facing communication skills for presentations and documentation.
  • Bachelor's degree in Computer Science or a related field.
Responsibilities:
  • Develop end-to-end technical solutions and ensure their performance, security, and scalability.
  • Support robust data integration and contribute to the production deployment of solutions.
  • Create and deliver detailed presentations to clients.
  • Document solutions including POCs, roadmaps, diagrams, and logical system views.
Apply

Related Jobs

Apply

πŸ“ Ontario

πŸ” Customer engagement platform

🏒 Company: Braze

  • 5+ years of hands-on experience in data engineering, cloud data warehouses, and ETL development.
  • Proven expertise in designing and optimizing data pipelines and architectures.
  • Strong proficiency in advanced SQL and data modeling techniques.
  • A track record of leading impactful data projects from conception to deployment.
  • Effective collaboration skills with cross-functional teams and stakeholders.
  • In-depth understanding of technical architecture and data flow in a cloud-based environment.
  • Ability to mentor and guide junior team members on best practices for data engineering and development.
  • Passion for building scalable data solutions that enhance customer experiences and drive business growth.
  • Strong analytical and problem-solving skills, with a keen eye for detail and accuracy.
  • Extensive experience working with and aggregating large event-level data.
  • Familiarity with data governance principles and ensuring compliance with industry regulations.
  • Experience with Kubernetes for container orchestration and Airflow for workflow management is preferred.

  • Lead the design, implementation, and monitoring of scalable data pipelines and architectures using tools like Snowflake and dbt.
  • Develop and maintain robust ETL processes to ensure high-quality data ingestion, transformation, and storage.
  • Collaborate closely with data scientists, analysts, and other engineers to design and implement data solutions that drive customer engagement and retention.
  • Optimize and manage data flows and integrations across various platforms and applications.
  • Ensure data quality, consistency, and governance by implementing best practices and monitoring systems.
  • Work extensively with large-scale event-level data, aggregating and processing it to support business intelligence and analytics.
  • Implement and maintain data products using advanced techniques and tools.
  • Collaborate with cross-functional teams including engineering, product management, sales, marketing, and customer success to deliver valuable data solutions.
  • Continuously evaluate and integrate new data technologies and tools to enhance our data infrastructure and capabilities.

SQL, Business Intelligence, ETL, Snowflake, Data Engineering, Collaboration, Compliance

Posted 2024-11-22
Apply

πŸ“ United States of America

πŸ’Έ 90000 - 154000 USD per year

🏒 Company: VSPVisionCareers

  • Bachelor’s degree in computer science, data science, statistics, economics, or a related area.
  • Excellent written and verbal communication skills.
  • 6+ years of experience in development teams focusing on analytics.
  • 6+ years of hands-on experience in data preparation and SQL.
  • Knowledge of data architectures like event-driven architecture and real-time data.
  • Familiarity with DataOps practices and multiple data integration platforms.

  • Design, build, and optimize data pipelines for analytics.
  • Collaborate with multi-disciplinary teams for data integration.
  • Analyze data requirements to develop scalable pipeline solutions.
  • Profile data for accuracy and completeness in data gathering.
  • Drive automation of data tasks to enhance productivity.
  • Participate in architecture and design reviews.

AWS, SQL, Agile, ETL, Oracle, SCRUM, Snowflake, Apache Kafka, Data Science, Data Structures, Communication Skills, Collaboration

Posted 2024-11-22
Apply

πŸ“ Latin America and other parts of the world

πŸ” Insurance

  • Experience in Data Engineering.
  • Proficient in Python or Scala.
  • Excellent communication skills.
  • Attention to detail and strong problem-solving abilities.
  • Ability to work in an Agile environment.

  • Responsible for development and maintenance of systems in enterprise data and analytics environments.
  • Optimize performance by building and supporting decision-making tools.
  • Collaborate closely with software engineering, AI/ML, Cybersecurity, and DevOps/SysOps teams.
  • Design data pipelines and databases, build infrastructure and alerting frameworks.
  • Process data from a range of sources, from SFTP transfers to APIs.

Python, Software Development, Agile, Data Engineering, DevOps, Attention to Detail

Posted 2024-11-15
Apply

πŸ“ United States, Canada

πŸ” Advanced analytics consulting

🏒 Company: Tiger Analytics

  • Bachelor’s degree in Computer Science or similar field.
  • 8+ years of experience in a Data Engineer role.
  • Experience with relational SQL and NoSQL databases like MySQL, Postgres.
  • Strong analytical skills and advanced SQL knowledge.
  • Experience developing ETL pipelines using Python and SQL.
  • Good experience with Customer Data Platforms (CDP).
  • Experience in SQL optimization and performance tuning.
  • Data modeling and building high-volume ETL pipelines.
  • Working experience with any cloud platform.
  • Experience with Google Tag Manager and Power BI is a plus.
  • Experience with object-oriented scripting languages: Python, Java, Scala, etc.
  • Experience extracting/querying/joining large data sets at scale.
  • Strong communication and organizational skills.

  • Designing, building, and maintaining scalable data pipelines on cloud infrastructure.
  • Working closely with cross-functional teams.
  • Supporting data analytics, machine learning, and business intelligence initiatives.

Python, SQL, Business Intelligence, ETL, Java, MySQL, Postgres, NoSQL, Analytical Skills, Organizational Skills

Posted 2024-11-15
Apply

πŸ“ Arizona, California, Connecticut, Colorado, Florida, Georgia, Hawaii, Illinois, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Hampshire, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, Texas, Utah, Vermont, Virginia, Washington, Washington D.C. and Wisconsin

🧭 Full-Time

πŸ’Έ 157791 - 183207 USD per year

πŸ” Nonprofit, technology for political campaigns

🏒 Company: ActBlue

  • 3-5 years of experience in data engineering or related roles.
  • Experience building, deploying, and running Machine Learning models in a production environment.
  • Experience maintaining and deploying server-side web applications.
  • Good collaboration skills with remote teams and a team player mentality.
  • Eagerness to learn, support teammates’ growth, and an understanding of performance, scalability, and security.

  • Implement and deliver complex, high-impact data platform projects, managing them through their full lifecycle with minimal guidance.
  • Work closely with application developers, database administrators, and data scientists to create robust infrastructure for data-driven insights.
  • Identify and understand end-user data needs, design solutions, and build scalable data pipelines.
  • Create data frameworks and services for engineers and data scientists to ensure scalability and consistency.
  • Collaborate with data scientists to advance the production-level Machine Learning platform.
  • Cultivate strong relationships with stakeholders and engineering teams to inform technical decisions.

AWS, Python, Machine Learning, Data Engineering, Terraform

Posted 2024-11-14
Apply

πŸ“ US

🧭 Full-Time

πŸ” Cloud integration technology

🏒 Company: Cleo (US)

  • 5-7+ years of experience in data engineering focusing on AI/ML models.
  • Hands-on expertise in data transformation and building data pipelines.
  • Leadership experience in mentoring data engineering teams.
  • Strong experience with cloud platforms and big data technologies.

  • Lead the design and build of scalable, reliable, and efficient data pipelines.
  • Set data infrastructure strategy for data warehouses and lakes.
  • Hands-on data transformation for AI/ML models.
  • Build data structures and manage metadata.
  • Implement data quality controls.
  • Collaborate with cross-functional teams to meet data requirements.
  • Optimize ETL processes for AI/ML.
  • Ensure data pipelines support model training needs.
  • Define data governance practices.

Leadership, Artificial Intelligence, ETL, Machine Learning, Strategy, Data Engineering, Data Structures, Mentoring

Posted 2024-11-14
Apply

πŸ“ USA

🧭 Full-Time

πŸ” Energy analytics and forecasting

  • Senior-level experience in data engineering, with a primary focus on Python.
  • Experience with cloud-based infrastructure (Kubernetes/Docker) and data services (GCP, AWS, Azure, etc.).
  • Experience building data pipelines with a proven track record of delivering results that impact the business.
  • Experience working on complex, large codebases with a focus on refactoring and enhancements.
  • Experience building data monitoring pipelines with a focus on scalability.

  • Rebuild systems to identify more efficient ways to process data.
  • Automate the entire forecasting pipeline, including data collection, preprocessing, model training, and deployment.
  • Continuously monitor system performance and optimize data processing workflows to reduce latency and improve efficiency.
  • Set up real-time monitoring for data feeds to detect anomalies or issues promptly.
  • Utilize distributed computing and parallel processing to handle large-scale data.
  • Design data infrastructure that scales to accommodate future growth in data volume and sources.

AWS, Docker, Python, GCP, Kubernetes, Azure, Data Engineering

Posted 2024-11-11
Apply

πŸ“ Argentina, Colombia, Costa Rica, Mexico

πŸ” Data Analytics

  • Proficient with SQL and data visualization tools (e.g., Tableau, Power BI, Google Data Studio).
  • Programming skills, primarily in SQL.
  • Knowledge and experience with Python and/or R a plus.
  • Experience with tools like Alteryx a plus.
  • Experience working with Google Cloud and AWS a plus.
  • Familiarity with Gitlab.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in data wrangling tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems, ensuring a full understanding of client deliverables.
  • Design and execute data quality checks.
  • Keep up to date on digital media operations, such as new partners and buy changes.
  • Maintain ongoing management and stewardship of data governance, processing, and reporting.
  • Govern taxonomy additions, application, and use.
  • Serve as a knowledge expert for operational processes and identify areas of improvement.
  • Evaluate opportunities for simplification and/or automation for reporting and various processes.

Python, SQL, GCP, Microsoft Power BI, Tableau

Posted 2024-11-09
Apply

πŸ“ Argentina, Colombia, Costa Rica, Mexico

πŸ” Data Analytics

  • Proficient with SQL and data visualization tools (e.g., Tableau, Power BI, Google Data Studio).
  • Programming skills, primarily in SQL.
  • Knowledge and experience with Python and/or R a plus.
  • Experience with tools like Alteryx a plus.
  • Experience working with Google Cloud and AWS a plus.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in data wrangling tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems, ensuring a full understanding of client deliverables.
  • Design and execute data quality checks.
  • Maintain ongoing management and stewardship of data governance, processing, and reporting.
  • Govern taxonomy additions, application, and use.
  • Serve as a knowledge expert for operational processes and identify areas of improvement to ensure appropriate turnaround times and data quality standards are being met.
  • Evaluate opportunities for simplification and/or automation for reporting and various processes.

AWS, Python, SQL, GCP, Microsoft Power BI, Tableau, Data Engineering

Posted 2024-11-09
Apply

πŸ“ Argentina, Colombia, Costa Rica, Mexico

πŸ” Data Analytics

  • Proficient with SQL and data visualization tools (e.g., Tableau, Power BI, Google Data Studio).
  • Programming skills, primarily in SQL.
  • Knowledge and experience with Python and/or R a plus.
  • Experience with Alteryx a plus.
  • Experience working with Google Cloud and AWS a plus.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in data wrangling tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems.
  • Design and execute data quality checks.
  • Maintain ongoing management and stewardship of data governance, processing, and reporting.
  • Govern taxonomy additions, application, and use.
  • Serve as a knowledge expert for operational processes and identify improvement areas.
  • Evaluate opportunities for simplification and/or automation for reporting and processes.

AWS, Python, SQL, Data Analysis, GCP, Microsoft Power BI, Tableau, Data Engineering

Posted 2024-11-09
Apply