
Senior Data Engineer

Posted 2024-12-03


💎 Seniority level: Senior, at least 4 years

📍 Location: United States, Latin America, India

🔍 Industry: Cloud Data Technologies

🏢 Company: phData

🗣️ Languages: English

⏳ Experience: At least 4 years

🪄 Skills: AWS, Python, Software Development, SQL, Elasticsearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, NoSQL, Spark, Communication Skills, Documentation

Requirements:
  • At least 4 years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Programming expertise in Java, Python, and/or Scala.
  • Experience with core cloud data platforms: Snowflake, AWS, Azure, Databricks, and GCP.
  • SQL proficiency and the ability to write, debug, and optimize SQL queries.
  • Experience creating and delivering detailed presentations.
Responsibilities:
  • Develop end-to-end technical solutions and deploy them into production.
  • Ensure performance, security, scalability, and robust data integration.
  • Client-facing communication and presentation delivery.
  • Create detailed solution documentation.

Related Jobs

🔥 Senior Data Engineer
Posted 2024-12-03

📍 Argentina

🧭 Full-Time

🔍 Nonprofit fundraising technology

🏢 Company: GoFundMe | 👥 251-500 | 💰 Series A on 2015-07-02 | 🫂 on 2022-10-26 | Internet, Crowdfunding, Peer to Peer

  • 5+ years as a data engineer designing, developing, and maintaining business data warehouses consisting of structured and unstructured data.
  • Proficiency with building and orchestrating data pipelines using ETL/data preparation tools.
  • Expertise in orchestration tools like Airflow or Prefect.
  • Proficiency in connecting data through web APIs.
  • Proficiency in writing and optimizing SQL queries.
  • Solid knowledge of Python and other programming languages.
  • Experience with Snowflake is required.
  • Good understanding of database architecture and best practices.

  • Develop and maintain enterprise data warehouse (Snowflake).
  • Develop and orchestrate ELT data pipelines, sourcing data from databases and web APIs.
  • Integrate data from warehouse into third-party tools for actionable insights.
  • Develop and sustain REST API endpoints for data science products.
  • Provide ongoing maintenance and improvements to existing data solutions.
  • Monitor and optimize Snowflake usage for performance and cost-effectiveness.

AWS, Python, SQL, AWS EKS, ETL, Java, Kubernetes, Snowflake, C++, Airflow, REST API, Collaboration, Terraform


📍 LATAM

🔍 Staff augmentation

🏢 Company: Nearsure

  • Bachelor's Degree in Computer Science, Engineering, or related field.
  • 5+ Years of experience working in Data Engineering.
  • 3+ Years of experience working with SQL.
  • 3+ Years of experience working with Python.
  • 2+ Years of experience working with Cloud.
  • 2+ Years of experience working with Microservices.
  • 2+ Years of experience working with Databases.
  • 2+ Years of experience working with CI/CD tools.
  • Advanced English Level is required.

  • Write code to test and validate business logic in SQL and code repositories.
  • Test back-end API interfaces for the financial platform.
  • Build developer productivity tools.
  • Provide risk assessments on identified defects.
  • Define quality engineering scope and maintain automated scripts.
  • Participate in Scrum meetings for status updates.
  • Identify and clarify information needs.
  • Organize and resolve current backlog issues.

Python, SQL, Scrum, Data engineering, CI/CD, Microservices

Posted 2024-11-26

📍 United States

🔍 Data and technology

  • 5+ years of experience making contributions in the form of code.
  • Experience with algorithms and data structures and knowing when to apply them.
  • Deep familiarity with Scala or Java.
  • Experience working with high-scale systems: realtime and batch.
  • Interest in data engineering to develop ingestion engines, ETL pipelines, and organizing data.
  • Experience in Machine Learning techniques and tools is a plus.

  • Be a senior member of the team by contributing to the architecture, design, and implementation of EMS systems.
  • Mentor junior engineers and promote their growth.
  • Lead technical projects, managing the planning, execution, and success of complex technical projects.
  • Collaborate with other engineering, product, and data science teams to ensure we're building the best products.
  • Be on call if required and accommodate Eastern Time Zone.

SQL, ETL, GCP, Algorithms, Data engineering, Data Structures, Spark, Collaboration

Posted 2024-11-26
🔥 Senior Data Engineer
Posted 2024-11-24

📍 United States, Latin America, India

🧭 Full-Time

🔍 Modern data stack and cloud data services

  • At least 4 years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Ability to develop end-to-end technical solutions in production environments.
  • Proficient in Java, Python, or Scala.
  • Experience with core cloud data platforms like Snowflake, AWS, Azure, Databricks, and GCP.
  • Strong SQL skills, capable of writing, debugging, and optimizing queries.
  • Client-facing communication skills for presentations and documentation.
  • Bachelor's degree in Computer Science or a related field.

  • Develop end-to-end technical solutions and ensure their performance, security, and scalability.
  • Help with robust data integration and contribute to the production deployment of solutions.
  • Create and deliver detailed presentations to clients.
  • Document solutions including POCs, roadmaps, diagrams, and logical system views.

AWS, Python, Software Development, SQL, Elasticsearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, NoSQL, Spark, Communication Skills, Documentation


📍 Spain, Colombia, Argentina, Chile, Mexico

🔍 HR Tech

🏢 Company: Jobgether

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
  • Minimum of 5 years of experience in data engineering, focusing on building and managing data pipelines and infrastructure.
  • 5 years of experience in Python programming.
  • Hands-on experience with big data frameworks and tools like Hadoop, Spark, or Kafka.
  • Proficiency with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases; experience with MongoDB.
  • Experience with cloud platforms, specifically AWS.
  • Strong understanding of data modeling, schema design, and data warehousing concepts.
  • Excellent analytical and troubleshooting abilities.
  • Fluency in English and Spanish.
  • Preferred: Experience deploying machine learning models, familiarity with CI/CD pipelines, knowledge of data privacy regulations.

  • Design, build, and maintain scalable data pipelines and ETL processes to collect, process, and store large volumes of structured and unstructured data.
  • Develop and optimize data scraping and extraction solutions to gather job data from multiple sources efficiently.
  • Collaborate with data scientists to implement and optimize AI-driven matching algorithms.
  • Ensure data integrity, accuracy, and reliability by implementing robust validation and monitoring mechanisms.
  • Analyze and optimize system performance, addressing bottlenecks and scaling challenges.
  • Work with cross-functional teams to deploy machine learning models into production environments.
  • Stay abreast of emerging technologies and best practices in data engineering and AI.
  • Partner with Product, Engineering, and Operations teams to understand data requirements.
  • Develop and maintain comprehensive documentation for data pipelines and systems architecture.

AWS, PostgreSQL, Python, ETL, Hadoop, Kafka, MongoDB, MySQL, Algorithms, Data engineering, NoSQL, Spark, Communication Skills, Collaboration, DevOps, Written communication, Documentation

Posted 2024-11-20
🔥 Senior Data Engineer
Posted 2024-11-15

📍 United States, Canada

🔍 Advanced analytics consulting

🏢 Company: Tiger Analytics | 👥 1001-5000 | Advertising, Consulting, Big Data, News, Machine Learning, Analytics

  • Bachelor’s degree in Computer Science or similar field.
  • 8+ years of experience in a Data Engineer role.
  • Experience with relational SQL and NoSQL databases like MySQL, Postgres.
  • Strong analytical skills and advanced SQL knowledge.
  • Development of ETL pipelines using Python & SQL.
  • Good experience with Customer Data Platforms (CDP).
  • Experience in SQL optimization and performance tuning.
  • Data modeling and building high-volume ETL pipelines.
  • Working experience with any cloud platform.
  • Experience with Google Tag Manager and Power BI is a plus.
  • Experience with object-oriented scripting languages: Python, Java, Scala, etc.
  • Experience extracting/querying/joining large data sets at scale.
  • Strong communication and organizational skills.

  • Designing, building, and maintaining scalable data pipelines on cloud infrastructure.
  • Working closely with cross-functional teams.
  • Supporting data analytics, machine learning, and business intelligence initiatives.

Python, SQL, Business Intelligence, ETL, Java, MySQL, Postgres, NoSQL, Analytical Skills, Organizational skills

🔥 Senior Data Engineer
Posted 2024-11-15

📍 United States

🔍 Innovation data and analytics

🏢 Company: Cypris | 👥 11-50 | 💰 $5.3m on 2024-06-28 | Artificial Intelligence (AI), Business Intelligence, Market Research, Software

  • 7+ years of proven experience as a Data Engineer or in a similar role.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Proficiency in programming languages like Python, Java, or Scala.
  • Experience with cloud platforms such as GCP, AWS, or Azure.
  • Hands-on experience with big data technologies such as Hadoop, Spark.
  • Knowledge of data warehousing concepts and experience with tools like Redshift, BigQuery, or Snowflake.
  • Familiarity with ETL tools and processes.

  • Design, develop, and optimize robust data pipelines to process and transform large datasets from various sources.
  • Implement and maintain ETL processes to ensure data accuracy and integrity.
  • Collaborate with cross-functional teams to understand data requirements and deliver effective data solutions.
  • Monitor and troubleshoot data pipeline performance and reliability, implementing improvements as needed.
  • Ensure data security and compliance with relevant regulations and standards.

Python, Elasticsearch, ETL, GCP, Hadoop, Java, MongoDB, Snowflake, Data engineering, Spark, Attention to detail, Compliance


📍 ANY STATE

🔍 Data and technology

  • 5+ years of experience making contributions in the form of code.
  • Experience with algorithms and data structures and knowing when to apply them.
  • Experience with machine learning techniques to develop better predictive and clustering models.
  • Experience working with high-scale systems.
  • Experience creating powerful machine learning tools for experimentation and productionization at scale.
  • Experience in data engineering and warehousing to develop ingestion engines, ETL pipelines, and organizing data for consumption.

  • Be a senior member of the team by contributing to the architecture, design, and implementation of EMS systems.
  • Mentor junior engineers and promote their growth.
  • Lead technical projects and manage planning, execution, and success of complex technical projects.
  • Collaborate with other engineering, product, and data science teams to ensure optimal product development.

Python, SQL, ETL, GCP, Kubeflow, Machine Learning, Algorithms, Data engineering, Data science, Data Structures, TensorFlow, Collaboration

Posted 2024-11-14
🔥 Senior Data Engineer
Posted 2024-11-11

📍 United States

🧭 Full-Time

🔍 Energy analytics and forecasting

  • Senior-level data engineering experience, with a primary focus on Python.
  • Experience with cloud-based infrastructure (Kubernetes/Docker) and data services (GCP, AWS, Azure, etc.).
  • Proven track record of building data pipelines that deliver business impact.
  • Experience working on complex, large codebases with a focus on refactoring and enhancements.
  • Experience building data monitoring pipelines focused on scalability.

  • Rebuild systems to identify more efficient ways to process data.
  • Automate the entire forecasting pipeline, including data collection, preprocessing, model training, and deployment.
  • Continuously monitor system performance and optimize data processing workflows to reduce latency and improve efficiency.
  • Set up real-time monitoring for data feeds to detect anomalies or issues promptly.
  • Utilize distributed computing and parallel processing to handle large-scale data.
  • Design data infrastructure that scales to accommodate future growth in data volume and sources.

AWS, Docker, Python, GCP, Kubernetes, Azure, Data engineering


📍 Argentina, Colombia, Costa Rica, Mexico

🧭 Full-Time

🔍 Data Analytics

  • Proficient with SQL and data visualization tools (e.g., Tableau, PowerBI, Google Data Studio).
  • Programming skills mainly in SQL.
  • Knowledge and experience with Python and/or R is a plus.
  • Experience with tools like Alteryx is a plus.
  • Experience working with Google Cloud and AWS is a plus.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Setup and/or maintain existing dataflows in tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems.
  • Design and execute data quality checks.
  • Maintain ongoing management of data governance, processing, and reporting.
  • Govern taxonomy additions, application, and use.
  • Serve as a knowledge expert for operational processes and identify improvement areas.
  • Evaluate opportunities for simplification and/or automation.

Python, SQL, GCP, Microsoft Power BI, Tableau

Posted 2024-11-09