Senior Data Engineer

Posted 2024-12-03

💎 Seniority level: Senior, 5+ years

📍 Location: Argentina

🔍 Industry: Nonprofit fundraising technology

🏢 Company: GoFundMe · 👥 251-500 · 💰 $ Series A on 2015-07-02 · 🫂 on 2022-10-26 · Internet · Crowdfunding · Peer to Peer

🗣️ Languages: English

⏳ Experience: 5+ years

🪄 Skills: AWS, Python, SQL, AWS EKS, ETL, Java, Kubernetes, Snowflake, C++, Airflow, REST API, Collaboration, Terraform

Requirements:
  • 5+ years as a data engineer designing, developing, and maintaining business data warehouse solutions consisting of structured and unstructured data.
  • Proficiency with building and orchestrating data pipelines using ETL/data preparation tools.
  • Expertise in orchestration tools like Airflow or Prefect.
  • Proficiency in connecting data through web APIs.
  • Proficiency in writing and optimizing SQL queries.
  • Solid knowledge of Python and other programming languages.
  • Experience with Snowflake is required.
  • Good understanding of database architecture and best practices.
Responsibilities:
  • Develop and maintain enterprise data warehouse (Snowflake).
  • Develop and orchestrate ELT data pipelines, sourcing data from databases and web APIs.
  • Integrate data from warehouse into third-party tools for actionable insights.
  • Develop and sustain REST API endpoints for data science products.
  • Provide ongoing maintenance and improvements to existing data solutions.
  • Monitor and optimize Snowflake usage for performance and cost-effectiveness.
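The responsibilities above center on ELT work: land raw data first, then transform it inside the warehouse with SQL. As a rough illustration only (not part of the posting), a minimal ELT step might look like the sketch below, using Python's standard-library sqlite3 as a stand-in for a warehouse connection such as Snowflake; all table and field names are hypothetical.

```python
import json
import sqlite3

# Hypothetical raw records, as they might arrive from a web API.
raw_donations = [
    {"id": 1, "campaign": "medical", "amount_usd": "25.00"},
    {"id": 2, "campaign": "medical", "amount_usd": "100.00"},
    {"id": 3, "campaign": "education", "amount_usd": "40.00"},
]

conn = sqlite3.connect(":memory:")  # stand-in for a warehouse connection

# Load: land the raw payloads untouched (ELT loads before transforming).
conn.execute("CREATE TABLE raw_donations (payload TEXT)")
conn.executemany(
    "INSERT INTO raw_donations (payload) VALUES (?)",
    [(json.dumps(r),) for r in raw_donations],
)

# Transform: derive a typed, aggregated table inside the warehouse via SQL.
conn.execute("""
    CREATE TABLE donations_by_campaign AS
    SELECT json_extract(payload, '$.campaign') AS campaign,
           SUM(CAST(json_extract(payload, '$.amount_usd') AS REAL)) AS total_usd
    FROM raw_donations
    GROUP BY campaign
""")

for row in conn.execute(
    "SELECT campaign, total_usd FROM donations_by_campaign ORDER BY campaign"
):
    print(row)
```

In a real pipeline the load and transform steps would typically run as separate, orchestrated tasks (e.g. Airflow operators) rather than one script.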
Apply

Related Jobs

πŸ“ United States, Latin America, India

πŸ” Cloud Data Technologies

🏒 Company: phData

  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Programming expertise in Java, Python and/or Scala.
  • Experience with core cloud data platforms: Snowflake, AWS, Azure, Databricks, and GCP.
  • SQL proficiency and the ability to write, debug, and optimize SQL queries.
  • Experience creating and delivering detailed presentations.

  • Develop end-to-end technical solutions and take them into production.
  • Ensure performance, security, scalability, and robust data integration.
  • Client-facing communication and presentation delivery.
  • Create detailed solution documentation.

AWS, Python, Software Development, SQL, Elasticsearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, NoSQL, Spark, Communication Skills, Documentation

Posted 2024-12-03
Apply
πŸ“ LATAM

πŸ” Staff augmentation

🏒 Company: Nearsure

  • Bachelor's Degree in Computer Science, Engineering, or related field.
  • 5+ Years of experience working in Data Engineering.
  • 3+ Years of experience working with SQL.
  • 3+ Years of experience working with Python.
  • 2+ Years of experience working with Cloud.
  • 2+ Years of experience working with Microservices.
  • 2+ Years of experience working with Databases.
  • 2+ Years of experience working with CI/CD tools.
  • Advanced English Level is required.

  • Write code to test and validate business logic in SQL and code repositories.
  • Test back-end API interfaces for the financial platform.
  • Build developer productivity tools.
  • Provide risk assessments on identified defects.
  • Define quality engineering scope and maintain automated scripts.
  • Participate in Scrum meetings for status updates.
  • Identify and clarify information needs.
  • Organize and resolve current backlog issues.

Python, SQL, Scrum, Data engineering, CI/CD, Microservices

Posted 2024-11-26
Apply
πŸ“ United States, Latin America, India

🧭 Full-Time

πŸ” Modern data stack and cloud data services

  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Ability to develop end-to-end technical solutions in production environments.
  • Proficient in Java, Python, or Scala.
  • Experience with core cloud data platforms like Snowflake, AWS, Azure, Databricks, and GCP.
  • Strong SQL skills, capable of writing, debugging, and optimizing queries.
  • Client-facing communication skills for presentations and documentation.
  • Bachelor's degree in Computer Science or a related field.

  • Develop end-to-end technical solutions and ensure their performance, security, and scalability.
  • Help with robust data integration and contribute to the production deployment of solutions.
  • Create and deliver detailed presentations to clients.
  • Document solutions including POCs, roadmaps, diagrams, and logical system views.

AWS, Python, Software Development, SQL, Elasticsearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, NoSQL, Spark, Communication Skills, Documentation

Posted 2024-11-24
Apply
πŸ“ Spain, Colombia, Argentina, Chile, Mexico

πŸ” HR Tech

🏒 Company: Jobgether

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
  • Minimum of 5 years of experience in data engineering, focusing on building and managing data pipelines and infrastructure.
  • 5 years of experience in Python programming.
  • Hands-on experience with big data frameworks and tools like Hadoop, Spark, or Kafka.
  • Proficiency with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB).
  • Experience with cloud platforms, specifically AWS.
  • Strong understanding of data modeling, schema design, and data warehousing concepts.
  • Excellent analytical and troubleshooting abilities.
  • Fluency in English and Spanish.
  • Preferred: Experience deploying machine learning models, familiarity with CI/CD pipelines, knowledge of data privacy regulations.

  • Design, build, and maintain scalable data pipelines and ETL processes to collect, process, and store large volumes of structured and unstructured data.
  • Develop and optimize data scraping and extraction solutions to gather job data from multiple sources efficiently.
  • Collaborate with data scientists to implement and optimize AI-driven matching algorithms.
  • Ensure data integrity, accuracy, and reliability by implementing robust validation and monitoring mechanisms.
  • Analyze and optimize system performance, addressing bottlenecks and scaling challenges.
  • Work with cross-functional teams to deploy machine learning models into production environments.
  • Stay abreast of emerging technologies and best practices in data engineering and AI.
  • Partner with Product, Engineering, and Operations teams to understand data requirements.
  • Develop and maintain comprehensive documentation for data pipelines and systems architecture.

AWS, PostgreSQL, Python, ETL, Hadoop, Kafka, MongoDB, MySQL, Algorithms, Data engineering, NoSQL, Spark, Communication Skills, Collaboration, DevOps, Written communication, Documentation

Posted 2024-11-20
Apply
πŸ“ Argentina, Colombia, Costa Rica, Mexico

🧭 Full-Time

πŸ” Data Analytics

  • Proficient with SQL and data visualization tools (e.g., Tableau, PowerBI, Google Data Studio).
  • Programming skills mainly in SQL.
  • Knowledge and experience with Python and/or R is a plus.
  • Experience with tools like Alteryx is a plus.
  • Experience working with Google Cloud and AWS is a plus.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems.
  • Design and execute data quality checks.
  • Maintain ongoing management of data governance, processing, and reporting.
  • Govern taxonomy additions, application, and use.
  • Serve as a knowledge expert for operational processes and identify improvement areas.
  • Evaluate opportunities for simplification and/or automation.

Python, SQL, GCP, Microsoft Power BI, Tableau

Posted 2024-11-09
Apply
πŸ“ Argentina, Colombia, Costa Rica, Mexico

πŸ” Data Analytics

  • Proficient with SQL and data visualization tools (e.g., Tableau, PowerBI, Google Data Studio).
  • Programming skills, mainly in SQL.
  • Knowledge and experience with Python and/or R is a plus.
  • Experience with Alteryx is a plus.
  • Experience working with Google Cloud and AWS is a plus.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in data wrangling tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems.
  • Design and execute data quality checks.
  • Maintain ongoing management and stewardship of data governance, processing, and reporting.
  • Govern taxonomy additions, application, and use.
  • Serve as a knowledge expert for operational processes and identify improvement areas.
  • Evaluate opportunities for simplification and/or automation for reporting and processes.

AWS, Python, SQL, Data Analysis, GCP, Microsoft Power BI, Tableau, Data engineering

Posted 2024-11-09
Apply
πŸ“ Mexico, Gibraltar, Colombia, USA, Brazil, Argentina

🧭 Full-Time

πŸ” FinTech

🏒 Company: Bitso

  • Proven English fluency.
  • 3+ years of professional experience with analytics, ETLs, and data systems.
  • 3+ years with SQL databases, data lakes, big data, and cloud infrastructure.
  • 3+ years of experience with Spark.
  • BS or Master's in Computer Science or similar.
  • Strong proficiency in SQL, Python, and AWS.
  • Strong data modeling skills.

  • Build processes required for optimal extraction, transformation, and loading of data from various sources using SQL, Python, Spark.
  • Identify, design, and implement internal process improvements while optimizing data delivery and redesigning infrastructure for scalability.
  • Ensure data integrity, quality, and security.
  • Work with stakeholders to assist with data-related technical issues and support their data needs.
  • Manage data separation and security across multiple data sources.

AWS, Python, SQL, Business Intelligence, Machine Learning, Data engineering, Data Structures, Spark, Communication Skills

Posted 2024-11-07
Apply
πŸ“ LATAM

πŸ” Cannabis

🏒 Company: Truelogic Software

  • 5+ years of proven expertise and success in a data-intensive role.
  • Significant hands-on experience with AWS.
  • Experience executing ETL processes and managing cost optimization in cloud infrastructure.
  • Background in data and user governance.
  • Ability to lead junior engineers and actively contribute to the engineering lifecycle.
  • Outstanding problem-solving skills and effective communication abilities.

  • Design and implement scalable, efficient data solutions tailored to evolving needs.
  • Develop and sustain distributed data systems, ensuring performance, security, and integrity.
  • Collaborate with the architecture guild on cloud warehousing needs, focusing on AWS services.
  • Implement and optimize ETL processes for seamless data integration.
  • Utilize and optimize Snowflake for data warehousing and analytics.
  • Collaborate with diverse teams to understand data requirements and deliver solutions.
  • Troubleshoot and resolve data-related issues promptly.
  • Provide guidance and foster growth opportunities for junior engineering staff.

AWS, Python, ETL, Snowflake

Posted 2024-11-07
Apply
πŸ“ LATAM

πŸ” Digital Marketing

🏒 Company: Truelogic Software

  • 5+ years of proven experience in Data Engineering.
  • Deep understanding of data engineering principles and modern data platforms including Snowflake, BigQuery, and AWS.
  • Excellent problem-solving skills and attention to detail.
  • Strong written and verbal communication skills.

  • Lead the design and implementation of scalable data pipelines using Snowflake, AWS, and BigQuery.
  • Create and manage automated workflows for data processing.
  • Collaborate with cross-functional teams to enhance audience insights.
  • Monitor and optimize data pipeline performance.
  • Work closely with product managers and data scientists to deliver robust solutions.
  • Maintain thorough documentation of data architecture and best practices.

AWS, Snowflake, Data engineering, Communication Skills, Collaboration, Attention to detail, Documentation

Posted 2024-11-07
Apply
πŸ“ LATAM

πŸ” Consulting Services

🏒 Company: Truelogic Software

  • 3+ years of experience as a Data Engineer or in a similar role.
  • Proven experience with ETL/ELT processes and data pipeline development.
  • Hands-on experience with data integration tools like Airbyte, Fivetran, or similar.
  • Proficiency in SQL and experience with relational databases.
  • Experience working with cloud data warehouses, particularly Snowflake.
  • Familiarity with workflow orchestration tools such as Apache Airflow.
  • Programming skills in Python or another relevant language.
  • Knowledge of data modeling, data warehousing concepts, and data architecture.

  • Analyze inbound data sources to understand structure, quality, and formats.
  • Categorize data based on relevance, sensitivity, and usage requirements.
  • Establish connectivity to various data sources, including APIs and databases.
  • Develop and configure data ingestion pipelines using tools like Airbyte or Fivetran.
  • Implement data governance policies, including data ownership and access controls.
  • Work closely with cross-functional teams and maintain documentation.
  • Monitor and optimize data pipeline performance.

Python, SQL, Apache Airflow, Data Analysis, ETL, Snowflake, Collaboration

Posted 2024-10-25
Apply