Data Engineer

Posted 2024-10-21

💎 Seniority level: Middle, 3-5 years

📍 Location: Brazil, Argentina, Mexico, Trinidad & Tobago, Costa Rica

🔍 Industry: Quality management and tech services

🏢 Company: Testlio

🗣️ Languages: English

⏳ Experience: 3-5 years

🪄 Skills: AWS, Python, SQL, ETL, Tableau, Amazon Web Services, Communication Skills

Requirements:
  • 3-5 years' experience in data ingestion and ETL pipelining using AWS tools.
  • Proficiency in AWS Glue, Amazon QuickSight, Power BI, Tableau, and similar tools.
  • Experience in data visualization and dashboarding.
  • Strong SQL and Python scripting skills; familiarity with JavaScript is beneficial.
  • Expertise in data modeling, analysis, and warehousing.
  • Experience in data quality assurance and troubleshooting.
  • Knowledge of data architecture and automated data integration.
Responsibilities:
  • Develop detailed data models and dictionaries for data warehouses.
  • Create and maintain data pipelines to centralize data from various sources (a minimal sketch follows this list).
  • Automate data integration processes for efficiency.
  • Implement a data mart system for reporting and actionable insights.
  • Analyze and troubleshoot data quality issues.
  • Work closely with data scientists, analysts, and stakeholders to understand data needs.
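
To make the pipeline item above concrete, here is a minimal sketch of one extract-transform-load step of the kind these responsibilities describe. The file path, connection string, and table names are hypothetical placeholders, not details from the posting:

```python
# Minimal ETL sketch: extract a CSV export, normalize it, and append it to
# a centralized warehouse table. All names here are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

def run_pipeline() -> None:
    # Extract: pull raw records from a source system export.
    raw = pd.read_csv("exports/orders.csv")

    # Transform: normalize column names and drop rows missing key fields.
    raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
    clean = raw.dropna(subset=["order_id", "created_at"])

    # Load: append into a staging table in the central warehouse.
    engine = create_engine("postgresql://user:pass@warehouse.example/analytics")
    clean.to_sql("orders_staging", engine, if_exists="append", index=False)

if __name__ == "__main__":
    run_pipeline()
```
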
Related Jobs

🔥 Data Engineer
๐Ÿ“ Argentina, Spain, England, United Kingdom, Lisbon, Portugal

๐Ÿงญ Full-Time

๐Ÿ” Web3

๐Ÿข Company: Reown

  • 5+ years working in the analytics stack within a fast-paced environment.
  • 3+ years of production experience with SQL templating engines like dbt.
  • Experience with distributed query engines (BigQuery, Athena, Spark), data warehouses, and BI tools.
  • Strong understanding of software engineering principles, coding standards, design patterns, version control (e.g., Git), testing methodologies, and CI/CD processes.
  • Experience with AWS/GCP/Azure services for deployment and management.
  • Familiarity with GitHub, CI/CD pipelines, GitHub Actions, and Terraform.
  • Ability to write Python scripts for ETL processes and data manipulation.
  • Proficient with libraries like pandas for analysis and transformation.
  • Experience handling various data formats (e.g., CSV, JSON, Parquet); a short sketch follows this list.
  • Strong problem-solving skills and communication abilities to discuss technical concepts.
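
As a point of reference for the data-formats item above, a minimal pandas sketch that reads the three formats the posting names into a single frame; the file names are hypothetical:

```python
# Read the same logical dataset from CSV, newline-delimited JSON, and
# Parquet, then stack the results. File names are hypothetical.
import pandas as pd

frames = [
    pd.read_csv("events.csv"),
    pd.read_json("events.json", lines=True),  # newline-delimited JSON
    pd.read_parquet("events.parquet"),        # requires pyarrow or fastparquet
]
events = pd.concat(frames, ignore_index=True)
print(events.shape)
```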

  • Write complex SQL queries that extract and combine data from on-chain and off-chain logs for analytics.
  • Create dashboards and tools for team data discoverability and key-result (KR) tracking.
  • Perform deep-dive analyses into specific topics for internal stakeholders.
  • Help design, implement, and evolve Reown's on-chain data infrastructure.
  • Build, maintain, and monitor end-to-end data pipelines for new datasets and features.
  • Write health-checks and alerts to ensure data correctness, consistency, and freshness (see the sketch after this list).
  • Meet with product managers and stakeholders to understand data needs and detect new product opportunities.
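
A minimal sketch of the kind of freshness check the health-check item describes, assuming a hypothetical onchain_events table with an ingested_at timestamp column; the print call stands in for a real alerting hook:

```python
# Alert when the newest row in a table is older than a threshold.
# Table, column, and threshold are hypothetical.
from datetime import datetime, timedelta, timezone

import sqlalchemy

MAX_LAG = timedelta(hours=2)

def check_freshness(engine: sqlalchemy.engine.Engine) -> None:
    with engine.connect() as conn:
        newest = conn.execute(
            sqlalchemy.text("SELECT MAX(ingested_at) FROM onchain_events")
        ).scalar_one()
    lag = datetime.now(timezone.utc) - newest  # assumes timezone-aware timestamps
    if lag > MAX_LAG:
        print(f"ALERT: onchain_events is stale by {lag}")  # stand-in for paging
```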

AWS, Python, SQL, Data Analysis, Design Patterns, ETL, GCP, Git, Tableau, Azure, Clickhouse, Pandas, Spark, Communication Skills, CI/CD, Terraform, Written communication

Posted 2024-11-20

๐Ÿ“ Latin America

๐Ÿงญ Contract

๐Ÿ’ธ 1300 - 2300 USD per month

๐Ÿ” Talent-as-a-Service

๐Ÿข Company: GoFasti

  • Bachelor's degree in Computer Science, Engineering, or related field, or equivalent work experience.
  • 4+ years working as a Data Engineer or similar role focusing on data infrastructure.
  • Experience with AWS (Redshift, RDS) and Google Cloud (BigQuery, Firebase).
  • Proficiency in at least one programming language (Python, Java, Scala).
  • Hands-on experience with AWS Kinesis for real-time data ingestion (a minimal consumer is sketched after this list).
  • Familiarity with data pipeline orchestration tools such as dbt and Airflow.
  • Strong knowledge of data warehousing concepts and ETL best practices.
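
For the Kinesis item above, a minimal boto3 consumer sketch; the stream name, region, and the process handler are hypothetical placeholders:

```python
# Poll a single Kinesis shard and hand each record to a downstream handler.
import time

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")  # hypothetical region
stream = "events-stream"                                    # hypothetical stream

shard_id = kinesis.describe_stream(StreamName=stream)[
    "StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=stream, ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

while iterator:
    resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in resp["Records"]:
        process(record["Data"])  # hypothetical downstream handler
    iterator = resp.get("NextShardIterator")
    time.sleep(1)  # stay under the per-shard read limits
```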

  • Enhance and maintain the infrastructure powering analytics and data products.
  • Ingest new data from files, APIs, and external databases into a centralized data store.
  • Continuously monitor, optimize, and troubleshoot data pipelines and infrastructure.
  • Work closely with business stakeholders to understand data needs and translate requirements into solutions.
  • Implement and enforce data governance policies for security, integrity, and compliance.
  • Drive efforts to ensure high data quality by identifying and resolving pipeline issues.
  • Participate in architecture discussions and engage in peer code reviews.

AWS, Python, ETL, Airflow, Firebase, Attention to detail, Compliance

Posted 2024-11-19

๐Ÿ“ Latin America

๐Ÿงญ Contract

๐Ÿ’ธ 800 - 1200 USD per month

๐Ÿ” Talent-as-a-Service

๐Ÿข Company: GoFasti

  • Bachelor's degree in Computer Science, Engineering, or related field, or equivalent work experience.
  • 1+ years working as a Data Engineer or in a similar technical role.
  • Experience with AWS (Redshift, RDS) and Google Cloud (BigQuery, Firebase).
  • Proficiency in at least one programming language (Python, Java, Scala).
  • Hands-on experience with real-time data streaming solutions (AWS Kinesis).
  • Familiarity with data pipeline orchestration tools such as dbt and Airflow.
  • Strong knowledge of data warehousing concepts and ETL best practices.

  • Enhance and maintain the infrastructure powering analytics and data products.
  • Ingest new data from files, APIs, and external databases into a centralized data store.
  • Continuously monitor, optimize, and troubleshoot data pipelines.
  • Work closely with business stakeholders to understand data needs.
  • Implement and enforce data governance policies.
  • Contribute to ensuring high data quality across the organization.
  • Participate in technical architecture discussions.

AWS, Python, ETL, Airflow, Firebase, Attention to detail, Compliance

Posted 2024-11-19

๐Ÿ“ Latin America and other parts of the world

๐Ÿ” Insurance

  • Experience in Data Engineering.
  • Proficient in Python or Scala.
  • Excellent communication skills.
  • Attention to detail and strong problem-solving abilities.
  • Ability to work in an Agile environment.

  • Responsible for development and maintenance of systems in enterprise data and analytics environments.
  • Optimize performance by building and supporting decision-making tools.
  • Collaborate closely with software engineering, AI/ML, Cybersecurity, and DevOps/SysOps teams.
  • Design data pipelines and databases, build infrastructure and alerting frameworks.
  • Process data from a range of integration points, from SFTP file transfers to APIs (see the sketch after this list).
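
A minimal sketch of that SFTP-to-API flow; the host, credentials, file path, and endpoint are hypothetical placeholders:

```python
# Fetch a CSV over SFTP and forward each row to an HTTP API.
import csv
import io

import paramiko
import requests

transport = paramiko.Transport(("sftp.example.com", 22))  # hypothetical host
transport.connect(username="etl", password="secret")      # hypothetical creds
sftp = paramiko.SFTPClient.from_transport(transport)

with sftp.open("/outbound/policies.csv") as fh:           # hypothetical path
    rows = list(csv.DictReader(io.StringIO(fh.read().decode("utf-8"))))

for row in rows:
    requests.post("https://api.example.com/policies", json=row, timeout=30)

sftp.close()
transport.close()
```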

Python, Software Development, Agile, Data engineering, DevOps, Attention to detail

Posted 2024-11-15

๐Ÿ“ Latin America, United States, Canada

๐Ÿ” Life insurance

  • The ideal candidate will be independent and a great communicator.
  • Attention to detail is critical.
  • Must possess problem-solving skills.

  • Develop and maintain enterprise data and analytics systems for a US client.
  • Optimize performance by building and supporting decision-making tools.
  • Collaborate closely with software engineering, AI/ML, Cybersecurity, and DevOps/SysOps teams.
  • Support end-to-end data pipelines using Python or Scala.
  • Participate in Agile framework-related tasks.

Python, Software Development, Agile, Data engineering, DevOps, Attention to detail

Posted 2024-11-15

๐Ÿ“ North America, South America, Europe

๐Ÿ’ธ 100000 - 500000 USD per year

๐Ÿ” Web3, blockchain

๐Ÿข Company: Edge & Node

  • A self-motivated team member with keen attention to detail.
  • Proactive collaboration with team members and a willingness to adapt to a growing environment.
  • Familiarity and experience with Rust, particularly focusing on data transformation and ingestion.
  • A strong understanding of blockchain data structures and ingestion interfaces.
  • Experience in real-time data handling, including knowledge of reorg handling.
  • Familiarity with blockchain clients like Geth and Reth is a plus.
  • Adaptability to a dynamic and fully-remote work environment.
  • Rigorous approach to software development that reflects a commitment to excellence.

  • Develop and maintain data ingestion adapters for various blockchain networks and web3 protocols.
  • Implement data ingestion strategies for both historical and recent data.
  • Apply strategies for handling block reorgs (a minimal sketch follows this list).
  • Optimize the latency of block ingestion at the chain head.
  • Write interfaces with file storage protocols such as IPFS and Arweave.
  • Collaborate with upstream data sources, such as chain clients and tracing frameworks, and monitor the latest upstream developments.
  • Perform data quality checks, cross-checking data across multiple sources and investigating any discrepancies that arise.
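
To illustrate the reorg handling named above (in Python rather than the posting's Rust), a minimal sketch: when a new block's parent hash does not match the block stored just below it, walk back along the new canonical chain and replace stored blocks until the chains reconnect. The store and client interfaces are hypothetical placeholders:

```python
# Minimal block-reorg handling sketch. `store` maps height -> block dict
# and `client.get_block_by_hash` fetches a canonical block; both are
# hypothetical interfaces, not a real library API.
def ingest_block(client, store, block) -> None:
    prev = store.get(block["number"] - 1)
    if prev is None or prev["hash"] == block["parentHash"]:
        # Fast path: the new block extends the chain we already hold.
        store.put(block["number"], block)
        return

    # Reorg: replace stale stored blocks until we hit the common ancestor.
    cursor = block
    while True:
        height = cursor["number"] - 1
        stored = store.get(height)
        if stored is None or stored["hash"] == cursor["parentHash"]:
            break  # reconnected with the stored chain (or ran out of history)
        cursor = client.get_block_by_hash(cursor["parentHash"])
        store.put(height, cursor)
    store.put(block["number"], block)
```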

Software Development, Blockchain, Data Structures, Rust, Collaboration, Attention to detail

Posted 2024-11-15

๐Ÿ“ Argentina, Colombia, Costa Rica, Mexico

๐Ÿ” Data Analytics

  • Proficient with SQL and data visualization tools (Tableau, PowerBI, Google Data Studio).
  • Programming skills mainly in SQL.
  • Knowledge and experience with Python and/or R is a plus.
  • Experience with tools like Alteryx is a plus.
  • Experience working with Google Cloud and AWS is a plus.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in data wrangling tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems.
  • Design and execute data quality checks (see the sketch after this list).
  • Maintain ongoing management and stewardship of data governance, processing, and reporting.
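
A minimal sketch of the data-quality checks mentioned above, written as count-of-offending-rows SQL assertions; the spend table and its columns are hypothetical placeholders:

```python
# Run each check and report any that return offending rows.
import sqlalchemy

CHECKS = {
    "no_null_campaign_ids": "SELECT COUNT(*) FROM spend WHERE campaign_id IS NULL",
    "no_duplicate_days": "SELECT COUNT(*) - COUNT(DISTINCT spend_date) FROM spend",
    "no_negative_spend": "SELECT COUNT(*) FROM spend WHERE amount < 0",
}

def run_checks(engine: sqlalchemy.engine.Engine) -> None:
    with engine.connect() as conn:
        for name, sql in CHECKS.items():
            offending = conn.execute(sqlalchemy.text(sql)).scalar_one()
            status = "OK" if offending == 0 else f"FAILED ({offending} rows)"
            print(f"{name}: {status}")
```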

Python, SQL, GCP, Microsoft Power BI, Tableau

Posted 2024-11-09

๐Ÿ“ Argentina, Colombia, Costa Rica, Mexico

๐Ÿ” Data Analytics

  • Proficient with SQL and data visualization tools such as Tableau, PowerBI, Google Data Studio.
  • Strong programming skills primarily in SQL.
  • Knowledge and experience with Python and/or R is a plus.
  • Experience with tools like Alteryx is a plus.
  • Experience working with Google Cloud and AWS is a plus.
  • Familiarity with GitLab.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems.
  • Design and execute data quality checks.
  • Keep up to date on digital media operations.
  • Maintain ongoing management and stewardship of data governance.
  • Govern taxonomy additions and applications.
  • Serve as a knowledge expert for operational processes.
  • Evaluate opportunities for simplification and automation.

Python, SQL, GCP, Kubernetes, Microsoft Power BI, Tableau

Posted 2024-11-09

๐Ÿ“ LATAM

๐Ÿข Company: Nearsure

  • Bachelor's Degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience working with Python (data manipulation, scripting, automation).
  • 5+ years of experience working with datasets and data collection.
  • 3+ years of experience working with ETL tools.
  • 3+ years of experience working with cloud platforms (AWS, GCP, or AEP).
  • 2+ years of experience working with Artificial Intelligence.
  • 2+ years of experience working with Airflow and Snowflake.
  • Advanced English level is required.

  • Collaborate with cross-functional teams to understand data requirements and objectives.
  • Develop and maintain efficient data pipelines for data collection, processing, and cleaning (a minimal DAG sketch follows this list).
  • Utilize expertise in Python and AI to enhance data processing and analysis capabilities.
  • Work with platforms like AEP, Google Cloud, and AWS to manage and optimize data infrastructure.
  • Explore and implement best practices for data quality, integrity, and security.
  • Collaborate with stakeholders to address data-related challenges.
  • Stay current with industry trends and emerging technologies in data engineering.
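
To ground the Airflow and Snowflake items above, a minimal Airflow 2.x DAG sketch with one extract task feeding one load task; the DAG id, schedule, and task bodies are hypothetical placeholders:

```python
# A two-task daily pipeline: extract, then load into the warehouse.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    ...  # pull from the source system (hypothetical)

def load() -> None:
    ...  # e.g., COPY INTO via the Snowflake connector (hypothetical)

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```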

AWS, Python, Artificial Intelligence, ETL, GCP, Snowflake, Airflow, Data engineering

Posted 2024-11-09

๐Ÿ“ Argentina, Colombia, Costa Rica, Mexico

๐Ÿ” Data Analytics

  • Proficient with SQL and data visualization tools (e.g., Tableau, PowerBI, Google Data Studio).
  • Programming skills, mainly in SQL.
  • Knowledge and experience with Python and/or R is a plus.
  • Experience with tools like Alteryx is a plus.
  • Experience working with Google Cloud and AWS is a plus.

  • Analyze data and consult with subject matter experts to design and develop business rules for data processing.
  • Set up and/or maintain existing dataflows in data wrangling tools like Alteryx or Google Dataprep.
  • Create and/or maintain SQL scripts.
  • Monitor, troubleshoot, and remediate data quality across marketing data systems, maintaining a full understanding of client deliverables.
  • Design and execute data quality checks.
  • Maintain ongoing management and stewardship of data governance, processing, and reporting.
  • Govern taxonomy additions, application, and use.
  • Serve as a knowledge expert for operational processes and identify areas of improvement so that turnaround times and data quality standards are met.
  • Evaluate opportunities to simplify and/or automate reporting and other processes.

AWS, Python, SQL, GCP, Microsoft Power BI, Tableau, Amazon Web Services, Data engineering

Posted 2024-11-09