
Senior Data Engineer

Posted 2024-10-23


💎 Seniority level: Middle, 1-3 years

📍 Location: Guatemala

🔍 Industry: Fintech

🏢 Company: FundThrough

🗣️ Languages: English

⏳ Experience: 1-3 years

🪄 Skills: AWS, Leadership, Python, SQL, Airflow, Data engineering

Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, or relevant technical field.
  • 1-3 years of Python development experience.
  • 1-3 years of SQL/NoSQL experience.
  • 1-3 years of experience with data workflow management engines like Airflow, Luigi, or similar.
  • Experience in analyzing data, communicating insights, and addressing gaps.
  • Experience with cloud or on-prem Big Data analytics platforms such as AWS Redshift or Google BigQuery.
Responsibilities:
  • Design, build and launch efficient and reliable data pipelines across various platforms.
  • Build scalable data architecture foundations.
  • Collaborate with leadership and technical teams to understand and meet data needs.
  • Conduct audits of data pipelines to identify inefficiencies and enhance data flow.
  • Communicate findings and insights via presentations and dashboards.
  • Identify gaps in existing processes and address them.
  • Utilize data principles to solve infrastructure problems.
  • Ensure data quality and develop expertise in assigned areas.

Related Jobs


📍 LATAM

🔍 Cannabis

🏢 Company: Truelogic Software

Requirements:
  • 5+ years of proven expertise and success in a data-intensive role.
  • Significant hands-on experience with AWS.
  • Experience executing ETL processes and managing cost optimization in cloud infrastructure.
  • Background in data and user governance.
  • Ability to lead junior engineers and actively contribute to the engineering lifecycle.
  • Outstanding problem-solving skills and effective communication abilities.

Responsibilities:
  • Design and implement scalable, efficient data solutions tailored to evolving needs.
  • Develop and sustain distributed data systems, ensuring performance, security, and integrity.
  • Collaborate with the architecture guild on cloud warehousing needs, focusing on AWS services.
  • Implement and optimize ETL processes for seamless data integration.
  • Utilize and optimize Snowflake for data warehousing and analytics.
  • Collaborate with diverse teams to understand data requirements and deliver solutions.
  • Troubleshoot and resolve data-related issues promptly.
  • Provide guidance and foster growth opportunities for junior engineering staff.

🪄 Skills: AWS, Python, ETL, Snowflake

Posted 2024-11-07

📍 LATAM

🔍 Digital Marketing

🏢 Company: Truelogic Software

Requirements:
  • 5+ years of proven experience in Data Engineering.
  • Deep understanding of data engineering principles and modern data platforms including Snowflake, BigQuery, and AWS.
  • Excellent problem-solving skills and attention to detail.
  • Strong written and verbal communication skills.

Responsibilities:
  • Lead the design and implementation of scalable data pipelines using Snowflake, AWS, and BigQuery.
  • Create and manage automated workflows for data processing.
  • Collaborate with cross-functional teams to enhance audience insights.
  • Monitor and optimize data pipeline performance.
  • Work closely with product managers and data scientists to deliver robust solutions.
  • Maintain thorough documentation of data architecture and best practices.

🪄 Skills: AWS, Snowflake, Data engineering, Communication Skills, Collaboration, Attention to detail, Documentation

Posted 2024-11-07

📍 LATAM

🔍 Consulting Services

🏢 Company: Truelogic Software

Requirements:
  • 3+ years of experience as a Data Engineer or in a similar role.
  • Proven experience with ETL/ELT processes and data pipeline development.
  • Hands-on experience with data integration tools like Airbyte, Fivetran, or similar.
  • Proficiency in SQL and experience with relational databases.
  • Experience working with cloud data warehouses, particularly Snowflake.
  • Familiarity with workflow orchestration tools such as Apache Airflow.
  • Programming skills in Python or another relevant language.
  • Knowledge of data modeling, data warehousing concepts, and data architecture.

Responsibilities:
  • Analyze inbound data sources to understand structure, quality, and formats.
  • Categorize data based on relevance, sensitivity, and usage requirements.
  • Establish connectivity to various data sources, including APIs and databases.
  • Develop and configure data ingestion pipelines using tools like Airbyte or Fivetran.
  • Implement data governance policies, including data ownership and access controls.
  • Work closely with cross-functional teams and maintain documentation.
  • Monitor and optimize data pipeline performance.

🪄 Skills: Python, SQL, Apache Airflow, Data Analysis, ETL, Snowflake, Collaboration

Posted 2024-10-25

📍 LATAM

🔍 Financial Services

🏢 Company: Truelogic Software

Requirements:
  • 5-7 years of experience in data engineering.
  • 3-5 years of experience in software development.
  • Deep experience with Snowflake and Stored Procedures.
  • Extensive experience working with Python and modern web frameworks.
  • Experience with queueing systems such as Celery, SQS, Pub/Sub.
  • Strong expertise in Python 3, Object-Oriented Code, and Design Patterns.
  • Experience with REST APIs, Git, and writing unit tests.
  • Knowledge of databases (SQL, NoSQL), AWS, serverless environments, and Infrastructure as Code (CloudFormation & CDK).
  • Familiarity with DevOps practices (CI/CD, Automated Pipelines) and Agile methodologies.

Responsibilities:
  • You will work in a growth-oriented environment at a financial services firm focused on the U.S. mortgage market.
  • Be responsible for providing strategic guidance and solution patterns for sub products.
  • Collaborate with business analysts and stakeholders to optimize requirements.
  • Work closely with developers to review and validate key functionality and ensure successful integration with existing systems.

🪄 Skills: AWS, Python, Software Development, SQL, Agile, Design Patterns, Django, Flask, Git, SCRUM, Snowflake, TypeScript, Jira, Data engineering, Serverless, NoSQL, CI/CD, Microservices

Posted 2024-10-16

📍 LATAM

🔍 Mortgage Lending

🏢 Company: Truelogic Software

Requirements:
  • 5+ years of hands-on experience in backend engineering or data integration.
  • At least 2 years of experience working with an MDM platform.
  • Experience integrating with customer preference platforms like OneTrust or PossibleNow.
  • Proficiency in Python or another relevant programming language.
  • Expertise in AWS serverless technologies including Glue, Lambda, Kinesis, SQS.
  • Strong SQL skills and experience with Snowflake or similar data warehouses.
  • Experience with data modeling, cleansing, and quality management.
  • Understanding of REST APIs and web services.
  • Attention to detail and commitment to data accuracy.

Responsibilities:
  • Become a subject matter expert on Reltio.
  • Design and maintain integrations between Reltio and other systems using AWS services.
  • Implement data quality rules and cleansing routines.
  • Collaborate with data stewards to enforce governance policies.
  • Monitor and optimize performance of integrations.
  • Work with stakeholders to gather requirements and troubleshoot issues.
  • Stay updated on trends in MDM and data integration.

🪄 Skills: AWS, Node.js, Python, SQL, JavaScript, Snowflake, Algorithms, Serverless, Collaboration

Posted 2024-10-16