
Senior Data Engineer

Posted 11 days ago

💎 Seniority level: Senior, 5+ years in data engineering

📍 Location: Brazil

🔍 Industry: Generative AI technology for government affairs

⏳ Experience: 5+ years in data engineering

🪄 Skills: AWS, Python, SQL, Apache Airflow, ETL, Hadoop, Kafka, Machine Learning, PyTorch, NoSQL, Spark, TensorFlow

Requirements:
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering with a proven track record.
  • Expertise in building data pipelines and architectures using AWS services.
  • Proficiency in big data tools like Hadoop, Spark, Kafka, and machine learning frameworks like TensorFlow and PyTorch.
  • 3+ years of experience with Python.
  • Deep knowledge of SQL and NoSQL databases, and workflow management tools.
  • Familiarity with vector databases and RAG systems is advantageous.
Responsibilities:
  • Architect and implement highly scalable Retrieval-Augmented Generation (RAG) data pipelines (see the sketch after this list).
  • Design robust data pipelines for real-time data processing and analysis.
  • Create data cleansing and transformation pipelines to support the AI product.
  • Lead cloud-based deployments in AWS for performance, security, and cost-efficiency.
  • Innovate on data architecture to support dynamic product needs.
  • Conduct tool selection and solution trade-off analysis.
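A minimal sketch of what such a RAG pipeline involves, in Python. The listing only names vector databases and RAG systems; the chunk size, the sentence-transformers model, and the in-memory index below are illustrative assumptions, not details from the posting.

```python
# Hypothetical RAG ingestion and retrieval sketch; model name, chunk size,
# and the in-memory index are assumptions, not details from the listing.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

def chunk(text: str, size: int = 500) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def build_index(docs: list[str]) -> tuple[list[str], np.ndarray]:
    """Embed every chunk; a production pipeline would persist these in a vector DB."""
    chunks = [c for d in docs for c in chunk(d)]
    return chunks, model.encode(chunks, normalize_embeddings=True)

def retrieve(query: str, chunks: list[str], vecs: np.ndarray, k: int = 3) -> list[str]:
    """Cosine similarity over normalized vectors reduces to a dot product."""
    q = model.encode([query], normalize_embeddings=True)[0]
    top = np.argsort(vecs @ q)[::-1][:k]
    return [chunks[i] for i in top]
```

In production the in-memory arrays would be replaced by the vector database the posting alludes to, with the same embed-then-nearest-neighbor contract.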

Related Jobs

🔥 Senior Data Engineer
Posted about 1 month ago

📍 Brazil, Argentina

🧭 Full-Time

🔍 Manufacturing services

🏢 Company: Xometry | 👥 501-1000 | 💰 $75,000,000 Series E over 4 years ago | Artificial Intelligence (AI), 3D Printing, Industrial Engineering, Software

Requirements:
  • Bachelor's degree required, or relevant experience.
  • 3-5+ years of prior experience as a software engineer or data engineer in a fast-paced, technical, problem-solving environment.
  • Cloud data warehouse experience (Snowflake).
  • Expert in SQL.
  • Expert in ETL, data modeling, and version control; dbt and GitHub preferred.
  • Data modeling best practices for transactional and analytical processing.
  • Experience with data extraction tools (Fivetran, Airbyte, etc.).
  • Experience with event tracking software (Segment, Tealium, etc.).
  • Experience with a programming language such as Python or JavaScript.
  • Experience with Business Intelligence tools, Looker preferred.
  • Ability to communicate effectively and influence others.
  • Ability to work in a fast-paced environment and shift gears quickly.
  • Must be able to work core hours aligned to US Eastern Time (GMT-5).

Responsibilities:
  • Collaborate closely with other engineers and product managers as a valued member of an autonomous, cross-functional team.
  • Build analytics models that use the data pipeline to provide actionable insights into key business performance metrics (see the sketch after this list).
  • Develop data models that help analytics and data science team members build and optimize their work.
  • Maintain data pipelines and make changes or alterations as requested.
  • Develop, deploy, and release functionality through software integration, supporting DevOps and CI/CD pipelines.
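A rough illustration of the analytics-model work described above, run from Python against Snowflake. The posting prefers dbt; this sketch only approximates what a dbt model compiles down to, and the table names, metric, and warehouse name are hypothetical.

```python
# Hypothetical sketch: materializing a daily-orders analytics model in
# Snowflake from Python. Table names, metric, and credentials are assumptions.
import os
import snowflake.connector

MODEL_SQL = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT order_date::date AS order_date,
       COUNT(*)         AS orders,
       SUM(order_total) AS revenue
FROM raw.orders
GROUP BY 1
"""

def run_model() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse name
    )
    try:
        conn.cursor().execute(MODEL_SQL)  # dbt would manage this as a model file
    finally:
        conn.close()
```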

Python, SQL, Business Intelligence, ETL, JavaScript, Snowflake, Collaboration, CI/CD, DevOps

🔥 (830) Senior Data Engineer
Posted about 1 month ago

📍 LATAM

🔍 Staff augmentation

🏢 Company: Nearsure | 👥 501-1000 | Staffing Agency, Outsourcing, Software

Requirements:
  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience working in data engineering.
  • 3+ years of experience working with SQL.
  • 3+ years of experience working with Python.
  • 2+ years of experience working with cloud platforms.
  • 2+ years of experience working with microservices.
  • 2+ years of experience working with databases.
  • 2+ years of experience working with CI/CD tools.
  • An advanced English level is required.

Responsibilities:
  • Write code to test and validate business logic in SQL and code repositories.
  • Test back-end API interfaces for the financial platform (see the sketch after this list).
  • Build developer productivity tools.
  • Provide risk assessments on identified defects.
  • Define the quality engineering scope and maintain automated scripts.
  • Participate in Scrum meetings for status updates.
  • Identify and clarify information needs.
  • Organize and resolve current backlog issues.
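A minimal sketch of the kind of automated API check this role calls for, using pytest and requests. The base URL, endpoint, and response fields are hypothetical placeholders, not details from the posting.

```python
# Hypothetical back-end API test; URL, endpoint, and fields are placeholders.
import requests

BASE_URL = "https://api.example.internal"  # placeholder, not from the posting

def test_account_balance_is_consistent():
    """Validate business logic: balance equals credits minus debits."""
    resp = requests.get(f"{BASE_URL}/accounts/42/summary", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    assert body["balance"] == body["total_credits"] - body["total_debits"]
```

Run with `pytest`; in practice such checks would sit in a CI/CD stage alongside the SQL validation work the posting describes.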

Python, SQL, Scrum, Data engineering, CI/CD, Microservices

🔥 Senior Data Engineer
Posted 2 months ago

📍 Mexico, Gibraltar, Colombia, USA, Brazil, Argentina

🧭 Full-Time

🔍 FinTech

🏢 Company: Bitso

Requirements:
  • Proven English fluency.
  • 3+ years of professional experience with analytics, ETLs, and data systems.
  • 3+ years with SQL databases, data lakes, big data, and cloud infrastructure.
  • 3+ years of experience with Spark.
  • BS or Master's in Computer Science or similar.
  • Strong proficiency in SQL, Python, and AWS.
  • Strong data modeling skills.

Responsibilities:
  • Build the processes required for optimal extraction, transformation, and loading of data from various sources using SQL, Python, and Spark (see the sketch after this list).
  • Identify, design, and implement internal process improvements while optimizing data delivery and redesigning infrastructure for scalability.
  • Ensure data integrity, quality, and security.
  • Work with stakeholders to assist with data-related technical issues and support their data needs.
  • Manage data separation and security across multiple data sources.
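A compact PySpark sketch of the extract-transform-load loop named above. The lake paths, columns, and cleansing rules are illustrative assumptions, not details from the posting.

```python
# Hypothetical PySpark ETL; paths, columns, and rules are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw JSON events from a landing zone (placeholder path).
raw = spark.read.json("s3://example-lake/landing/orders/")

# Transform: drop malformed rows, standardize timestamps, deduplicate.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Load: partitioned Parquet, ready for warehouse ingestion.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-lake/curated/orders/"
)
```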

AWS, Python, SQL, Business Intelligence, Machine Learning, Data engineering, Data Structures, Spark, Communication Skills, Data modeling


📍 LATAM

🔍 Financial Services

🏢 Company: Truelogic Software

Requirements:
  • 5-7 years of experience in data engineering.
  • 3-5 years of experience in software development.
  • Deep experience with Snowflake and stored procedures.
  • Extensive experience working with Python and modern web frameworks.
  • Experience with queueing systems such as Celery, SQS, or Pub/Sub (see the sketch after this list).
  • Strong expertise in Python 3, object-oriented code, and design patterns.
  • Experience with REST APIs, Git, and writing unit tests.
  • Knowledge of databases (SQL, NoSQL), AWS, serverless environments, and Infrastructure as Code (CloudFormation and CDK).
  • Familiarity with DevOps practices (CI/CD, automated pipelines) and Agile methodologies.
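Since the list names Celery among the queueing options, here is a hedged sketch of a Celery task wired to an SQS broker. The broker URL, task body, and retry policy are assumptions, not details from the posting.

```python
# Hypothetical Celery task on an SQS broker; the task body and retry
# policy are assumptions, not details from the posting.
from celery import Celery

app = Celery("pipeline", broker="sqs://")  # kombu's SQS transport

@app.task(bind=True, max_retries=3)
def load_document(self, doc_id: str) -> None:
    """Queue-driven unit of work: fetch, validate, and persist one record."""
    try:
        ...  # fetch from the source system, validate, write to the warehouse
    except Exception as exc:
        raise self.retry(exc=exc, countdown=30)
```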

Responsibilities:
  • Work in a growth-oriented environment at a financial services firm focused on the U.S. mortgage market.
  • Provide strategic guidance and solution patterns for sub-products.
  • Collaborate with business analysts and stakeholders to optimize requirements.
  • Work closely with developers to review and validate key functionality and ensure successful integration with existing systems.

AWS, Python, Software Development, SQL, Agile, Design Patterns, Django, Flask, Git, Scrum, Snowflake, TypeScript, Jira, Amazon Web Services, Data engineering, REST API, Serverless, NoSQL, CI/CD, Microservices

Posted 3 months ago

📍 LATAM

🔍 Mortgage Lending

🏢 Company: Truelogic Software

Requirements:
  • 5+ years of hands-on experience in backend engineering or data integration.
  • At least 2 years of experience working with an MDM platform.
  • Experience integrating with customer preference platforms like OneTrust or PossibleNow.
  • Proficiency in Python or another relevant programming language.
  • Expertise in AWS serverless technologies, including Glue, Lambda, Kinesis, and SQS.
  • Strong SQL skills and experience with Snowflake or similar data warehouses.
  • Experience with data modeling, cleansing, and quality management.
  • Understanding of REST APIs and web services.
  • Attention to detail and commitment to data accuracy.

Responsibilities:
  • Become a subject matter expert on Reltio.
  • Design and maintain integrations between Reltio and other systems using AWS services (see the sketch after this list).
  • Implement data quality rules and cleansing routines.
  • Collaborate with data stewards to enforce governance policies.
  • Monitor and optimize the performance of integrations.
  • Work with stakeholders to gather requirements and troubleshoot issues.
  • Stay current on trends in MDM and data integration.
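One plausible shape for such an integration is an AWS Lambda function consuming a Kinesis stream. The event layout below follows the standard Kinesis trigger; the quality rule and the downstream MDM call are hypothetical.

```python
# Hypothetical Lambda handler for a Kinesis-fed MDM integration.
# The quality rule and downstream call are assumptions, not posting details.
import base64
import json

def handler(event, context):
    """Decode Kinesis records, apply a simple quality rule, forward the rest."""
    clean = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        if not payload.get("email"):  # example rule: require an email
            continue                  # a real pipeline would dead-letter this
        payload["email"] = payload["email"].strip().lower()
        clean.append(payload)
    # Placeholder: push `clean` to the MDM platform's REST API here.
    return {"accepted": len(clean)}
```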

AWS, Node.js, Python, SQL, JavaScript, Snowflake, Algorithms, REST API, Serverless, Collaboration, Data modeling

Posted 3 months ago
🔥 Senior Data Engineer
Posted 4 months ago

📍 Brazil

🧭 Contract

🔍 Digital Learning

Requirements:
  • 4+ years of professional experience in data engineering.
  • Proficiency in modern data engineering processes.
  • Solid understanding of cloud infrastructure.
  • Hands-on experience with relevant data-related AWS services.
  • Advanced understanding of the relational data model.
  • Experience interacting with RESTful APIs and webhooks.
  • Understanding of DevOps disciplines and Git version control.
  • A commitment to writing clean, efficient, and maintainable code.
  • Familiarity with monitoring tools like CloudWatch or New Relic.
  • Familiarity with incident management tools like PagerDuty is desirable.
  • An understanding of software engineering paradigms and the software lifecycle is desirable.
  • Strong SQL skills.
  • Expertise in using Python for data wrangling.
  • Experience using Airflow for data pipeline orchestration (see the sketch after this list).
  • Experience creating big data workloads with Spark.
  • Experience working with large-scale data warehouses like Redshift.
  • Experience composing AWS infrastructure with Terraform is desirable.
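A bare-bones Airflow DAG sketch for the orchestration experience the list asks for. The task names, schedule, and callables are hypothetical, and the `schedule` argument assumes Airflow 2.4+.

```python
# Hypothetical Airflow DAG; task bodies, schedule, and names are assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): ...    # e.g. pull raw files from S3
def transform(): ...  # e.g. wrangle with pandas or hand off to Spark
def load(): ...       # e.g. COPY the results into Redshift

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```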

Responsibilities:
  • Partner with Engineering and Product teams on high-impact initiatives.
  • Design and implement robust data pipelines using AWS services.
  • Enhance event collection, queueing, and processing mechanisms.
  • Build and refine meaningful alerts with automated corrective actions.
  • Troubleshoot and resolve issues, ensuring system reliability and stability.
  • Architect scalable data products with long-term vision and efficiency in mind.
  • Identify and resolve complex technical challenges involving system integration, infrastructure, and software bugs.

AWS, Python, SQL, ETL, Git, Java, Kafka, Strategy, Data engineering, Collaboration, RESTful APIs, Terraform


📍 Central EU or Americas

🧭 Full-Time

🔍 Real estate investment

🏢 Company: Roofstock | 👥 501-1000 | 💰 $240,000,000 Series E almost 3 years ago | 🫂 Last layoff almost 2 years ago | Rental Property, PropTech, Marketplace, Real Estate, FinTech

Requirements:
  • BS or MS in a technical field: computer science, engineering, or similar.
  • 8+ years of technical experience working with data.
  • 5+ years of strong experience building scalable data services and applications using SQL, Python, and Java/Kotlin.
  • Deep understanding of microservices architecture and RESTful API development.
  • Experience with AWS services, including messaging, and familiarity with real-time data processing frameworks.
  • Significant experience building and deploying data-related infrastructure and robust data pipelines.
  • Strong understanding of data architecture and related challenges.
  • Experience with complex problems and distributed systems, with a focus on scalability and performance.
  • Strong communication and interpersonal skills.
  • Independent worker able to collaborate with cross-functional teams.

Responsibilities:
  • Improve and maintain the data services platform (see the sketch after this list).
  • Deliver high-quality data services promptly, ensuring data governance and integrity while meeting objectives and maintaining SLAs.
  • Develop effective architectures and produce key code components that contribute to technical solutions.
  • Integrate a diverse network of third-party tools into a cohesive, scalable platform.
  • Continuously enhance system performance and reliability by diagnosing and resolving operational issues.
  • Ensure rigorous testing of the team's work through automated methods.
  • Support the data infrastructure and collaborate with the data team on scalable data pipelines.
  • Work within an Agile/Scrum framework with cross-functional teams to deliver value.
  • Influence the enterprise data platform architecture and standards.
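As a hedged illustration of the RESTful data-service endpoints this role maintains: FastAPI is not named in the posting and stands in for whatever framework the team actually uses, and the route, model, and data are hypothetical.

```python
# Hypothetical RESTful data-service endpoint; framework choice, route,
# and data model are assumptions, not details from the posting.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="data-service")

class PropertyMetrics(BaseModel):
    property_id: str
    occupancy_rate: float
    gross_yield: float

# Placeholder store; a real service would query the warehouse or a cache.
_FAKE_DB = {
    "p-1": PropertyMetrics(property_id="p-1", occupancy_rate=0.94, gross_yield=0.071)
}

@app.get("/properties/{property_id}/metrics", response_model=PropertyMetrics)
def get_metrics(property_id: str) -> PropertyMetrics:
    """Serve one property's metrics behind a typed, versionable contract."""
    metrics = _FAKE_DB.get(property_id)
    if metrics is None:
        raise HTTPException(status_code=404, detail="unknown property")
    return metrics
```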

AWS, Docker, Python, SQL, Agile, ETL, Scrum, Snowflake, Airflow, Data engineering, gRPC, RESTful APIs, Microservices

Posted 5 months ago