Data Engineer

Posted 2024-11-08

πŸ“ Location: Philippines, India

πŸ” Industry: Employee wellness services

πŸͺ„ Skills: Python, SQL, Data Analysis, Data Engineering, Communication Skills, Analytical Skills, Collaboration

Requirements:
  • Expertise in data engineering.
  • Proficient in programming.
  • Willingness to work hands-on with data.
Responsibilities:
  • The Data Engineering team transforms raw data into meaningful information.
  • Responsible for managing data systems.
  • Collaborates with business analysts to provide usable data.
  • Engages in critical and collaborative debate to solve data-related problems.
Related Jobs

πŸ“ India

🧭 Full-Time

πŸ” SaaS (Software as a Service)

🏒 Company: Canibuild Au Pty Ltd

  • Expert-level proficiency in SQL (TSQL, MS SQL) with a focus on optimizing SQL queries for performance.
  • At least 7 years of experience preferred.
  • Extensive experience with Python and Airflow for ELT processes.
  • Proven experience in designing and developing data warehousing solutions on AWS.
  • Strong expertise in PowerBI for data visualization and dashboard creation.
  • Familiarity with connecting PowerBI to SQL Server and PostgreSQL Aurora databases.
  • Experience with REST APIs and JSON.
  • Agile development experience with a focus on continuous delivery and improvement.
  • Excellent problem-solving skills and a proactive attitude.
  • Strong communication skills and ability to work collaboratively.
  • Curiosity and eagerness to learn and adapt to new technologies.
  • Ability to perform database tuning and optimization.

  • Design, develop, and maintain ELT pipelines using Python, Airflow, and SQL in an AWS environment.
  • Create and manage a data lake and data warehouse solutions on AWS.
  • Develop and maintain data-driven dashboards and reporting solutions in PowerBI.
  • Connect PowerBI to SQL Server and PostgreSQL Aurora databases using a gateway.
  • Perform data profiling and source system analysis to ensure data quality and integrity.
  • Collaborate with business stakeholders to capture and understand data requirements.
  • Implement industry best practices for data engineering and visualization.
  • Participate in architectural decisions and contribute to the continuous improvement of data solutions.
  • Follow agile practices and a Lean approach in project development.
  • Optimize SQL queries for performance and ensure efficient database operations.
  • Perform database tuning and optimization as needed.

AWS, PostgreSQL, Python, SQL, Agile, Airflow, Data Engineering, Communication Skills

Posted 2024-11-09

πŸ“ Philippines

🧭 Contract

πŸ’Έ 1690 - 1940 INR per hour

πŸ” B2B technology

  • 7+ years of expertise with SQL, particularly in Redshift and BigQuery.
  • Experience with Python for data processing and automation.
  • Proficient with data visualization tools like Tableau or Looker.
  • Skilled in leveraging DBT for data transformations and source control.
  • Strong knowledge of data pipeline automation and ETL/ELT processes.
  • Experience in establishing database architecture and data governance.
  • Analytical skills to interpret data into actionable insights.
  • Understanding of statistical analysis, predictive modeling, and machine learning.

  • Develop, maintain, and optimize data pipelines and ETL/ELT processes using Redshift, BigQuery, and other data systems.
  • Design and manage data architectures for efficient cross-channel integration.
  • Utilize APIs and automation tools for data ingestion and transformation.
  • Build and maintain robust data models to improve reporting accuracy.
  • Collaborate with teams to align data structures with business needs.
  • Clean, preprocess, and transform raw data for consistency.
  • Monitor and troubleshoot data pipelines for high quality.
  • Conduct exploratory data analysis to uncover trends.
  • Support campaign optimization with predictive models and forecasting.
  • Create interactive dashboards and reports for real-time insights.

Python, SQL, Agile, Data Analysis, ETL, Machine Learning, Tableau, Data Engineering, Data Structures

Posted 2024-11-09

πŸ“ India

πŸ” Mortgage Technology

🏒 Company: Saaf Finance

  • 5+ years of experience as a Data Engineer focusing on large-scale data solutions.
  • Experience in the mortgage industry working with mortgage data systems in financial services.
  • Hands-on experience with cloud services like Snowflake, S3, Redshift, and Glue.
  • Strong SQL/NoSQL skills with familiarity in databases such as PostgreSQL.
  • Knowledge of microservices architecture and real-time data processing.
  • Deep understanding of ETL workflows to optimize and troubleshoot data pipelines.
  • Experience with programming languages like Python or Javascript.
  • Familiarity with MISMO (Mortgage Industry Standards Maintenance Organization) data schema is preferred.
  • Preferred experience with Terraform or similar tools for infrastructure automation.
  • Experience at OCR or AI-based data extraction companies in processing mortgage-related documents.
  • Knowledge of Agile methodologies.

  • Design, develop, and optimize ETL pipelines extracting mortgage-related data from various sources into scalable data stores.
  • Collaborate with domain experts for proper mapping of mortgage data fields following MISMO standards.
  • Utilize best technology to create efficient data processing systems.
  • Work closely with product managers, senior engineers, and founders to contribute to software and data solutions.
  • Implement data governance for integrity, security, and compliance across data sources.
  • Design systems for real-time data streaming integrating with other systems.
  • Monitor and improve existing data pipelines and resolve data-related issues.
  • Follow advancements in data engineering tools and methods to enhance processes.

PostgreSQL, Python, SQL, Agile, ETL, JavaScript, Snowflake, Data Engineering, NoSQL, Collaboration, Microservices

Posted 2024-11-07
πŸ”₯ Data Engineer
Posted 2024-11-07

πŸ“ India

🏒 Company: Unison Consulting Pte Ltd

  • Minimum 6+ years of Data Ingestion, Integration, ETL, or security engineering experience.
  • Extensive knowledge of AWS, Azure, GCP.
  • Strong understanding of Data Management or Data Engineering.
  • Experienced in Agile methods and Atlassian stack (e.g., JIRA).
  • Ability to develop roadmaps for data-centric products.
  • Experience with monitoring frameworks and observability products.
  • Expertise in SIEM solutions and cloud-based data sources.
  • Familiarity with security monitoring solutions like Splunk and Datadog.
  • Experience in DevSecOps/IRE and agile environments.
  • Expertise in scripting languages (PowerShell, Python, Bash).
  • Experience with Docker, Kubernetes, Ansible, or Terraform.
  • Related security certifications (e.g., CISSP, CCSP).
  • Experience with Linux/Ubuntu/Mac systems.
  • Experience in creating dashboards and troubleshooting connectivity issues.

  • Define and manage data models, schemas, metadata, and security rules.
  • Design, create, deploy, and manage databases and data structures on-premise and in the cloud.
  • Identify and mitigate potential security risks.
  • Ensure compliance with data privacy laws and regulations.
  • Conduct risk assessments and take appropriate actions to mitigate data security risks.
  • Train and educate stakeholders about data management.
  • Collaborate with IT team members and stakeholders to secure data architectures.

AWS, Docker, Python, Agile, Bash, Data Analysis, ETL, GCP, Kubernetes, Jira, Azure, Data Engineering, Data Structures, Collaboration, Linux, Terraform, Compliance


πŸ“ India

🧭 Full-Time

πŸ” Data Engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years of experience with cloud solutions: GCP, AWS, Azure, or on-premise distributed servers.
  • Proficiency in Python (4+ years) and strong SQL skills.
  • Experience with Big Query, Snowflake, Redshift, DBT.
  • Understanding of data warehousing, data lakes, and cloud concepts.
  • Excellent communication and problem-solving skills.
  • Bachelor's degree in relevant fields.

  • Implement asynchronous data ingestion and high volume stream data processing.
  • Develop real-time data analytics using various Data Engineering techniques.
  • Define and optimize data pipelines, identifying bottlenecks.
  • Utilize GCP, AWS, and Azure cloud technologies for cutting-edge solutions.

AWS, Project Management, Python, SQL, Agile, GCP, Hadoop, Kafka, Snowflake, Airflow, Azure, Data Engineering, Spark, Problem Solving

Posted 2024-10-25

πŸ“ India

πŸ” Data Engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years using cloud solutions such as GCP, AWS, or Azure.
  • 4+ years experience in Python.
  • Strong knowledge of SQL and data concepts.
  • Experience with Big Query, Snowflake, Redshift, and DBT.
  • Understanding of data warehousing, data lakes, and cloud concepts.
  • Excellent communication and presentation skills.
  • Strong problem-solving skills with a proactive approach.
  • A B.S. in computer science, software engineering, computer engineering, electrical engineering, or a related field is required.

  • Implement asynchronous data ingestion and high-volume stream data processing.
  • Perform real-time data analytics using various Data Engineering techniques.
  • Implement application components using Cloud technologies and infrastructure.
  • Assist in defining data pipelines and identify bottlenecks for data management.
  • Apply cutting-edge cloud platform solutions using GCP, AWS, and Azure.

AWS, Project Management, Python, SQL, Agile, GCP, Snowflake, Azure, Data Engineering

Posted 2024-10-25

πŸ“ India

🧭 Full-Time

πŸ” Data engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years managing data engineering solutions using Cloud (GCP/AWS/Azure) or on-premise.
  • 4+ years' experience in Python.
  • Strong in SQL and its concepts.
  • Experience with Big Query, Snowflake, Redshift, DBT.
  • Understanding of data warehousing, data lake, and cloud concepts.
  • Excellent communication and presentation skills.
  • Excellent problem-solving skills.
  • B.S. in computer science or related field.

  • Implement asynchronous data ingestion, high volume stream data processing, and real-time data analytics using various Data Engineering Techniques.
  • Assist in defining the data pipelines and identify bottlenecks to enable effective data management.
  • Implement application components using Cloud technologies and infrastructure.
  • Implement cutting edge cloud platform solutions using tools and platforms offered by GCP, AWS, and Azure.

AWS, Project Management, Python, SQL, Agile, GCP, Snowflake, Azure, Data Engineering

Posted 2024-10-25
πŸ”₯ Data Engineer
Posted 2024-10-23

πŸ“ Philippines

🧭 Full-Time

🏒 Company: Sourcefit

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Data Engineer or in a similar role.
  • Strong knowledge of SQL and experience with relational databases (e.g., MySQL, PostgreSQL).
  • Experience with big data tools and frameworks (e.g., Hadoop, Spark, Kafka).
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Extensive experience with Azure cloud services, including Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
  • Familiarity with data warehousing solutions, particularly Microsoft Fabric.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.
  • Experience in data analytics and AI, including data visualization, statistical analysis, predictive modeling, and machine learning.

  • Design, develop, and maintain scalable data pipelines and systems.
  • Assemble large, complex data sets that meet business requirements.
  • Identify, design, and implement internal process improvements.
  • Build infrastructure for optimal ETL of data from various sources using SQL and Azure technologies.
  • Develop and maintain data architecture for analytics, business intelligence, and AI.
  • Collaborate with data scientists and analysts to support data infrastructure needs.
  • Ensure data quality and integrity through cleaning, validation, and analysis.
  • Monitor and troubleshoot data pipeline performance and reliability.

PostgreSQL, Python, SQL, Business Intelligence, ETL, Hadoop, Java, Kafka, Machine Learning, MySQL, Azure, Spark, Collaboration


πŸ“ India

🧭 Full-Time

πŸ” Data & Analytics, AI

🏒 Company: Nexthire

  • Strong experience with Data Factory, Azure, Microsoft Dynamics 365, and Customer Data Platform (CDP).
  • Experience in MS Customer Insights Data.
  • Experience in MS customer insights journey.
  • Experience with Azure, Data Verse, and Power Platform.
  • Overall experience of 8-10 years.
  • At least one full project experience on CDP or 3 years in CDP-related support.

  • Design, develop, and implement solutions using Microsoft Customer Data Platform (CDP) to manage and analyze customer data.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Integrate CDP with various data sources and ensure seamless data flow and accuracy.
  • Develop and maintain data pipelines, ensuring data is collected, processed, and stored efficiently.
  • Create and manage customer profiles, segments, and audiences within the CDP.
  • Implement data governance and security best practices to protect customer data.
  • Monitor and optimize the performance of the CDP infrastructure.
  • Provide technical support and troubleshooting for CDP-related issues.
  • Stay updated with the latest trends and advancements in CDP technology and best practices.

Microsoft Dynamics, Azure

Posted 2024-10-23

πŸ“ Philippines

🏒 Company: Activate Talent

  • Bachelor's degree in Computer Science, Information Technology, or related field.
  • Experience as a Data Engineer with a focus on Active Pooling and data pipeline development.
  • Strong proficiency in SQL and programming languages such as Python or Java.
  • Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and modern data stack tools.
  • Familiarity with ETL processes and data integration methodologies.
  • Analytical mindset with the ability to troubleshoot and resolve data issues effectively.
  • Excellent communication skills for collaboration with cross-functional teams.
  • Strong organizational skills and attention to detail.

  • Design, build, and maintain data pipelines that support Active Pooling initiatives and improve data accessibility.
  • Collaborate with stakeholders to identify data needs and deliver effective data solutions.
  • Utilize cloud services and technologies to architect scalable data architectures.
  • Ensure data quality, integrity, and security throughout the data lifecycle.
  • Analyze and optimize ETL processes to improve data processing efficiency.
  • Stay current with industry trends and best practices in data engineering and cloud technologies.
  • Provide support for data-related issues and contribute to troubleshooting efforts.
  • Document data engineering processes and architectures appropriately.

AWS, Python, SQL, ETL, GCP, Java, Azure, Data Engineering, Communication Skills, Collaboration, Documentation

Posted 2024-10-23