
Data Engineer

Posted 2024-11-07


💎 Seniority level: Senior, 6+ years

📍 Location: India

🏢 Company: Unison Consulting Pte Ltd

⏳ Experience: 6+ years

🪄 Skills: AWS, Docker, Python, Agile, Bash, Data Analysis, ETL, GCP, Kubernetes, Jira, Azure, Data Engineering, Data Structures, Collaboration, Linux, Terraform, Compliance

Requirements:
  • 6+ years of data ingestion, integration, ETL, or security engineering experience.
  • Extensive knowledge of AWS, Azure, GCP.
  • Strong understanding of Data Management or Data Engineering.
  • Experienced in Agile methods and the Atlassian stack (e.g., Jira).
  • Ability to develop roadmaps for data-centric products.
  • Experience with monitoring frameworks and observability products.
  • Expertise in SIEM solutions and cloud-based data sources.
  • Familiarity with security monitoring solutions like Splunk and Datadog.
  • Experience in DevSecOps/IRE and agile environments.
  • Expertise in scripting languages (PowerShell, Python, Bash).
  • Experience with Docker, Kubernetes, Ansible, or Terraform.
  • Related security certifications (e.g., CISSP, CCSP).
  • Experience with Linux/Ubuntu/Mac systems.
  • Experience in creating dashboards and troubleshooting connectivity issues.
Responsibilities:
  • Define and manage data models, schemas, metadata, and security rules.
  • Design, create, deploy, and manage databases and data structures on-premise and in the cloud.
  • Identify and mitigate potential security risks.
  • Ensure compliance with data privacy laws and regulations.
  • Conduct risk assessments and take appropriate actions to mitigate data security risks.
  • Train and educate stakeholders about data management.
  • Collaborate with IT team members and stakeholders to secure data architectures.

Related Jobs


πŸ“ India

🧭 Full-Time

πŸ” SaaS (Software as a Service)

🏒 Company: Canibuild Au Pty Ltd

  • Expert-level proficiency in SQL (TSQL, MS SQL) with a focus on optimizing SQL queries for performance.
  • 7+ years of experience preferred.
  • Extensive experience with Python and Airflow for ELT processes.
  • Proven experience in designing and developing data warehousing solutions on AWS.
  • Strong expertise in PowerBI for data visualization and dashboard creation.
  • Familiarity with connecting PowerBI to SQL Server and PostgreSQL Aurora databases.
  • Experience with REST APIs and JSON.
  • Agile development experience with a focus on continuous delivery and improvement.
  • Excellent problem-solving skills and a proactive attitude.
  • Strong communication skills and ability to work collaboratively.
  • Curiosity and eagerness to learn and adapt to new technologies.
  • Ability to perform database tuning and optimization.

  • Design, develop, and maintain ELT pipelines using Python, Airflow, and SQL in an AWS environment (see the sketch after this list).
  • Create and manage a data lake and data warehouse solutions on AWS.
  • Develop and maintain data-driven dashboards and reporting solutions in PowerBI.
  • Connect PowerBI to SQL Server and PostgreSQL Aurora databases using a gateway.
  • Perform data profiling and source system analysis to ensure data quality and integrity.
  • Collaborate with business stakeholders to capture and understand data requirements.
  • Implement industry best practices for data engineering and visualization.
  • Participate in architectural decisions and contribute to the continuous improvement of data solutions.
  • Follow agile practices and a Lean approach in project development.
  • Optimize SQL queries for performance and ensure efficient database operations.
  • Perform database tuning and optimization as needed.
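
For illustration only (not part of the posting): a minimal sketch of the kind of daily ELT job described above, assuming Airflow 2.4+ with the extract and load steps stubbed out. The DAG id, task names, and S3 bucket are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    """Pull rows from the source system and stage them in S3 (placeholder body)."""
    # e.g. boto3.client("s3").put_object(Bucket="raw-landing", Key=..., Body=...)


def load_to_warehouse(**context):
    """Load staged files into the warehouse, then run SQL transforms (placeholder body)."""


with DAG(
    dag_id="daily_elt_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load  # load runs only after extraction succeeds
```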

AWS, PostgreSQL, Python, SQL, Agile, Airflow, Data Engineering, Communication Skills

Posted 2024-11-09
🔥 Data Engineer

πŸ“ Philippines, India

πŸ” Employee wellness services

  • Expertise in data engineering.
  • Proficient in programming.
  • Willingness to work hands-on with data.

  • The Data Engineering team transforms raw data into meaningful information.
  • Responsible for managing data systems.
  • Collaborates with business analysts to provide usable data.
  • Engages in critical and collaborative debate to solve data-related problems.

Python, SQL, Data Analysis, Data Engineering, Communication Skills, Analytical Skills, Collaboration

Posted 2024-11-08

πŸ“ India

πŸ” Mortgage Technology

🏒 Company: Saaf Finance

  • 5+ years of experience as a Data Engineer focusing on large-scale data solutions.
  • Experience in the mortgage industry working with mortgage data systems in financial services.
  • Hands-on experience with cloud services like Snowflake, S3, Redshift, and Glue.
  • Strong SQL/NoSQL skills with familiarity in databases such as PostgreSQL.
  • Knowledge of microservices architecture and real-time data processing.
  • Deep understanding of ETL workflows to optimize and troubleshoot data pipelines.
  • Experience with programming languages like Python or Javascript.
  • Familiarity with MISMO (Mortgage Industry Standards Maintenance Organization) data schema is preferred.
  • Preferred experience with Terraform or similar tools for infrastructure automation.
  • Experience at OCR- or AI-based data extraction companies processing mortgage-related documents.
  • Knowledge of Agile methodologies.

  • Design, develop, and optimize ETL pipelines extracting mortgage-related data from various sources into scalable data stores.
  • Collaborate with domain experts for proper mapping of mortgage data fields following MISMO standards.
  • Use the best-suited technologies to create efficient data processing systems.
  • Work closely with product managers, senior engineers, and founders to contribute to software and data solutions.
  • Implement data governance for integrity, security, and compliance across data sources.
  • Design systems for real-time data streaming integrating with other systems.
  • Monitor and improve existing data pipelines and resolve data-related issues.
  • Follow advancements in data engineering tools and methods to enhance processes.

PostgreSQL, Python, SQL, Agile, ETL, JavaScript, Snowflake, Data Engineering, NoSQL, Collaboration, Microservices

Posted 2024-11-07

πŸ“ India

🧭 Contract

🏒 Company: Two95 International Inc.

  • Bachelor’s degree in computer science or related field.
  • 5-7 years of experience managing Snowflake and Databricks.
  • Strong experience in Python and AWS Lambda.
  • Knowledge of Scala and/or Java.
  • Experience with data integration services, SQL, and ELT.
  • Familiarity with Azure or AWS for development and deployment.
  • Experience with Jira or similar tools during SDLC.
  • Experience managing codebase using Git/GitHub or Bitbucket.
  • Experience working with a data warehouse.
  • Familiarity with structured and semi-structured data formats like JSON, Avro, ORC, Parquet, or XML.
  • Exposure to agile work environments.

  • Design and implement core data analytic platform components for various analytics groups.
  • Review approaches and data pipelines for best practices.
  • Maintain a common data flow pipeline including ETL activities.
  • Support and troubleshoot data flow in cloud environments.
  • Develop data pipeline code using Python, Java, AWS Lambda, or Azure Data Factory (see the sketch after this list).
  • Perform requirements planning and management throughout the data asset development life-cycle.
  • Direct and help developers to adhere to data platform patterns.
  • Design, build, and document RESTful APIs using OpenAPI specification tools.
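
For illustration only (not part of the posting): a minimal sketch of an event-driven ingestion step on AWS Lambda, assuming Python with boto3 (preinstalled in the Lambda Python runtimes). The bucket, key, and event shape are hypothetical placeholders.

```python
import json

import boto3  # AWS SDK for Python

s3 = boto3.client("s3")


def handler(event, context):
    """Stage the incoming event payload in S3 for downstream ETL (illustrative)."""
    records = event.get("records", [])  # hypothetical event shape
    s3.put_object(
        Bucket="raw-landing-zone",  # hypothetical bucket
        Key="ingest/batch.json",    # hypothetical key
        Body=json.dumps(records),
    )
    return {"statusCode": 200, "ingested": len(records)}
```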

AWS, Python, SQL, Agile, ETL, Git, Hadoop, Java, Snowflake, Jira, Azure, Spark

Posted 2024-10-27

πŸ“ India

🧭 Full-Time

πŸ” Data Engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years of experience with cloud solutions: GCP, AWS, Azure, or on-premise distributed servers.
  • Proficiency in Python (4+ years) and strong SQL skills.
  • Experience with BigQuery, Snowflake, Redshift, and dbt.
  • Understanding of data warehousing, data lakes, and cloud concepts.
  • Excellent communication and problem-solving skills.
  • Bachelor's degree in a relevant field.

  • Implement asynchronous data ingestion and high-volume stream data processing (see the sketch after this list).
  • Develop real-time data analytics using various Data Engineering techniques.
  • Define and optimize data pipelines, identifying bottlenecks.
  • Utilize GCP, AWS, and Azure cloud technologies for cutting-edge solutions.
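
For illustration only (not part of the posting): a minimal sketch of asynchronous ingestion with backpressure, using plain Python asyncio and a bounded queue; the event shape and the sink are hypothetical placeholders.

```python
import asyncio
import json


async def produce(queue: asyncio.Queue) -> None:
    """Simulate an upstream source emitting events (stand-in for a real stream)."""
    for i in range(10):
        await queue.put({"event_id": i})
    await queue.put(None)  # sentinel: no more events


async def consume(queue: asyncio.Queue) -> None:
    """Drain events and hand them to a sink (stdout as a placeholder)."""
    while (event := await queue.get()) is not None:
        print(json.dumps(event))


async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=100)  # bounded queue gives backpressure
    await asyncio.gather(produce(queue), consume(queue))


if __name__ == "__main__":
    asyncio.run(main())
```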

AWS, Project Management, Python, SQL, Agile, GCP, Hadoop, Kafka, Snowflake, Airflow, Azure, Data Engineering, Spark, Problem Solving

Posted 2024-10-25

πŸ“ India

πŸ” Data Engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years using cloud solutions such as GCP, AWS, or Azure.
  • 4+ years experience in Python.
  • Strong knowledge of SQL and data concepts.
  • Experience with BigQuery, Snowflake, Redshift, and dbt.
  • Understanding of data warehousing, data lakes, and cloud concepts.
  • Excellent communication and presentation skills.
  • Strong problem-solving skills with a proactive approach.
  • A B.S. in computer science, software engineering, computer engineering, electrical engineering, or a related field is required.

  • Implement asynchronous data ingestion and high-volume stream data processing.
  • Perform real-time data analytics using various Data Engineering techniques.
  • Implement application components using Cloud technologies and infrastructure.
  • Assist in defining data pipelines and identify bottlenecks for data management.
  • Apply cutting-edge cloud platform solutions using GCP, AWS, and Azure.

AWS, Project Management, Python, SQL, Agile, GCP, Snowflake, Azure, Data Engineering

Posted 2024-10-25

πŸ“ India

🧭 Full-Time

πŸ” Data engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years managing data engineering solutions using Cloud (GCP/AWS/Azure) or on-premise.
  • 4+ years' experience in Python.
  • Strong SQL skills and a solid grasp of its concepts.
  • Experience with BigQuery, Snowflake, Redshift, and dbt.
  • Understanding of data warehousing, data lake, and cloud concepts.
  • Excellent communication and presentation skills.
  • Excellent problem-solving skills.
  • B.S. in computer science or related field.

  • Implement asynchronous data ingestion, high-volume stream data processing, and real-time data analytics using various data engineering techniques.
  • Assist in defining the data pipelines and identify bottlenecks to enable effective data management.
  • Implement application components using Cloud technologies and infrastructure.
  • Implement cutting-edge cloud platform solutions using tools and platforms offered by GCP, AWS, and Azure.

AWS, Project Management, Python, SQL, Agile, GCP, Snowflake, Azure, Data Engineering

Posted 2024-10-25

πŸ“ India

🧭 Full-Time

πŸ” Data & Analytics, AI

🏒 Company: Nexthire

  • Strong experience with Data Factory, Azure, Microsoft Dynamics 365, and Customer Data Platform (CDP).
  • Experience with MS Customer Insights - Data.
  • Experience with MS Customer Insights - Journeys.
  • Experience with Azure, Dataverse, and Power Platform.
  • 8-10 years of overall experience.
  • At least one full project experience on CDP or 3 years in CDP-related support.

  • Design, develop, and implement solutions using Microsoft Customer Data Platform (CDP) to manage and analyze customer data.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Integrate CDP with various data sources and ensure seamless data flow and accuracy.
  • Develop and maintain data pipelines, ensuring data is collected, processed, and stored efficiently.
  • Create and manage customer profiles, segments, and audiences within the CDP.
  • Implement data governance and security best practices to protect customer data.
  • Monitor and optimize the performance of the CDP infrastructure.
  • Provide technical support and troubleshooting for CDP-related issues.
  • Stay updated with the latest trends and advancements in CDP technology and best practices.

Microsoft Dynamics, Azure

Posted 2024-10-23

πŸ“ APAC

πŸ” Cryptocurrency derivatives

🏒 Company: BitMEX

  • Minimum 4+ years experience in the data engineering field with demonstrated design and technical implementation of data warehouses.
  • Experience with OLAP databases and understanding of data structuring/modeling for trade-offs between storage/performance and usability.
  • Experience building, deploying, and troubleshooting reliable and consistent data pipelines.
  • Familiarity with AWS Redshift, Glue Data Catalog, S3, PostgreSQL, Parquet, Iceberg, Trino, and their management using Terraform & Kubernetes.

  • Design and maintain enhancements to our data warehouse, data lake, and data pipelines.
  • Increase reliability and consistency of data systems.
  • Improve the queryability of large historical datasets using industry-standard tools.

AWS, PostgreSQL, Kubernetes, Airflow, Data Engineering, Terraform

Posted 2024-10-22

πŸ“ India

🧭 Full-Time

πŸ’Έ 3000000 - 3800000 INR per year

πŸ” Insurance

🏒 Company: CloudHire

  • Minimum 5+ years of experience in data consulting or a related field, preferably in the insurance industry.
  • Proven track record of delivering data-driven solutions for insurance companies.
  • Strong analytical skills with experience in data modeling, statistical analysis, and data mining techniques.
  • Proficiency in SQL, including dialects used by databases such as MySQL and PostgreSQL.
  • Experience with cloud-based data storage solutions, particularly Amazon S3.
  • Familiarity with data warehousing concepts and technologies is a plus.
  • Excellent communication and presentation skills tailored to diverse audiences.
  • Ability to work independently and as part of a team in a fast-paced environment.

  • Partner with insurance companies to understand their business challenges and data landscape.
  • Develop data-driven strategies for operational efficiency, risk management, underwriting, claims processing, and customer experience.
  • Design and implement data solutions using MySQL and Amazon S3.
  • Ensure data quality, security, and scalability throughout the data lifecycle.
  • Develop and maintain data pipelines for data collection and analysis (see the sketch after this list).
  • Apply data models and analytical techniques to surface valuable insights.
  • Identify trends and patterns to aid decision-making.
  • Communicate findings effectively through visualizations and reports.
  • Collaborate with internal and external teams to deliver successful projects.
  • Provide ongoing support to help clients maximize their data investments.
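
For illustration only (not part of the posting): a minimal sketch of a pipeline step that extracts rows from MySQL and stages them in Amazon S3, assuming the pymysql driver and boto3; the database, table, bucket, and key names are hypothetical placeholders.

```python
import json

import boto3    # AWS SDK for Python
import pymysql  # assumed MySQL driver; any DB-API client works similarly


def extract_policies(host: str, user: str, password: str) -> list[dict]:
    """Pull an illustrative result set from the operational database."""
    conn = pymysql.connect(host=host, user=user, password=password, database="insurance")
    try:
        with conn.cursor(pymysql.cursors.DictCursor) as cur:
            cur.execute("SELECT policy_id, premium, status FROM policies LIMIT 1000")
            return list(cur.fetchall())
    finally:
        conn.close()


def load_to_s3(rows: list[dict], bucket: str, key: str) -> None:
    """Stage the extracted rows in S3 as JSON for downstream analysis."""
    # default=str handles Decimal/date values that MySQL drivers commonly return
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=json.dumps(rows, default=str))
```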

PostgreSQL, SQL, Data Mining, MongoDB, MySQL, Strategy, Analytical Skills

Posted 2024-10-21