Senior Data Engineer

Posted 2024-10-22

πŸ’Ž Seniority level: Senior, minimum 4 years

πŸ“ Location: APAC

πŸ” Industry: Cryptocurrency derivatives

🏒 Company: BitMEX

⏳ Experience: Minimum 4 years

πŸͺ„ Skills: AWS, PostgreSQL, Kubernetes, Airflow, Data engineering, Terraform

Requirements:
  • Minimum 4 years' experience in data engineering, with demonstrated design and technical implementation of data warehouses.
  • Experience with OLAP databases and understanding of data structuring/modeling for trade-offs between storage/performance and usability.
  • Experience building, deploying, and troubleshooting reliable and consistent data pipelines.
  • Familiarity with AWS Redshift, Glue Data Catalog, S3, PostgreSQL, Parquet, Iceberg, Trino, and their management using Terraform & Kubernetes.
Responsibilities:
  • Design and maintain enhancements to our data warehouse, data lake, and data pipelines.
  • Increase reliability and consistency of data systems.
  • Improve the queryability of large historical datasets using industry-standard tools.
Apply

Related Jobs

πŸ“ India

🧭 Full-Time

πŸ” Data Engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years of experience with cloud solutions: GCP, AWS, Azure, or on-premise distributed servers.
  • Proficiency in Python (4+ years) and strong SQL skills.
  • Experience with BigQuery, Snowflake, Redshift, and DBT.
  • Understanding of data warehousing, data lakes, and cloud concepts.
  • Excellent communication and problem-solving skills.
  • Bachelor's degree in relevant fields.

  • Implement asynchronous data ingestion and high-volume stream data processing.
  • Develop real-time data analytics using various Data Engineering techniques.
  • Define and optimize data pipelines, identifying bottlenecks.
  • Utilize GCP, AWS, and Azure cloud technologies for cutting-edge solutions.

AWS, Project Management, Python, SQL, Agile, GCP, Hadoop, Kafka, Snowflake, Airflow, Azure, Data engineering, Spark, Problem Solving

Posted 2024-10-25
Apply

πŸ“ India

πŸ” Data Engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years using cloud solutions such as GCP, AWS, or Azure.
  • 4+ years experience in Python.
  • Strong knowledge of SQL and data concepts.
  • Experience with BigQuery, Snowflake, Redshift, and DBT.
  • Understanding of data warehousing, data lakes, and cloud concepts.
  • Excellent communication and presentation skills.
  • Strong problem-solving skills with a proactive approach.
  • A B.S. in computer science, software engineering, computer engineering, electrical engineering, or a related field is required.

  • Implement asynchronous data ingestion and high-volume stream data processing.
  • Perform real-time data analytics using various Data Engineering techniques.
  • Implement application components using Cloud technologies and infrastructure.
  • Assist in defining data pipelines and identify bottlenecks for data management.
  • Apply cutting-edge cloud platform solutions using GCP, AWS, and Azure.

AWS, Project Management, Python, SQL, Agile, GCP, Snowflake, Azure, Data engineering

Posted 2024-10-25
Apply

πŸ“ India

🧭 Full-Time

πŸ” Data engineering

🏒 Company: Aryng

  • 8+ years of data engineering experience.
  • 4+ years managing data engineering solutions using Cloud (GCP/AWS/Azure) or on-premise.
  • 4+ years' experience in Python.
  • Strong command of SQL and its concepts.
  • Experience with BigQuery, Snowflake, Redshift, DBT.
  • Understanding of data warehousing, data lake, and cloud concepts.
  • Excellent communication and presentation skills.
  • Excellent problem-solving skills.
  • B.S. in computer science or related field.

  • Implement asynchronous data ingestion, high-volume stream data processing, and real-time data analytics using various data engineering techniques.
  • Assist in defining the data pipelines and identify bottlenecks to enable effective data management.
  • Implement application components using Cloud technologies and infrastructure.
  • Implement cutting-edge cloud platform solutions using tools and platforms offered by GCP, AWS, and Azure.

AWS, Project Management, Python, SQL, Agile, GCP, Snowflake, Azure, Data engineering

Posted 2024-10-25
Apply

πŸ“ Cyprus, Malta, USA, Thailand, Indonesia, Hong Kong, Japan, Australia, Poland, Israel, Turkey, Latvia

🧭 Full-Time

πŸ” Social discovery technology

🏒 Company: Social Discovery Group

  • 3+ years of professional experience as a Data Engineer.
  • Confident knowledge of MS SQL including window functions, subqueries, and various joins.
  • Excellent knowledge of Python.
  • Basic query optimization skills.
  • Experience with Airflow.
  • Nice to have: experience with Google Cloud Platform (BigQuery, Storage, Pub/Sub).

  • Design, develop, and maintain SQL data warehouses, including creation and optimization of stored procedures.
  • Build and enhance reports using SSRS and create dynamic dashboards with Superset for actionable insights.
  • Develop and manage efficient data pipelines using Airflow to ensure smooth data integration and automation.

Python, SQL, Apache Airflow, GCP, Data engineering

Posted 2024-10-23
Apply

πŸ“ India

🧭 Full-Time

πŸ’Έ 3,000,000 - 3,800,000 INR per year

πŸ” Insurance

🏒 Company: CloudHire

  • Minimum 5 years of experience in data consulting or a related field, preferably in the insurance industry.
  • Proven track record of delivering data-driven solutions for insurance companies.
  • Strong analytical skills with experience in data modeling, statistical analysis, and data mining techniques.
  • Proficiency in SQL, with experience in databases such as MySQL and PostgreSQL.
  • Experience with cloud-based data storage solutions, particularly Amazon S3.
  • Familiarity with data warehousing concepts and technologies is a plus.
  • Excellent communication and presentation skills tailored to diverse audiences.
  • Ability to work independently and as part of a team in a fast-paced environment.

  • Partner with insurance companies to understand their business challenges and data landscape.
  • Develop data-driven strategies for operational efficiency, risk management, underwriting, claims processing, and customer experience.
  • Design and implement data solutions using MySQL and Amazon S3.
  • Ensure data quality, security, and scalability throughout the data lifecycle.
  • Develop and maintain data pipelines for data collection and analysis.
  • Execute data models and analytical techniques for valuable insights.
  • Identify trends and patterns to aid decision-making.
  • Communicate findings effectively through visualizations and reports.
  • Collaborate with internal and external teams to deliver successful projects.
  • Provide ongoing support to clients for data investment maximization.

PostgreSQL, SQL, Data Mining, MongoDB, MySQL, Strategy, Analytical Skills

Posted 2024-10-21
Apply

πŸ“ Pakistan

🏒 Company: Creative Chaos

  • Bachelor's or Master's degree in computer science, engineering, or a related field.
  • Minimum of 7 years of industry experience as a data engineer or in a similar role.
  • Strong programming skills in languages such as Python, Scala, or Java.
  • Experience in designing and implementing data pipelines using tools like Apache Kafka, Apache Spark, or AWS Glue.
  • Proficiency in SQL and database technologies like PostgreSQL, MySQL, or MongoDB.
  • Knowledge of cloud platforms such as Azure.
  • Experience with data modeling, ETL processes, and data warehousing concepts.
  • Strong problem-solving and troubleshooting skills.
  • Excellent communication and collaboration abilities.
  • Ability to work effectively in cross-functional teams.
  • Detail-oriented and proactive mindset.

  • Design and develop scalable and reliable data pipelines, ensuring high availability and performance.
  • Construct complex data sets that align with functional and non-functional business requirements.
  • Identify, plan, and implement internal process enhancements, such as automation of manual processes and data delivery optimization.
  • Implement best practices for data storage, processing, and retrieval.
  • Collaborate with stakeholders including executives, data scientists, and product managers to understand data requirements and implement data solutions.
  • Optimize and tune data workflows to achieve optimal performance and efficiency.
  • Maintain data security and compliance with data privacy regulations.
  • Stay up-to-date with emerging technologies and industry trends in data engineering and analytics.
  • Mentor and guide more junior data engineers in the team.

AWS, PostgreSQL, Python, SQL, ETL, Java, Apache Kafka, MongoDB, MySQL, Azure, Data engineering, Spark, Collaboration

Posted 2024-10-15
Apply

πŸ“ India

πŸ” Data and cloud engineering services

🏒 Company: Enable Data Incorporated

  • Bachelor's or Master's degree in computer science, engineering, or a related field.
  • 10+ years of experience as a Data Engineer, Software Engineer, or similar role, with a focus on building cloud-based data solutions.
  • Strong experience with cloud platforms such as Azure or AWS.
  • Proficiency in Apache Spark and Databricks for large-scale data processing and analytics.
  • Experience in designing and implementing data processing pipelines using Spark and Databricks.
  • Strong knowledge of SQL and experience with relational and NoSQL databases.
  • Experience with data integration and ETL processes using tools like Apache Airflow or cloud-native orchestration services.
  • Good understanding of data modeling and schema design principles.
  • Experience with data governance and compliance frameworks.
  • Excellent problem-solving and troubleshooting skills.
  • Strong communication and collaboration skills to work effectively in a cross-functional team.
  • Relevant certifications in cloud platforms, Spark, or Databricks are a plus.

  • Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks.
  • Gather and analyze data requirements from business stakeholders and identify opportunities for data-driven insights.
  • Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks.
  • Ensure data quality, integrity, and security throughout all stages of the data lifecycle.
  • Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions.
  • Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features.
  • Provide technical guidance and expertise to junior data engineers and developers.
  • Stay up-to-date with emerging trends and technologies in cloud computing, big data, and data engineering.
  • Contribute to the continuous improvement of data engineering processes, tools, and best practices.

AWS, SQL, Apache Airflow, Cloud Computing, ETL, Azure, Data engineering, NoSQL, Spark, Collaboration, Compliance

Posted 2024-10-01
Apply

πŸ“ United States, India, United Kingdom

🧭 Full-Time

πŸ’Έ 150,000 - 180,000 USD per year

πŸ” B2B technology

  • Four-year degree in Computer Science or related field, or equivalent experience.
  • Designing frameworks and writing efficient data pipelines, including batches and real-time streams.
  • Understanding of data strategies, data analysis, and data model design.
  • Experience with the Spark Ecosystem (YARN, Executors, Livy, etc.).
  • Experience in large scale data streaming, particularly Kafka or similar technologies.
  • Experience with data orchestration frameworks, particularly Airflow or similar.
  • Experience with columnar data stores, particularly Parquet and ClickHouse.
  • Strong SDLC principles (CI/CD, Unit Testing, git, etc.).
  • General understanding of AWS EMR, EC2, S3.

  • Help build the next generation unified data platform.
  • Solve complex data warehousing problems.
  • Ensure quality, discoverability, and accessibility of data.
  • Build batch and streaming data pipelines for ingestion, normalization, and analysis.
  • Develop standard design and access patterns.
  • Lead the unification of data from multiple products.

Git, Airflow, ClickHouse, Spark

Posted 2024-07-11
Apply