Data Engineer | Remote | Contract
Posted 5 months ago
💎 Seniority level: Middle, 5-7 years
📍 Location: India
🏢 Company: Two95 International Inc.
⏳ Experience: 5-7 years
🪄 Skills: AWS, Python, SQL, Agile, ETL, Git, Hadoop, Java, Snowflake, Jira, Azure, Spark, Scala
Requirements:
- Bachelor’s degree in computer science or a related field.
- 5-7 years of experience managing Snowflake and Databricks.
- Strong experience in Python and AWS Lambda.
- Knowledge of Scala and/or Java.
- Experience with data integration services, SQL, and ELT.
- Familiarity with Azure or AWS for development and deployment.
- Experience with Jira or similar tools throughout the SDLC.
- Experience managing a codebase using Git/GitHub or Bitbucket.
- Experience working with a data warehouse.
- Familiarity with structured and semi-structured data formats like JSON, Avro, ORC, Parquet, or XML.
- Exposure to agile work environments.
Responsibilities:
- Design and implement core components of the data analytics platform for various analytics groups.
- Review approaches and data pipelines for best practices.
- Maintain a common data flow pipeline including ETL activities.
- Support and troubleshoot data flow in cloud environments.
- Develop data pipeline code using Python, Java, AWS Lambda, or Azure Data Factory (see the sketch after this list).
- Perform requirements planning and management throughout the data asset development life-cycle.
- Direct and assist developers in adhering to data platform patterns.
- Design, build, and document RESTful APIs using OpenAPI specification tools.
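As a concrete illustration of the pipeline-development responsibility above, here is a minimal sketch of a Python AWS Lambda ETL step. The S3 trigger, bucket name, key layout, and field names (analytics-curated, curated/, id, value, timestamp) are illustrative assumptions for this sketch, not part of this role's actual stack.

```python
# Minimal sketch of an AWS Lambda ETL step, assuming an S3 put-event trigger.
# Bucket names, key prefixes, and field names below are hypothetical.
import json
from urllib.parse import unquote_plus

import boto3

s3 = boto3.client("s3")

TARGET_BUCKET = "analytics-curated"  # hypothetical destination bucket


def handler(event, context):
    """Read a raw JSON object from S3, apply a light transform, and write
    the curated result to a separate bucket."""
    record = event["Records"][0]["s3"]
    source_bucket = record["bucket"]["name"]
    key = unquote_plus(record["object"]["key"])  # event keys are URL-encoded

    # Extract: fetch the raw object body.
    body = s3.get_object(Bucket=source_bucket, Key=key)["Body"].read()
    rows = json.loads(body)

    # Transform: keep only the fields downstream analytics need (assumed schema).
    cleaned = [
        {"id": r["id"], "value": r.get("value"), "ts": r.get("timestamp")}
        for r in rows
    ]

    # Load: write the curated output under a parallel key prefix.
    s3.put_object(
        Bucket=TARGET_BUCKET,
        Key=f"curated/{key}",
        Body=json.dumps(cleaned).encode("utf-8"),
    )
    return {"rows_processed": len(cleaned)}
```

In practice a step like this would sit inside the common data flow pipeline mentioned above; the same extract-transform-load shape could equally be expressed as an Azure Data Factory activity or a Databricks job, depending on the target cloud.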