Senior Database Engineer

Posted 3 months ago

💎 Seniority level: Senior; 7+ years with AWS and 10+ years across the core data engineering skills listed under Requirements

📍 Location: India

🏢 Company: CloudHire 👥 11-50 | Recruiting, Web Design, Software

⏳ Experience: 7+ years with AWS; 10+ years across the other core skills listed under Requirements

🪄 Skills: AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres

Requirements:
  • 7+ years of experience with AWS (preferred), GCP or Azure.
  • 10+ years of experience designing, building, and supporting near real-time data pipelines and analytical solutions using technologies such as Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS.
  • 10+ years of experience using object-oriented programming languages (e.g., .NET, Java, Python) for data delivery in near real-time and streaming analytics.
  • 10+ years of experience documenting business requirements and translating them into relational, non-relational, and dimensional data models using Erwin.
  • 10+ years of experience working in agile teams delivering data solutions.
  • 10+ years of experience developing Master Data Management (MDM) solutions.
  • 8+ years of experience delivering solutions on public cloud platforms, preferably Google Cloud.
  • Experience writing automated unit, integration, and acceptance tests for data interfaces and data pipelines (a brief illustrative sketch follows this list).
  • Ability to quickly learn new technologies and determine their best applications.
  • Exceptional interpersonal skills, including teamwork, communication, and negotiation.
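
As a rough illustration of the automated-testing requirement above, here is a minimal pytest-style unit test for a pipeline transform. The transform_order() helper and its field names are invented for this sketch and are not part of the role description; they simply show the shape of test the posting asks for.

```python
# Hypothetical example: unit tests for a small pipeline transform (pytest assumed).
# transform_order() and its field names are invented for illustration only.
import pytest


def transform_order(raw: dict) -> dict:
    """Normalize a raw order record before loading it into the warehouse."""
    return {
        "order_id": int(raw["id"]),
        "amount_usd": round(float(raw["amount"]), 2),
        "status": raw.get("status", "unknown").lower(),
    }


def test_transform_order_normalizes_fields():
    raw = {"id": "42", "amount": "19.999", "status": "SHIPPED"}
    assert transform_order(raw) == {
        "order_id": 42,
        "amount_usd": 20.0,
        "status": "shipped",
    }


def test_transform_order_rejects_missing_amount():
    with pytest.raises(KeyError):
        transform_order({"id": "43"})
```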
Responsibilities:
  • Integrate multiple databases, covering modeling approaches such as Snowflake schema, Star schema, and the Network model.
  • Move data from message buses such as Kafka and IBM MQ into targets like Redshift, Postgres, and MongoDB (see the sketch after this list).
  • Assess workloads and select the database best suited to each for performance and functionality.
  • Design and deploy databases for scale based on request types.
  • Ensure database recovery within defined sequencing and time constraints.
  • Collaborate with business and technology stakeholders to define future-state business capabilities and requirements, translating them into data architectures.
  • Partner with platform architects to ensure implementations follow principles, guidelines, and standards.
  • Analyze current technology environments for deficiencies and recommend solutions.
  • Design, implement, and maintain data services and interfaces, including real-time data pipelines using emerging technologies.
  • Develop continuous integration and deployment for data pipelines with automated testing.
  • Utilize workflow management platforms like Airflow.
  • Mentor and motivate the team to achieve organizational goals.
  • Advocate for agile practices to increase efficiency.
  • Maintain consistency with development standards.
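
To make the message-bus responsibility above concrete, here is a minimal, hedged sketch of a Kafka-to-Postgres ingest loop using the kafka-python and psycopg2 libraries. The topic name, table, columns, and connection string are hypothetical stand-ins, not details from the posting.

```python
# Minimal sketch of a Kafka -> Postgres ingest loop (kafka-python + psycopg2).
# The topic, table, columns, and connection string are illustrative assumptions.
import json

import psycopg2
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    enable_auto_commit=False,                   # commit offsets only after the DB write
    auto_offset_reset="earliest",
)

conn = psycopg2.connect("dbname=analytics user=etl")  # hypothetical connection string
cur = conn.cursor()

for message in consumer:
    record = message.value
    # Idempotent upsert so replayed messages do not create duplicate rows
    # (assumes a unique constraint on order_id).
    cur.execute(
        """
        INSERT INTO orders (order_id, amount_usd, status)
        VALUES (%s, %s, %s)
        ON CONFLICT (order_id) DO UPDATE
            SET amount_usd = EXCLUDED.amount_usd,
                status = EXCLUDED.status
        """,
        (record["order_id"], record["amount_usd"], record["status"]),
    )
    conn.commit()      # persist the row first...
    consumer.commit()  # ...then advance the Kafka offset
```

Committing the Kafka offset only after the database commit gives at-least-once delivery, and the idempotent upsert makes any replays safe.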