Erwin Jobs

Find remote positions requiring Erwin skills. Browse through opportunities where you can utilize your expertise and grow your career.

7 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

πŸ“ India

🧭 Full-Time

πŸ’Έ 3,500,000 - 3,700,000 INR per year

πŸ” Technology

🏒 Company: CloudHire πŸ‘₯ 11-50 | Recruiting, Web Design, Software

  • 7+ years with AWS (preferred), GCP or Azure.
  • 10+ years designing and building data pipelines with Postgres, Redshift, and others.
  • 10+ years using object-oriented languages (.Net, Java, Python).
  • 10+ years documenting business requirements and translating them into data models.
  • 10+ years working on agile teams.
  • 10+ years developing MDM solutions.
  • 8+ years delivering solutions on public cloud platforms (Google Cloud preferred).
  • Experience writing automated tests for data interfaces & pipelines.

  • Integrate multiple databases together, including Snowflake schema and others.
  • Work with message buses like Kafka and IBM MQ (see the sketch after this list).
  • Analyze technology environment and recommend improvements.
  • Design and maintain real-time data pipelines.
  • Mentor and support team to meet organizational goals.
  • Collaborate with business and technology stakeholders.
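
The message-bus item above describes moving records from a bus like Kafka into downstream stores. Below is a minimal, illustrative sketch of that pattern in Python, not taken from the posting: it assumes the kafka-python and psycopg2 packages, and the topic name, table, and connection string are placeholders.

```python
# Minimal sketch of moving messages from a Kafka topic into Postgres.
# Topic, table, and connection details are placeholders.
import json

import psycopg2
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic name
    bootstrap_servers="localhost:9092",         # placeholder broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

conn = psycopg2.connect("dbname=analytics user=etl")  # placeholder DSN

for message in consumer:
    record = message.value
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO orders_raw (order_id, amount) VALUES (%s, %s)",
            (record["order_id"], record["amount"]),
        )
    conn.commit()  # commit per message for simplicity; batch in practice
```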

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres, Mentoring, Negotiation

Posted 2 months ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ” Health Information Technology

🏒 Company: VetsEZ πŸ‘₯ 101-250 | Database, Information Services, Information Technology, Software

  • Bachelor's degree in Computer Science, Electronics Engineering, or related technical discipline.
  • 5+ years of experience using Microsoft SQL Server, including T-SQL scripts.
  • Experience with SQL Server Integration Services (SSIS) and data mapping tools.
  • Knowledge of database architecture, administration, and security for on-premise and cloud systems.
  • Ability to communicate effectively and lead technical discussions.

  • Perform ETL processes to convert data into various target formats (see the sketch after this list).
  • Model and design data tables and schemas to support evolving data syndication architecture.
  • Evaluate and recommend commercial modeling and data cataloging tools.
  • Write and optimize SQL queries, update data models adhering to VA data standards.
  • Enhance database architectures to meet integration needs and support security.
  • Assist with InterSystems HealthConnect, HealthShare, ensuring compliance with VA regulations.
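
The ETL item above refers to converting data between source and target formats. A minimal sketch of that kind of conversion, assuming pandas is available; the file, column, and field names are hypothetical and not from the posting.

```python
# Minimal sketch of an ETL conversion: read source records, normalize fields,
# and write them out in a different target format. All names are placeholders.
import pandas as pd

# Extract: load source records (here, an in-memory sample standing in for a CSV export).
source = pd.DataFrame(
    {"patient_id": ["001", "002"], "visit_date": ["2024-01-05", "2024-02-10"]}
)

# Transform: enforce types and rename columns to the target schema.
target = source.assign(visit_date=pd.to_datetime(source["visit_date"])).rename(
    columns={"patient_id": "PatientID", "visit_date": "VisitDate"}
)

# Load: write the converted records to a target format (JSON lines here).
target.to_json("visits.jsonl", orient="records", lines=True, date_format="iso")
```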

SQL, Erwin, ETL, Microsoft SQL Server, Oracle, Azure, Spark, Data modeling

Posted 3 months ago
Apply

πŸ“ India

🧭 Full-Time

πŸ’Έ 3,000,000 - 3,700,000 INR per year

πŸ” Remote Employee Provider

🏒 Company: CloudHire πŸ‘₯ 11-50 | Recruiting, Web Design, Software

  • 7+ years experience with AWS, GCP, or Azure preferred.
  • 10+ years experience designing and supporting data pipelines with PostgreSQL, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS.
  • 10+ years experience using object-oriented languages such as .Net, Java, Python.
  • 10+ years experience documenting business requirements and translating them into data models.
  • 10+ years experience working on agile teams.
  • 10+ years experience developing MDM solutions.
  • 8+ years experience delivering solutions on cloud platforms, preferably Google Cloud.
  • Experience with automated testing for data pipelines.
  • Exceptional interpersonal skills.

  • Integrate multiple databases together using various schema types.
  • Work with message buses to deliver data to different targets.
  • Identify appropriate workloads and determine database effectiveness.
  • Design and deploy database systems for scalability.
  • Collaborate with stakeholders on business capabilities and data architectures.
  • Analyze technology environments to recommend improvements.
  • Implement and maintain data services and real-time pipelines.
  • Develop CI/CD for data pipelines with testing automation.
  • Mentor the team and advocate agile practices.

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres, Agile methodologies

Posted 3 months ago
Apply
πŸ”₯ Database Architect
Posted 3 months ago

πŸ“ India

🧭 Full-Time

πŸ’Έ 3,000,000 - 3,700,000 INR per year

πŸ” Cloud services

🏒 Company: CloudHire πŸ‘₯ 11-50 | Recruiting, Web Design, Software

  • 7+ years of experience with AWS, GCP, or Azure.
  • 10+ years in designing, building, and supporting real-time data pipelines and analytical solutions using technologies like Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, PowerBI, and/or SSIS.
  • 10+ years using object-oriented languages (.Net, Java, Python) for data delivery.
  • 10+ years working with partners to document and translate business requirements into various data models using Erwin.
  • 10+ years on agile teams delivering data solutions.
  • 10+ years developing MDM solutions.
  • 8+ years of experience delivering solutions on public cloud platforms, preferably Google Cloud.
  • Experience in writing automated tests for data interfaces and pipelines.
  • Ability to quickly adapt to new technologies.
  • Exceptional interpersonal skills, including teamwork, communication, and negotiation.

  • Integrate multiple databases together, including Snowflake schema, Star schema, and Network model.
  • Work with message buses like Kafka and IBM MQ to targets like Redshift, Postgres, and MongoDB.
  • Discover appropriate workloads and select the suitable database to ensure performance and functionality.
  • Design and deploy database solutions for scalability.
  • Implement database recovery strategies.
  • Collaborate with business and technology stakeholders to define and translate future-state business capabilities into data architectures.
  • Partner with platform architects to adhere to platform principles.
  • Analyze the technology environment for deficiencies and recommend improvements.
  • Design and maintain real-time data service interfaces and pipelines using emerging technologies.
  • Develop continuous integration and deployment processes for data pipelines.
  • Manage workflows using platforms like Airflow (see the sketch after this list).
  • Mentor and support team members to meet organizational goals.
  • Advocate for agile practices to enhance delivery throughput.
  • Create and maintain development standards.
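
The workflow-management item above mentions Airflow. The following is a minimal, illustrative DAG sketch, assuming Airflow 2.4+ with the standard PythonOperator; the DAG id, task names, and data are placeholders rather than anything specified by the employer.

```python
# Minimal sketch of an Airflow-managed extract/load pipeline.
# All identifiers and data are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder extract step: a real task would query the source database
    # (e.g. Postgres) and stage the rows for loading.
    return [{"order_id": 1, "amount": 42.0}]


def load_orders(ti):
    # Placeholder load step: pull the staged rows and write them to the
    # target (e.g. Redshift) instead of just logging the count.
    rows = ti.xcom_pull(task_ids="extract_orders")
    print(f"would load {len(rows)} rows into the target table")


with DAG(
    dag_id="orders_pipeline_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load
```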

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres, Agile methodologies

Posted 3 months ago
Apply
πŸ”₯ Data Practice Lead
Posted 3 months ago

πŸ“ U.S, Europe, APAC

πŸ” Data Management

🏒 Company: Encora πŸ‘₯ 10001+ πŸ’° $200,000,000 Private over 5 years ago | Big Data, Cloud Computing, Software

  • A bachelor’s degree in computer science, information technology, or a closely related discipline is required; a master’s degree is preferred.
  • Prior experience in data modelling, database design, and data administration is required.
  • Knowledge of data warehousing concepts and proficiency in various database systems (e.g., SQL, NoSQL).
  • Knowledge of data modelling tools such as Erwin or Microsoft Visio.
  • Knowledge of ETL methods and technologies (e.g., Apache NiFi, Talend, Informatica).
  • Strong problem-solving and analytical skills.
  • Excellent communication skills for cross-functional collaboration.
  • Understanding of data governance principles, data security, and regulatory compliance.
  • Knowledge of programming languages such as Python or Java can be advantageous.

  • Creating data models that specify how data is formatted, stored, and retrieved within an organization.
  • Creating and optimizing databases, including the selection of appropriate database management systems (DBMS).
  • Creating and maintaining data integration processes, ETL workflows, and data pipelines.
  • Collaborating with business analysts, data scientists, and other stakeholders to understand data requirements.
  • Establishing processes for monitoring and improving data quality.
  • Implementing data quality tools to detect and resolve data issues (see the sketch below).
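
The data-quality item above mentions detecting and resolving data issues. A minimal sketch of such a check, assuming pandas; the column names and rules are hypothetical, not part of the posting.

```python
# Minimal sketch of a data-quality check: flag null keys, duplicate
# identifiers, and out-of-range values. Column names are placeholders.
import pandas as pd


def check_quality(df: pd.DataFrame) -> list[str]:
    """Return human-readable descriptions of any data-quality violations."""
    issues = []
    if df["order_id"].isna().any():
        issues.append("order_id contains null values")
    if df["order_id"].duplicated().any():
        issues.append("order_id contains duplicate values")
    if (df["amount"] < 0).any():
        issues.append("amount contains negative values")
    return issues


if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 3.0]})
    print(check_quality(sample))  # reports the duplicate id and negative amount
```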

Leadership, Python, SQL, Erwin, ETL, Java, NoSQL, Communication Skills, Analytical Skills

Posted 3 months ago
Apply

πŸ“ India

🏒 Company: CloudHire πŸ‘₯ 11-50 | Recruiting, Web Design, Software

  • 7+ years of experience with AWS (preferred), GCP or Azure.
  • 10+ years of experience designing, building, and supporting near real-time data pipelines and analytical solutions using technologies such as Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS.
  • 10+ years of experience using object-oriented programming languages (e.g., .Net, Java, Python) for data delivery in near real-time and streaming analytics.
  • 10+ years of experience documenting business requirements and translating them into relational, non-relational, and dimensional data models using Erwin.
  • 10+ years of experience working in agile teams delivering data solutions.
  • 10+ years of experience developing Master Data Management (MDM) solutions.
  • 8+ years of experience delivering solutions on public cloud platforms, preferably Google Cloud.
  • Experience writing automated unit, integration, and acceptance tests for data interfaces and data pipelines.
  • Ability to quickly learn new technologies and determine their best applications.
  • Exceptional interpersonal skills, including teamwork, communication, and negotiation.

  • Integrate multiple databases together, including Snowflake schema, Star schema, and Network model.
  • Work with multiple message buses, such as Kafka and IBM MQ to targets like Redshift, Postgres, and MongoDB.
  • Discover appropriate workloads and utilize the right database for performance and functionality.
  • Design and deploy databases for scale based on request types.
  • Ensure database recovery with sequence and time constraints.
  • Collaborate with business and technology stakeholders to define future-state business capabilities and requirements, translating them into data architectures.
  • Partner with platform architects to ensure implementations follow principles, guidelines, and standards.
  • Analyze current technology environments for deficiencies and recommend solutions.
  • Design, implement, and maintain data services and interfaces, including real-time data pipelines using emerging technologies.
  • Develop continuous integration and deployment for data pipelines with automated testing.
  • Utilize workflow management platforms like Airflow.
  • Mentor and motivate the team to achieve organizational goals.
  • Advocate for agile practices to increase efficiency.
  • Maintain consistency with development standards.

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres

Posted 3 months ago
Apply

πŸ“ India

🏒 Company: CloudHire πŸ‘₯ 11-50 | Recruiting, Web Design, Software

  • 7+ years of experience with AWS (preferred), GCP, or Azure.
  • 10+ years designing and supporting real-time data pipelines and analytics solutions using tools like Postgres, Redshift, and others.
  • 10+ years using object-oriented languages (.Net, Java, Python) for streaming analytics.
  • 10+ years documenting business requirements and creating data models using Erwin.
  • 10+ years working in agile teams on data solutions.
  • 10+ years developing Master Data Management solutions.
  • 8+ years delivering solutions on public cloud platforms, preferably Google Cloud.
  • Experience writing automated tests for data interfaces and pipelines.
  • Exceptional interpersonal skills, including teamwork and communication.

  • Integrate multiple databases together, including Snowflake schema, Star schema, and Network model.
  • Work with message buses, such as Kafka and IBM MQ, to target databases like Redshift and Postgres.
  • Discover appropriate workloads to select the right database for performance.
  • Ensure database recovery with time constraints.
  • Collaborate with stakeholders to define future-state capabilities and translate them into data architectures.
  • Partner with platform architects for compliance with platform principles.
  • Analyze the technology environment to address deficiencies.
  • Design and maintain data services and real-time data pipelines.
  • Develop CI/CD for data pipelines, including automated testing.
  • Utilize workflow management platforms like Airflow.
  • Mentor and support the team to reach organizational goals.
  • Advocate for agile practices to improve delivery.

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres

Posted 4 months ago
Apply