Remote Working

Remote working from home provides convenience and freedom, a lifestyle embraced by millions of people around the world. With our platform, finding the right job, whether full-time or part-time, becomes quick and easy thanks to AI, precise filters, and daily updates. Sign up now and start your online career today!

Remote IT Jobs: Erwin
7 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

📍 United States

💸 86,840 - 130,000 USD per year

🔍 Financial Crimes Risk Management

  • 7+ years of related business analysis experience required
  • Proficiency in SQL and/or equivalent computer languages; programming-level proficiency preferred
  • Experience with data modeling and/or data transformation tools preferred (Erwin, SQL Developer, Informatica)
  • Plan, manage, lead and oversee the end-to-end delivery of requirements throughout the lifecycle of the project in alignment with the business and/or enterprise needs and strategies.
  • Provide leadership and work collaboratively with stakeholders including business, technology and finance partners to support project benefits and changes to business processes, policies and systems across single or multiple Lines of Business (LoB).
  • Leads Requirements Management / work packages for Tier 2, high-risk, strategic, and regulatory projects or programs, and may lead Requirements Management for Tier 1 projects/programs

Project Management, SQL, Business Analysis, Data Analysis, Erwin, Communication Skills, Problem Solving, Data modeling

Posted about 12 hours ago
Apply

📍 United States

🔍 Software Development

🏢 Company: ge_externalsite

  • Exposure to industry standard data modeling tools (e.g., ERWin, ER Studio, etc.).
  • Exposure to Extract, Transform & Load (ETL) tools like Informatica or Talend
  • Exposure to industry standard data catalog, automated data discovery, and data lineage tools (e.g., Alation, Collibra, TAMR, etc.)
  • Hands-on experience in programming languages like Java, Python or Scala
  • Hands-on experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or HiveQL
  • Experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (e.g., Cassandra or HBase)
  • Exposure to unstructured datasets and ability to handle XML, JSON file formats
  • Work independently as well as with a team to develop and support Ingestion jobs
  • Evaluate and understand various data sources (databases, APIs, flat files, etc.) to determine optimal ingestion strategies
  • Develop a comprehensive data ingestion architecture, including data pipelines, data transformation logic, and data quality checks, considering scalability and performance requirements.
  • Choose appropriate data ingestion tools and frameworks based on data volume, velocity, and complexity
  • Design and build data pipelines to extract, transform, and load data from source systems to target destinations, ensuring data integrity and consistency
  • Implement data quality checks and validation mechanisms throughout the ingestion process to identify and address data issues
  • Monitor and optimize data ingestion pipelines to ensure efficient data processing and timely delivery
  • Set up monitoring systems to track data ingestion performance, identify potential bottlenecks, and trigger alerts for issues
  • Work closely with data engineers, data analysts, and business stakeholders to understand data requirements and align ingestion strategies with business objectives.
  • Build technical data dictionaries and support business glossaries to analyze the datasets
  • Perform data profiling and data analysis for source systems, manually maintained data, machine generated data and target data repositories
  • Build both logical and physical data models for both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) solutions
  • Develop and maintain data mapping specifications based on the results of data analysis and functional requirements
  • Perform a variety of data loads & data transformations using multiple tools and technologies.
  • Build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications
  • Maintain metadata structures needed for building reusable Extract, Transform & Load (ETL) components.
  • Analyze reference datasets and familiarize with Master Data Management (MDM) tools.
  • Analyze the impact of downstream systems and products
  • Derive solutions and make recommendations from deep dive data analysis.
  • Design and build the Data Quality (DQ) rules needed; a minimal ingestion-and-validation sketch follows this list
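
To make the ingestion and data quality responsibilities above concrete, here is a minimal illustrative sketch in Python. It assumes a hypothetical CSV flat-file source and a PostgreSQL staging table; the file layout, table, and column names are placeholders, not details taken from the posting.

```python
# Minimal ingestion sketch: extract rows from a flat file, apply a simple
# data quality check, and load the valid rows into a PostgreSQL staging table.
# All names (file layout, table, columns) are hypothetical placeholders.
import csv

import psycopg2  # assumes the psycopg2 driver is installed

REQUIRED_FIELDS = ("id", "amount", "created_at")

def read_rows(path):
    """Extract: stream records from a CSV source file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def is_valid(row):
    """Data quality check: every required field must be present and non-empty."""
    return all(row.get(field) not in (None, "") for field in REQUIRED_FIELDS)

def ingest(path, dsn):
    """Load valid rows into the staging table; count and report rejected records."""
    loaded, rejected = 0, 0
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for row in read_rows(path):
            if not is_valid(row):
                rejected += 1
                continue
            cur.execute(
                "INSERT INTO staging.transactions (id, amount, created_at) "
                "VALUES (%s, %s, %s)",
                (row["id"], row["amount"], row["created_at"]),
            )
            loaded += 1
    print(f"loaded {loaded} rows, rejected {rejected} rows")

if __name__ == "__main__":
    ingest("transactions.csv", "dbname=analytics user=etl")
```

In a production pipeline the same extract-validate-load shape would typically live inside an ETL tool or an orchestration framework rather than a standalone script.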

AWS, PostgreSQL, Python, SQL, Apache Airflow, Apache Hadoop, Data Analysis, Data Mining, Erwin, ETL, Hadoop HDFS, Java, Kafka, MySQL, Oracle, Snowflake, Cassandra, Clickhouse, Data engineering, Data Structures, REST API, NoSQL, Spark, JSON, Data visualization, Data modeling, Data analytics, Data management

Posted 3 days ago
Apply
🔥 Solutions Architect

📍 UK

🔍 Life Sciences

  • Proven experience in Infrastructure Architecture design within the context of a wider enterprise architecture framework.
  • Experience in Solution Architecture or a willingness to learn.
  • Knowledge of the Purdue model and OT network protocols, as well as experience working with industrial manufacturing systems (PLCs and SCADA) would be beneficial.
  • Hands-on cloud infrastructure experience (Azure, AWS), network segregation, and/or software-defined-WAN experience will be a distinct asset.
  • Ability to interpret complex technical information in relation to networks and tailor implementation to best suit the LGC environment.
  • Experience in writing and presenting technical designs to meet business needs.
  • Familiarity with Architectural frameworks such as TOGAF or Zachman and Enterprise Architecture tools, such as Erwin Evolve, LeanX, or others.
  • Experience working within regulatory frameworks like GxP or ISO would be beneficial.
  • Experience in manufacturing and life sciences industries would also be beneficial.
  • Actively supporting Business and IT projects as an Architecture SME for Infrastructure and business solutions.
  • Developing Infrastructure and Solutions Architectures in line with LGC's Architecture principles and IT policies.
  • Ensuring architectural design considerations are effectively embedded into delivery planning while aligning with business outcomes.
  • Providing mentorship and expertise to members of the wider IT organization and business community in the design and implementation of enterprise solutions.
  • Proactively finding opportunities that will enable LGC to meet its business and IT goals, aligned with common architectural principles.
  • Embedding common ways of working, tooling, and reporting across the LGC architecture community aligned with a standard framework of artefacts.
  • Supporting the ongoing continuous improvement of LGC's architecture capability to ensure it meets the dynamic needs of the organization's ambitions.

AWS, Erwin, Azure

Posted about 1 month ago
Apply

📍 India

🧭 Full-Time

💸 3,500,000 - 3,700,000 INR per year

🔍 Technology

🏢 Company: CloudHire 👥 11-50 | Recruiting, Web Design, Software

  • 7+ years of experience with AWS, GCP, or Azure.
  • 10+ years of experience in designing and supporting real-time data pipelines.
  • Experience with Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS.
  • 10+ years using object-oriented languages (.Net, Java, Python).
  • Experience in developing MDM solutions and working on agile teams.
  • Integrate multiple databases including Snowflake schema and Star schema.
  • Work with message buses like Kafka to databases such as Redshift and Postgres.
  • Discover appropriate workloads and select the suitable database for performance.
  • Collaborate with stakeholders to define future-state business capabilities.
  • Analyze current technology to identify deficiencies and recommend improvements.
  • Design, implement, and maintain real-time data pipelines and services.
  • Mentor and support the team to achieve organizational goals.

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres, Mentoring, Negotiation

Posted 4 months ago
Apply

📍 India

🧭 Full-Time

💸 3,000,000 - 3,700,000 INR per year

🔍 Remote Employee Provider

🏢 Company: CloudHire 👥 11-50 | Recruiting, Web Design, Software

  • 7+ years experience with AWS, GCP, or Azure preferred.
  • 10+ years experience designing and supporting data pipelines with PostgreSQL, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS.
  • 10+ years experience using object-oriented languages such as .Net, Java, Python.
  • 10+ years experience documenting business requirements and translating them into data models.
  • 10+ years experience working on agile teams.
  • 10+ years experience developing MDM solutions.
  • 8+ years experience delivering solutions on cloud platforms, preferably Google Cloud.
  • Experience with automated testing for data pipelines.
  • Exceptional interpersonal skills.
  • Integrate multiple databases together using various schema types.
  • Work with message buses to deliver data to different targets.
  • Identify appropriate workloads and determine database effectiveness.
  • Design and deploy database systems for scalability.
  • Collaborate with stakeholders on business capabilities and data architectures.
  • Analyze technology environments to recommend improvements.
  • Implement and maintain data services and real-time pipelines.
  • Develop CI/CD for data pipelines with testing automation; a minimal test sketch follows this list.
  • Mentor the team and advocate agile practices.
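
As a small illustration of the testing-automation responsibility above, here is a hedged sketch of unit tests for a hypothetical pipeline transformation, runnable with pytest; the function and field names are assumptions, not taken from the posting.

```python
# Minimal sketch of automated tests for a data pipeline transformation.
# The transformation and its field names are hypothetical placeholders.
import pytest

def normalize_amount(record: dict) -> dict:
    """Example transformation: cast 'amount' to float and reject negative values."""
    amount = float(record["amount"])
    if amount < 0:
        raise ValueError("amount must be non-negative")
    return {**record, "amount": amount}

def test_normalize_amount_casts_string_values():
    assert normalize_amount({"id": 1, "amount": "10.5"})["amount"] == 10.5

def test_normalize_amount_rejects_negative_values():
    with pytest.raises(ValueError):
        normalize_amount({"id": 2, "amount": "-3"})
```

Checks of this kind are what a CI/CD pipeline would run automatically before a pipeline change is deployed.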

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres, Agile methodologies

Posted 5 months ago
Apply
🔥 Database Architect

📍 India

🧭 Full-Time

💸 30,000 - 37,000 USD per year

🔍 Software Development

🏢 Company: CloudHire 👥 11-50 | Recruiting, Web Design, Software

  • 7+ years with AWS (preferred), GCP or Azure
  • 10+ years of experience using standard methodologies to design, build, and support near real-time data pipelines and analytical solutions using Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI and/or SSIS
  • 10+ years of experience using object-oriented languages (.Net, Java, Python) to deliver data for near real-time, streaming analytics.
  • 10+ years of experience working with partners documenting business requirements and translating those requirements into relational, non-relational, and dimensional data models using Erwin
  • 10+ years of experience working on agile teams delivering data solutions
  • 10+ years of experience developing MDM solutions
  • 8+ years of experience in delivering solutions on public cloud platforms (Google Cloud preferred)
  • Experience writing automated unit, integration, and acceptance tests for data interfaces & data pipelines
  • Ability to quickly comprehend the functions and capabilities of new technologies, and identify the most appropriate use for them
  • Exceptional interpersonal skills, including teamwork, communication, and negotiation
  • Integrate multiple databases together using Snowflake schema, Star schema, Network model, and other designs.
  • Work with multiple message buses (Kafka, IBM MQ) delivering to targets like Redshift, Postgres, and MongoDB.
  • Discover appropriate workloads and use the appropriate database to deliver the performance and functionality needed.
  • Design and deploy for scale, considering the types of requests the database must serve.
  • Handle database recovery with sequence and time constraints.
  • Collaborating directly with business and technology stakeholders to define future-state business capabilities & requirements and translating those into transitional and target state data architectures.
  • Partnering with platform architects to ensure implementations meet published platform principles, guidelines, and standards.
  • Analyzing the current technology environment to detect critical deficiencies and recommend solutions for improvement.
  • Designing, implementing, and maintaining data services, interfaces, and real-time data pipelines via the practical application of existing, new, and emerging technologies and data engineering techniques
  • Developing continuous integration and continuous deployment for data pipelines that include automated unit & integration testing
  • Use workflow management platforms such as Airflow; a minimal DAG sketch follows this list
  • Mentoring, motivating, and supporting the team to achieve organizational objectives and goals
  • Advocating for agile practices to increase delivery throughput.
  • Creating, maintaining, and ensuring consistency with published development standards
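
To illustrate the Airflow-style workflow management referenced in the list above, here is a minimal DAG sketch wiring an extract, validate, and load sequence. The DAG id, schedule, and task bodies are placeholder assumptions (Airflow 2.4+ syntax), not details taken from the posting.

```python
# Minimal Airflow DAG sketch: extract -> validate -> load.
# Task logic is placeholder; in a real pipeline the tasks would read from a
# message bus (e.g., Kafka) and write to a target such as Redshift or Postgres.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    # Placeholder extract step; returns a small batch via XCom.
    return [{"id": 1, "amount": 42.0}]

def validate(ti, **_):
    # Placeholder data quality gate on the extracted batch.
    rows = ti.xcom_pull(task_ids="extract")
    assert all("id" in r and "amount" in r for r in rows), "missing required fields"

def load(ti, **_):
    # Placeholder load step; report how many rows would be written.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"loading {len(rows)} rows")

with DAG(
    dag_id="example_pipeline",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",             # assumes the Airflow 2.4+ 'schedule' argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> validate_task >> load_task
```

Keeping each step as a separate task makes failures visible per stage and lets the automated checks described above run against the task callables in isolation.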

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres, Agile methodologies

Posted 5 months ago
Apply

📍 India

🏢 Company: CloudHire 👥 11-50 | Recruiting, Web Design, Software

  • 7+ years of experience with AWS (preferred), GCP or Azure.
  • 10+ years of experience designing, building, and supporting near real-time data pipelines and analytical solutions using technologies such as Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS.
  • 10+ years of experience using object-oriented programming languages (e.g., .Net, Java, Python) for data delivery in near real-time and streaming analytics.
  • 10+ years of experience documenting business requirements and translating them into relational, non-relational, and dimensional data models using Erwin.
  • 10+ years of experience working in agile teams delivering data solutions.
  • 10+ years of experience developing Master Data Management (MDM) solutions.
  • 8+ years of experience delivering solutions on public cloud platforms, preferably Google Cloud.
  • Experience writing automated unit, integration, and acceptance tests for data interfaces and data pipelines.
  • Ability to quickly learn new technologies and determine their best applications.
  • Exceptional interpersonal skills, including teamwork, communication, and negotiation.
  • Integrate multiple databases together, including Snowflake schema, Star schema, and Network model.
  • Work with multiple message buses, such as Kafka and IBM MQ, delivering to targets like Redshift, Postgres, and MongoDB.
  • Discover appropriate workloads and utilize the right database for performance and functionality.
  • Design and deploy databases for scale based on request types.
  • Ensure database recovery with sequence and time constraints.
  • Collaborate with business and technology stakeholders to define future-state business capabilities and requirements, translating them into data architectures.
  • Partner with platform architects to ensure implementations follow principles, guidelines, and standards.
  • Analyze current technology environments for deficiencies and recommend solutions.
  • Design, implement, and maintain data services and interfaces, including real-time data pipelines using emerging technologies.
  • Develop continuous integration and deployment for data pipelines with automated testing.
  • Utilize workflow management platforms like Airflow.
  • Mentor and motivate the team to achieve organizational goals.
  • Advocate for agile practices to increase efficiency.
  • Maintain consistency with development standards.

AWS, Python, SQL, Agile, Erwin, GCP, Hadoop, Java, Kafka, MongoDB, Snowflake, Airflow, Azure, Data engineering, .NET, Postgres

Posted 5 months ago
Apply

Ready to Start Your Remote Journey?

Apply to 5 jobs per day for free, or get unlimited applications with a subscription starting at €5/week.

Why do Job Seekers Choose Our Platform for Remote Work Opportunities?

We've developed a well-thought-out service for home job matching, making the search process easier and more efficient.

AI-powered Job Processing and Advanced Filters

Our algorithms process thousands of job postings daily, extracting only the key information from each listing. This allows you to skip lengthy texts and focus only on the offers that match your requirements.

With powerful skill filters, you can specify your core competencies to instantly receive a selection of job opportunities that align with your experience. 

Search by Country of Residence

For those looking for fully remote jobs in their own country, our platform offers the ability to customize the search based on your location. This is especially useful if you want to adhere to local laws, consider time zones, or work with employers familiar with local specifics.

If necessary, you can also work remotely with employers from other countries without being limited by geographical boundaries.

Regular Data Updates

Our platform features over 40,000 remote work offers with full-time or part-time positions from 7,000 companies. This wide range ensures you can find offers that suit your preferences, whether from startups or large corporations.

We regularly verify the validity of vacancy listings and automatically remove outdated or filled positions, ensuring that you only see active and relevant opportunities.

Job Alerts

Once you register, you can set up convenient notification methods, such as receiving tailored job listings directly to your email or via Telegram. This ensures you never miss out on a great opportunity.

Our job board allows you to apply for up to 5 vacancies per day absolutely for free. If you wish to apply for more, you can choose a suitable subscription plan with weekly, monthly, or annual payments.

Wide Range of Completely Remote Online Jobs

On our platform, you'll find fully remote work positions in the following fields:

  • IT and Programming – software development, website creation, mobile app development, system administration, testing, and support.
  • Design and Creative – graphic design, UX/UI design, video content creation, animation, 3D modeling, and illustrations.
  • Marketing and Sales – digital marketing, SMM, contextual advertising, SEO, product management, sales, and customer service.
  • Education and Online Tutoring – teaching foreign languages, school and university subjects, exam preparation, training, and coaching.
  • Content – creating written content for websites, blogs, and social media; translation, editing, and proofreading.
  • Administrative Roles (Assistants, Operators) – virtual assistance, work organization support, calendar management, and document workflow assistance.
  • Finance and Accounting – bookkeeping, reporting, financial consulting, and taxes.

Other careers include: online consulting, market research, project management, and technical support.

All Types of Employment

The platform offers online remote jobs with different types of work:

  • Full-time – the ideal choice for those who value stability and predictability;
  • Part-time – perfect for those looking for a side home job or seeking a balance between work and personal life;
  • Contract – suited for professionals who want to work on projects for a set period;
  • Temporary – short-term work that can be either full-time or part-time; these positions are often offered for seasonal or urgent tasks;
  • Internship – a form of on-the-job training that allows you to gain practical experience in your chosen field.

Whether you're looking for stable full-time employment, the flexibility of freelancing, or a part-time side gig, you'll find plenty of options on Remoote.app.

Remote Working Opportunities for All Expertise Levels

We feature offers for people with all levels of expertise:

  • For beginners – ideal positions for those just starting their journey in working online from home;
  • For intermediate specialists – if you already have experience, you can explore positions requiring specific skills and knowledge in your field;
  • For experts – roles for highly skilled professionals ready to tackle complex tasks.

How to Start Your Online Job Search Through Our Platform?

To begin searching for home job opportunities, follow these three steps:

  1. Register and complete your profile. This process takes minimal time.
  2. Specify your skills, country of residence, and preferred position.
  3. Receive notifications about new vacancy openings and apply to suitable ones.

If you don't have a resume yet, use our online builder. It will help you create a professional document, highlighting your key skills and achievements. The AI will automatically optimize it to match job requirements, increasing your chances of a successful response. You can update your profile information at any time: modify your skills, add new preferences, or upload an updated resume.