
Data Architect

Posted 3 months ago (inactive)


πŸ’Ž Seniority level: Senior, 4+ years

🏒 Company: Effectual · πŸ‘₯ 11-50 · πŸ’° $2,145,823 Series A over 4 years ago · Infrastructure, Business Development, Risk Management, Brand Marketing, Marketing

⏳ Experience: 4+ years

Requirements:
  • Strong knowledge of database structures and data mining.
  • Familiarity with SQL, NoSQL, Hadoop, Spark, AWS, Azure.
  • Expertise in data modeling and design tools.
  • Deep knowledge in data modeling, warehousing, and database design.
  • Expertise in data management technologies and understanding of big data solutions.
  • Familiarity with languages and tools like SQL, UML, XML, enterprise architecture frameworks.
Responsibilities:
  • Design and create the overall data architecture including databases and processing systems.
  • Define data storage, consumption, integration, and management methodologies.
  • Ensure compliance of data strategies with regulatory requirements.
  • Align data architecture with business requirements by working with executives.
  • Optimize data retrieval and perform database maintenance tasks.
  • Collaborate with data scientists to meet analytics requirements.
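Nearly every role below asks for dimensional modeling and warehouse design. As an illustration only, here is a minimal star-schema sketch using Python's built-in sqlite3, with hypothetical table and column names: one fact table keyed to two dimension tables, plus the kind of rollup query the model exists to serve.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "EMEA"), (2, "Globex", "AMER")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 20240101, 100.0), (2, 1, 20240101, 50.0), (3, 2, 20240101, 75.0)])

# Rollup: revenue by region -- the consumption query the schema is shaped for.
rows = cur.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)  # [('AMER', 75.0), ('EMEA', 150.0)]
```

The same separation of facts from dimensions applies regardless of platform (Snowflake, Redshift, BigQuery); only the DDL dialect changes.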

Related Jobs


🏒 Company: Servant · πŸ‘₯ 11-50 · Consulting, Advice, Professional Services

Requirements:
  • 8+ years of experience in data architecture, data engineering, or related fields, with a strong track record of designing and implementing scalable data solutions.
  • Deep expertise in data modeling, performance tuning, and query optimization.
  • Strong SQL and Python skills for developing and optimizing ETL/ELT workflows.
  • Experience with modern data warehouses (e.g., Snowflake, BigQuery, Redshift, Databricks), but with the ability to apply general best practices across platforms.
  • Expertise in ETL/ELT tools such as Fivetran, Stitch, dbt, or Airflow.
  • Strong understanding of data quality, lineage tracking, and governance best practices.
  • Experience with cloud platforms (AWS, GCP, Azure) and related services (e.g., S3, Data Lakes, IAM security).
  • Excellent problem-solving skillsβ€”able to balance best practices with real-world constraints and deliver solutions quickly.
  • Collaborative mindset with the ability to work across teams and influence key decisions.
Responsibilities:
  • Design and implement a scalable, high-performance, and secure data architecture that supports operational reporting, advanced analytics, and business intelligence.
  • Integrate diverse data sources (e.g., Shopify, Funraise, Braze) into a unified architecture, ensuring seamless data flow across systems.
  • Continuously assess and refine data models and infrastructure to align with evolving business needs and long-term objectives.
  • Provide creative, efficient solutions for data challenges, balancing best practices with pragmatic decision-making.
  • Lead the migration and modernization of data platforms, ensuring minimal downtime and data integrity.
  • Guide data modeling, performance tuning, and storage optimization, ensuring scalability and efficiency.
  • Establish best practices for ETL/ELT workflows, balancing automation, monitoring, and fault tolerance.
  • Work with cloud-based data platforms (AWS, GCP, Azure) to design robust and cost-effective data solutions.
  • Stay informed on emerging technologies and recommend improvements to the data ecosystem.
  • Implement frameworks for data validation, monitoring, and anomaly detection to ensure reliability and trustworthiness.
  • Document data lineage, metadata, and governance policies to promote transparency and usability.
  • Develop comprehensive technical documentation for data architecture, pipelines, and governance processes.
  • Assess the cost implications of implementing and maintaining data solutions, considering factors such as infrastructure, licensing, operational expenses, and staffing requirements. Provide a holistic view of the total cost of ownership (TCO) for proposed architectures, ensuring solutions are scalable, efficient, and cost-effective.
  • Collaborate with stakeholders to balance performance, security, and budgetary constraints while optimizing the long-term value of data investments.
  • Act as a technical mentor and leader, guiding data engineers in architectural decisions and best practices.
  • Work closely with business stakeholders to translate complex data requirements into practical, scalable solutions.
  • Serve as a bridge between technical and non-technical teams, communicating ideas clearly and effectively.
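The ETL/ELT best practices this role names — automation, monitoring, fault tolerance — can be sketched as a toy orchestrator. This is not the Airflow or dbt API, just a dependency-free illustration (all task names hypothetical) of the contract those tools provide at scale: run tasks in dependency order, retrying each a bounded number of times.

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps, retries=2):
    """tasks: name -> callable; deps: name -> set of upstream task names."""
    completed = []
    for name in TopologicalSorter(deps).static_order():
        for attempt in range(retries + 1):
            try:
                tasks[name]()
                completed.append(name)
                break
            except Exception:
                if attempt == retries:  # fault-tolerance budget exhausted
                    raise RuntimeError(f"task {name} failed after {retries + 1} attempts")
    return completed

state = {"attempts": 0}
def extract():
    state["raw"] = [3, 1, 2]
def transform():  # flaky on purpose: fails once, succeeds on retry
    state["attempts"] += 1
    if state["attempts"] < 2:
        raise IOError("transient failure")
    state["clean"] = sorted(state["raw"])
def load():
    state["loaded"] = state["clean"]

order = run_pipeline(
    {"extract": extract, "transform": transform, "load": load},
    {"transform": {"extract"}, "load": {"transform"}},
)
print(order)  # ['extract', 'transform', 'load']
```

Real orchestrators add scheduling, alerting, and backfills on top of this core, but the dependency graph plus retry policy is the heart of it.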
Posted 4 days ago
πŸ”₯ Data Architect
Posted 11 days ago

🧭 Full-Time

πŸ” IT Consulting

🏒 Company: DMV IT Service

Requirements:
  • Deep understanding of Azure Synapse Analytics, Azure Data Factory, and related Azure data tools.
  • A minimum of 7 years of experience with large and complex database management systems.
  • Expertise in SQL, with proficiency in Python or PowerShell preferred.
  • Strong skills in data collection, storage, accessibility, and quality improvement processes.
  • Excellent communication skills for effective collaboration with both technical and non-technical team members.
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field, or equivalent work experience.
Responsibilities:
  • Evaluate and recommend data management processes to enhance performance and cost efficiency.
  • Design, prepare, and optimize data pipelines and workflows using Azure data tools.
  • Lead implementations of secure, scalable, and reliable Azure solutions.
  • Monitor Azure environments to optimize performance and cost.
  • Foster security best practices, access controls, and compliance standards for all data resources.
  • Design and optimize fact and dimension table models.
  • Develop and maintain data pipelines, ensuring data accessibility and quality.
  • Provide expert guidance on Data Vault 2.0 methodologies using Wherescape automation software.
  • Perform knowledge transfer and document Azure architectures and solutions effectively.
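This role's Data Vault 2.0 responsibility centers on hubs keyed by deterministic hashes of business keys. As a rough sketch under assumed conventions (the table and column names are hypothetical, and production implementations typically use the warehouse engine rather than Python), the core idea looks like this:

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Deterministic hash of normalized business keys (hub surrogate key)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

hub_customer = {}

def load_hub(business_key: str, record_source: str) -> str:
    hk = hub_hash_key(business_key)
    # Insert-only: a hub row is written once per business key, never updated.
    hub_customer.setdefault(hk, {"customer_id": business_key.strip().upper(),
                                 "record_source": record_source})
    return hk

k1 = load_hub("cust-001", "CRM")
k2 = load_hub(" CUST-001 ", "ERP")   # same business key after normalization
print(k1 == k2, len(hub_customer))   # True 1
```

Because the key is derived from the normalized business key alone, loads from different source systems (CRM, ERP) converge on one hub row, which is what makes parallel, restartable loads possible in Data Vault.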
πŸ”₯ Sr Data Architect
Posted 15 days ago

🏒 Company: RecargaPay · πŸ‘₯ 501-1000 · πŸ’° $10,000,000 Debt Financing over 2 years ago · Mobile Payments, Financial Services, FinTech

Requirements:
  • Data Modeling (conceptual, logical, and physical)
  • SQL and NoSQL Databases (PostgreSQL, MySQL, Oracle, ElasticSearch)
  • Data Warehousing and Data Lakes Architecture (Snowflake, Redshift, BigQuery, Hadoop)
  • ETL/ELT and Data Integration (Apache Airflow, DBT)
  • Big Data and Distributed Processing (Spark, Kafka, Flink, Databricks)
  • Cloud Computing (AWS, Azure, Google Cloud – services like S3, Glue, Data Factory, BigQuery)
  • Data Security and Governance (LGPD, GDPR, RBAC, data encryption)
  • APIs and System Integration (REST, GraphQL, gRPC)
  • Agile Methodologies (Scrum, Kanban)
Responsibilities:
  • Define and design the company's data architecture, aligned with best practices and business needs.
  • Create conceptual, logical, and physical data models, ensuring data structuring and integrity.
  • Develop robust and scalable data pipelines, optimizing ETL/ELT processes for Big Data and Data Warehousing.
  • Lead data governance initiatives, establishing quality, security, and compliance policies (LGPD, GDPR).
  • Ensure interoperability between systems and platforms, creating architectures that enable integration across multiple data sources.
  • Support the selection of technologies and tools for data storage, processing, and analysis (Data Lakes, Data Warehouses, Cloud, etc.).
  • Collaborate with cross-functional teams (Data Engineers, Data Scientists, BI Analysts) to ensure data efficiency and quality.
  • Conduct audits and optimizations to improve the performance of databases and data solutions.
  • Document processes, architecture, and data standards to ensure reproducibility and maintenance.
  • Act as a technical mentor, training other professionals and ensuring the dissemination of best practices in data architecture.
πŸ”₯ Data Architect
Posted 19 days ago

πŸ“ Canada

🧭 Full-Time

🏒 Company: Top Hat · πŸ‘₯ 251-500 · πŸ’° $130,000,000 Series E about 4 years ago · πŸ«‚ Last layoff about 1 year ago · Education, EdTech, Mobile, Software

Requirements:
  • 5 years of experience with data modelling in an agile work environment.
  • Proficient in multiple modelling techniques.
  • In-depth understanding of database storage principles.
  • Alignment with a culture of experimentation and A/B testing.
  • Experience gathering and analyzing product requirements.
  • Knowledge of data mining and segmentation techniques.
  • Expertise in SQL and MySQL.
  • Creation of ETL Specifications to satisfy product requirements.
  • Testing of Data products.
  • Familiarity with ERwin data modelling tool.
  • Familiarity with big data systems like Hadoop, Spark and dbt.
  • Familiarity with database management systems like PostgreSQL, MongoDB.
  • Familiarity with data visualization tools.
  • Proven analytical skills.
  • Problem-solving attitude.
  • Comfortable with agile practices.
  • Possesses an experimentation mindset.
  • BSc in Computer Science or relevant field.
Responsibilities:
  • Analyze requirements and translate them to conceptual and logical data models for our applications and the data warehouse.
  • Optimize and/or perform migration on existing database systems.
  • Improve system performance by conducting tests, troubleshooting and integrating new elements.
  • Develop and maintain a data dictionary and metadata management strategy.
  • Develop and maintain data governance policies, procedures, and standards to ensure the integrity and accuracy of the data.
  • Coordinate with the Data Science and Revenue Operation teams to identify future needs and requirements.
  • Provide operational support for downstream business units.

PostgreSQL, SQL, Agile, Data Analysis, Data Mining, Erwin, ETL, Hadoop, MongoDB, MySQL, NoSQL, Spark, Analytical Skills, Data Visualization, Data Modeling

πŸ”₯ Data Architect
Posted 25 days ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 76,000 - 149,000 USD per year

πŸ” Software Development

🏒 Company: MongoDB · πŸ‘₯ 1001-5000 · πŸ’° Post-IPO Equity about 7 years ago · Database, Open Source, Cloud Computing, SaaS, Software

Requirements:
  • At least 3 years of experience building data models to support structured and semi-structured product data, including telemetry
  • Highly proficient in SQL.
  • Experienced with at least one distributed query engine or cloud-based data warehouse, such as Trino or BigQuery
  • Familiarity with Python and rudimentary Prompt Engineering
  • Well-versed in core data lake, data warehousing and data security concepts
  • Familiar with data catalog functionality, business glossaries, metadata management and data lineage
  • Familiar with cloud computing concepts and big data storage formats such as Parquet and Iceberg
Responsibilities:
  • Partner with Product Owners and other Stakeholders to document data requirements, utilizing your experience to provide insights and add value to the process
  • Validate and assess incoming product telemetry data, ensuring alignment with documentation and meeting Product Owners and other Stakeholders' expectations by thorough cross-referencing and testing
  • Design data models and schemas within the internal data lake and data warehouse that will be used for analytical and data science consumption
  • Create and execute test plans to ensure data is accurate and performance meets SLAs
  • Scope and create SQL-based pipelines prioritizing data consistency and accuracy
  • Implement data security standards for data access
  • Document metadata and lineage in the enterprise data catalog, promoting self-service data discovery and exploration
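Validating incoming telemetry against documented expectations, as this role describes, usually reduces to cross-checking each event against a declared schema before it lands in the lake. A minimal sketch, with entirely hypothetical field names and rules:

```python
EXPECTED_SCHEMA = {
    "event":       str,
    "timestamp":   int,           # epoch milliseconds
    "duration_ms": (int, float),
}

def validate_event(event: dict) -> list:
    """Return a list of human-readable problems; an empty list means the event passes."""
    problems = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in event:
            problems.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            problems.append(f"bad type for {field}: {type(event[field]).__name__}")
    # Domain rule beyond type checks: durations cannot be negative.
    if isinstance(event.get("duration_ms"), (int, float)) and event["duration_ms"] < 0:
        problems.append("duration_ms must be non-negative")
    return problems

good = {"event": "query", "timestamp": 1700000000000, "duration_ms": 12.5}
bad  = {"event": "query", "duration_ms": -3}
print(validate_event(good))  # []
print(validate_event(bad))   # ['missing field: timestamp', 'duration_ms must be non-negative']
```

In practice the schema would live in the data catalog alongside lineage metadata, and failures would feed the monitoring and anomaly-detection layer rather than a print statement.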

Python, SQL, Cloud Computing, Data Analysis, ETL, MongoDB, Data Engineering, NoSQL, Communication Skills, Analytical Skills, RESTful APIs, Data Visualization, Data Modeling, Data Management

πŸ”₯ Data Architect - Remote
Posted about 1 month ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 100,000 - 120,000 USD per year

πŸ” Information Technology

🏒 Company: New Era Technology · πŸ‘₯ 1001-5000 · Internet, Information Technology, Audio/Visual Equipment

Requirements:
  • 8+ years of experience in data architecture
  • Strong knowledge of Database Design Principles and Data Warehousing
  • Experience with SQL and NoSQL databases
  • Knowledge of Big Data technologies and Cloud Storage solutions
  • Ability to develop and present ideas to management
  • Deep knowledge of Enterprise Data Lake/Lakehouse and related tools
Responsibilities:
  • Facilitate organizational change through research and technical strategy definition
  • Conduct research and develop data and visualization roadmaps
  • Communicate best practices in data, ETL, BI, and analytics
  • Assist in evaluating business demands for software and technologies
  • Provide technical design and architectural leadership
  • Collaborate with project teams for solution implementation

SQL, Business Intelligence, Data Mining, ETL, NoSQL, Data Visualization, Data Modeling

πŸ”₯ Data Architect
Posted about 1 month ago

πŸ“ WA, MA, CA, UT

πŸ’Έ 120,000 - 165,000 USD per year

πŸ” Software Development

🏒 Company: Shape Therapeutics · πŸ‘₯ 101-250 · πŸ’° $112,000,000 Series B over 3 years ago · Biotechnology

Requirements:
  • 5+ years of experience building and implementing data systems, with exposure to scientific or biotech domains preferred
  • Extensive experience with relational and non-relational databases (PostgreSQL, MySQL, MongoDB, etc.)
  • Proficiency in Python and modern data engineering tools
  • An ability to communicate effectively and collaborate with scientific teams
  • Experience with or exposure to: Cloud platforms (AWS) and data infrastructure, Scientific platforms (e.g., Benchling), Enterprise data warehousing (e.g., Redshift, Snowflake, Databricks), Modern data orchestration and pipeline tools (e.g. Apache Airflow, Prefect, etc.), Scientific data standards (ADF, UDM), Data visualization and analytics
Responsibilities:
  • Design and implement the data architecture that will form the foundation of our biotech data ecosystem, including standardized pipelines for scientific datasets
  • Develop automated ETL pipelines to handle complex scientific data from sequencing experiments, imaging instruments, manufacturing workflows, and electronic lab notebooks
  • Partner with our experimental, computational, and machine learning teams to create data storage solutions that accelerate scientific discovery
  • Architect and implement intelligent data lifecycle management, including automated archiving workflows
  • Maintain detailed technical documentation for pipelines, models, and processes to ensure reproducibility and knowledge sharing
  • Help establish data governance frameworks that enable team-wide collaboration
  • Contribute to data modeling initiatives and establish metadata standards to ensure our data assets are FAIR
  • Work cross-functionally to ensure our data solutions comply with security and compliance requirements

AWS, PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, Data Analysis, ETL, Machine Learning, MongoDB, Data Engineering, Data Science, Data Visualization, Data Modeling, Data Management

πŸ”₯ Sr. Data Architect
Posted about 1 month ago

πŸ“ USA

🧭 Full-Time

πŸ” Foreign Exchange payments processing

🏒 Company: Convera · πŸ‘₯ 1001-5000 · Financial Services, Payments, Finance, FinTech

Requirements:
  • Bachelor's degree in Computer Science or related field
  • 10+ years of experience in data architecture
  • Strong knowledge of database design and data modeling
  • Hands-on experience with Snowflake and SQL
  • Proficiency with data modeling tools
  • Familiarity with regulatory compliance
  • Experience with Big Data technologies
  • Strong communication and collaboration skills
Responsibilities:
  • Design and implement scalable data architecture solutions
  • Partner with stakeholders to understand data requirements
  • Collaborate on data integration and governance
  • Develop semantic models for reporting needs
  • Monitor and optimize data systems performance
  • Integrate AI and ML technologies into architecture
  • Document data architecture and best practices

AWS, Python, SQL, ETL, Machine Learning, Snowflake, Tableau, Data Modeling


πŸ“ United States

🧭 Full-Time

πŸ’Έ 175,000 - 200,000 USD per year

πŸ” Data Engineering

🏒 Company: Astronomer · πŸ‘₯ 251-500 · πŸ’° $213,000,000 Series C about 3 years ago · πŸ«‚ Last layoff almost 2 years ago · Cloud Data Services, Big Data, Data Integration, Machine Learning, Analytics, Information Technology, Enterprise Software, Software

Requirements:
  • Experience with Apache Airflow in production environments
  • Experience in designing and implementing ETL, Data Warehousing, and ML/AI use cases
  • Proficiency in Python
  • Knowledge of cloud-native data architecture
  • Demonstrated technical leadership
Responsibilities:
  • Work directly with Astronomer customers
  • Participate in pre-sales motions
  • Build architecture and operational diagrams
  • Provide reference implementations
  • Collaborate to build reusable assets
  • Interact with Product and Engineering stakeholders
  • Establish strong relationships with key customer stakeholders

Python, Apache Airflow, ETL

Posted about 2 months ago

πŸ“ Vietnam

Requirements:
  • Strong background in Databricks implementation.
  • Experience architecting complex data systems in large-scale environments.
  • Successful delivery of complex data-related projects end to end.
  • Experience in Agile and DevSecOps delivery and quality practices.
  • Experience with cloud-based technologies.
  • Familiarity with JIRA, Confluence, GitHub, and Bitbucket.
Responsibilities:
  • Configure and maintain Databricks clusters and workspaces.
  • Implement and manage access controls and security policies to protect data.
  • Provide technical support to Databricks engineers and other users.
  • Collaborate with data engineers to integrate Databricks with data sources.
  • Optimize cluster configurations for performance and cost-effectiveness.
  • Monitor cluster performance and troubleshoot issues.
  • Perform configuration management using Terraform.
  • Define DevSecOps strategy and evaluate CICD pipelines.

Agile, Cloud Computing, CI/CD, Terraform

Posted about 2 months ago
