Data Architect

Posted about 2 months ago

πŸ’Ž Seniority level: Senior, 6+ years

πŸ“ Location: United States, Canada

πŸ” Industry: Software Development

πŸ—£οΈ Languages: English

⏳ Experience: 6+ years

πŸͺ„ Skills: AWS, Python, SQL, Microsoft SQL Server, Data modeling

Requirements:
  • 6+ years of experience in similar positions
  • Strong expertise in AWS with a focus on Redshift
  • Advanced proficiency in SQL scripting
  • Hands-on experience in Python for data processing (see the sketch below)
  • Practical knowledge of SQL Server
  • Proven expertise in data architecture and modeling
Responsibilities:
  • Design and develop data models to support new database implementations
  • Create and optimize queries and scripts for established data models
  • Transform existing data models into robust data marts and data lakes
  • Collaborate within a team of data engineers
  • Contribute to ongoing database projects
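
The combination this listing asks for (Python for data processing, SQL scripting, and AWS Redshift) can be illustrated with a brief, purely hypothetical sketch; the table names, columns, and environment variables below are assumptions, not anything from the posting.

```python
# Hypothetical sketch: Python driving Redshift-flavored SQL through psycopg2.
import os

import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS analytics.fct_orders (
    order_id    BIGINT,
    customer_id BIGINT,
    order_total DECIMAL(12, 2),
    ordered_at  TIMESTAMP
)
DISTKEY (customer_id)
SORTKEY (ordered_at);
"""

APPEND = """
INSERT INTO analytics.fct_orders
SELECT order_id, customer_id, order_total, ordered_at
FROM staging.stg_orders
WHERE ordered_at >= %s;
"""


def load_orders(since: str) -> None:
    """Create the fact table if needed and append freshly staged rows."""
    conn = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],
        port=5439,
        dbname="analytics",
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    try:
        with conn, conn.cursor() as cur:  # commit on success, roll back on error
            cur.execute(DDL)
            cur.execute(APPEND, (since,))
    finally:
        conn.close()


if __name__ == "__main__":
    load_orders(since="2024-01-01")
```

The DISTKEY/SORTKEY choices stand in for the kind of data-modeling decision the responsibilities describe; real values would follow from actual query patterns.
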
Apply

Related Jobs

Apply

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 100,000 - 120,000 USD per year

πŸ” Information Technology

🏒 Company: New Era Technology πŸ‘₯ 1001-5000 | Internet, Information Technology, Audio/Visual Equipment

Requirements:
  • 8+ years of experience in data architecture
  • Strong knowledge of Database Design Principles and Data Warehousing
  • Experience with SQL and NoSQL databases
  • Knowledge of Big Data technologies and Cloud Storage solutions
  • Ability to develop and present ideas to management
  • Deep knowledge of Enterprise Data Lake/Lakehouse and related tools
Responsibilities:
  • Facilitate organizational change through research and technical strategy definition
  • Conduct research and develop data and visualization roadmaps
  • Communicate best practices in data, ETL, BI, and analytics
  • Assist in evaluating business demands for software and technologies
  • Provide technical design and architectural leadership
  • Collaborate with project teams for solution implementation

SQL, Business Intelligence, Data Mining, ETL, NoSQL, Data visualization, Data modeling

Posted 17 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 175,000 - 200,000 USD per year

πŸ” Data Engineering

🏒 Company: Astronomer πŸ‘₯ 251-500 πŸ’° $213,000,000 Series C almost 3 years ago πŸ«‚ Last layoff almost 2 years ago | Cloud Data Services, Big Data, Data Integration, Machine Learning, Analytics, Information Technology, Enterprise Software, Software

Requirements:
  • Experience with Apache Airflow in production environments (see the DAG sketch below)
  • Experience in designing and implementing ETL, Data Warehousing, and ML/AI use cases
  • Proficiency in Python
  • Knowledge of cloud-native data architecture
  • Demonstrated technical leadership
Responsibilities:
  • Work directly with Astronomer customers
  • Participate in pre-sales motions
  • Build architecture and operational diagrams
  • Provide reference implementations
  • Collaborate to build reusable assets
  • Interact with Product and Engineering stakeholders
  • Establish strong relationships with key customer stakeholders
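
Since this posting centers on Apache Airflow, a minimal DAG may help show what the requirement refers to. This is a hedged sketch assuming Airflow 2.4+ and the TaskFlow API; the DAG name and task bodies are placeholders, not Astronomer code.

```python
# Hypothetical sketch: a tiny extract-transform-load DAG with the TaskFlow API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["sketch"])
def example_etl():
    @task
    def extract() -> list[dict]:
        # Pull raw records from some source system (stubbed out here).
        return [{"id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # A trivial transformation standing in for real business logic.
        return [{**r, "amount_cents": int(r["amount"] * 100)} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Write to the warehouse; here the sketch just logs the row count.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


example_etl()
```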

Python, Apache Airflow, ETL

Posted 27 days ago
Apply

πŸ“ United States

🧭 Full-Time

πŸ’Έ 175,000 - 205,000 USD per year

πŸ” Software Development

🏒 Company: CoreWeave πŸ’° $642,000,000 Secondary Market over 1 year ago | Cloud Computing, Machine Learning, Information Technology, Cloud Infrastructure

Requirements:
  • Hands-on experience applying Kimball Dimensional Data Modeling principles to large datasets.
  • Expertise in working with analytical table/file formats, including Iceberg, Parquet, Avro, and ORC.
  • Proven experience optimizing MPP databases (StarRocks, Snowflake, BigQuery, Redshift).
  • Minimum 5+ years of programming experience in Python or Scala.
  • Advanced SQL skills, with a strong ability to write, optimize, and debug complex queries.
  • Hands-on experience with Airflow for batch orchestration and with distributed computing frameworks like Spark or Flink.
Responsibilities:
  • Develop and maintain data models, including star and snowflake schemas, to support analytical needs across the organization (see the sketch below).
  • Establish and enforce best practices for dimensional modeling in our Lakehouse.
  • Engineer and optimize data storage using analytical table/file formats (e.g., Iceberg, Parquet, Avro, ORC).
  • Partner with BI, analytics, and data science teams to design datasets that accurately reflect business metrics.
  • Tune and optimize data in MPP databases such as StarRocks, Snowflake, BigQuery, or Redshift.
  • Collaborate on data workflows using Airflow, building and managing pipelines that power our analytical infrastructure.
  • Ensure efficient processing of large datasets through distributed computing frameworks like Spark or Flink.
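
For readers unfamiliar with the Kimball and Iceberg vocabulary in this listing, the following hedged sketch declares one dimension and one fact table as Apache Iceberg tables from PySpark. The catalog name lake, the S3 warehouse path, and every table and column are illustrative assumptions, not CoreWeave's schema.

```python
# Hypothetical sketch: a star-schema dimension and fact table as Iceberg tables.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("star-schema-sketch")
    # Assumes the Iceberg Spark runtime jar is on the classpath.
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

# Conformed dimension: one row per customer version, surrogate key for joins.
spark.sql("""
CREATE TABLE IF NOT EXISTS lake.analytics.dim_customer (
    customer_key BIGINT,
    customer_id  STRING,
    region       STRING,
    valid_from   TIMESTAMP,
    valid_to     TIMESTAMP
) USING iceberg
""")

# Fact table: one row per order line, keyed to the dimensions and
# partitioned by day so queries can prune files.
spark.sql("""
CREATE TABLE IF NOT EXISTS lake.analytics.fct_order_line (
    order_line_id BIGINT,
    customer_key  BIGINT,
    product_key   BIGINT,
    quantity      INT,
    net_amount    DECIMAL(12, 2),
    ordered_at    TIMESTAMP
) USING iceberg
PARTITIONED BY (days(ordered_at))
""")
```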

AWS, Docker, Python, SQL, Cloud Computing, ETL, Kubernetes, Snowflake, Airflow, Algorithms, Apache Kafka, Data engineering, Data Structures, REST API, Spark, Communication Skills, Analytical Skills, Collaboration, CI/CD, RESTful APIs, DevOps, Terraform, Problem-solving skills, JSON, Scala, Data visualization, Ansible, Data modeling, Data analytics, Debugging

Posted about 1 month ago
Apply
πŸ”₯ Lead/Data Architect
Posted about 1 month ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 170,000 - 190,000 USD per year

πŸ” Healthcare technology

🏒 Company: Consensus Cloud Solutions πŸ‘₯ 501-1000 | Internet, Information Services, Information Technology

Requirements:
  • 7+ years of professional experience in a data architect or similar role.
  • Proficient in programming languages such as Python, with experience in frameworks like Apache Spark and Pandas (see the sketch below).
  • Solid understanding of data warehousing and dimensional modeling.
  • SQL proficiency and experience optimizing queries on databases like Postgres, Oracle, and Redshift.
  • 4+ years of proven experience leading a team.
  • Experience with cloud platforms and their data services.
Responsibilities:
  • Develop technical architecture of OLTP and Enterprise BI/EDW datastores.
  • Design and implement scalable and efficient data infrastructure including data pipelines.
  • Lead and mentor a team of data engineers.
  • Develop and maintain data governance policies for quality and compliance.
  • Collaborate with teams to identify and address data challenges.
  • Evaluate and select appropriate technologies for scalability.
  • Drive continuous improvement initiatives.
  • Lead code reviews and troubleshoot production issues.
  • Assist department manager in project management and team tasks.
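
As a small, purely hypothetical illustration of the Python/Pandas requirement above, a typical cleanup-and-rollup step might look like this; the file and column names are invented.

```python
# Hypothetical sketch: deduplicate raw events and roll them up for reporting.
import pandas as pd

events = pd.read_csv("raw_events.csv", parse_dates=["created_at"])

# Keep only the latest record per business key.
latest = (
    events.sort_values("created_at")
          .drop_duplicates(subset="event_id", keep="last")
)

# One row per customer per day for the reporting layer.
daily = (
    latest.assign(event_date=latest["created_at"].dt.date)
          .groupby(["customer_id", "event_date"], as_index=False)
          .agg(event_count=("event_id", "count"))
)

# Parquet output assumes pyarrow or fastparquet is installed.
daily.to_parquet("daily_events.parquet", index=False)
```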

Python, SQL, Apache Airflow, ETL, Oracle, Postgres

Posted about 1 month ago
Apply
πŸ”₯ Lead Data Architect
Posted about 2 months ago

πŸ“ USA

🧭 Regular

πŸ” Technology consultancy

🏒 Company: Referrals Only

Requirements:
  • Experience designing and implementing solutions on Databricks, Snowflake, Spark or EMR platforms.
  • Experience defining and implementing types of data architecture and technology stacks.
  • Exposure to designing application system architecture based on big data and AI technologies.
  • Experience in building, maintaining and tuning data platforms and data warehouse design.
  • Familiarity with common design patterns and foundational knowledge of distributed systems.
  • Proficiency in open-source distributed computing/storage technologies like YARN, Spark, Kafka, etc.
  • Understanding of data-driven techniques, advanced analytics, ML/AI and data mining applications.
  • Exposure to developing real-time and low-latency data streaming solutions.
  • Passionate about data infrastructure and operations, with expertise in cloud environments.
Responsibilities:
  • Navigate, with support, a data project's architectural concerns, enabling delivery teams to deliver on accepted standards within time and budget.
  • Provide client-facing technical leadership and guidance on topics related to data architecture, engineering, and analytics.
  • Interact with client counterparts from the enterprise architecture group to deliver, share, align, and sign off on key architectural decisions.
  • Communicate both high-level and low-level technical details of data architecture to engineers and business stakeholders.
  • Lead, with support, the technical design of data governance, data security and data privacy to fulfill compliance requirements.
  • Incorporate data quality frameworks and processes to meet requirements.
  • Collaborate with sales and pre-sales to clarify requirements and design viable solutions.

Cloud Computing, Data Mining, ETL, Machine Learning, Snowflake, Data engineering, Spark, Data modeling

Posted about 2 months ago
Apply

πŸ“ Eastern time

πŸ” HRTech

🏒 Company: Energage πŸ‘₯ 101-250 πŸ’° 4 months ago | Digital Media, Service Industry, Online Portals, SaaS

Requirements:
  • Experience leading data architecture in a complex environment for enterprise-class products.
  • Extensive experience with SQL Server and Azure.
  • Experience with cloud-based NoSQL data storage and processing services.
  • Expertise in modern data architecture patterns and approaches.
  • Experience with lean/agile principles.
  • Ability to work with various stakeholders and lead by influence.
Responsibilities:
  • Create comprehensive conceptual, logical, and physical data models to minimize redundancy and ensure data consistency and performance.
  • Analyze data flow across product suite and integration partners for optimized data pipelines.
  • Evaluate and recommend database technologies based on performance, scalability, and volume.
  • Define data access controls, encryption methods, and protection strategies for sensitive information.
  • Recommend and implement monitoring tools for data quality and performance analysis.

Agile, Cloud Computing, Data Analysis, ETL, Microsoft Azure, Microsoft SQL Server, NoSQL, Data modeling

Posted 2 months ago
Apply

πŸ“ Sioux Falls, SD, Scottsdale, AZ, Troy, MI, Franklin, TN, Dallas, TX

🧭 Full-Time

πŸ’Έ 99,501.91 - 183,764.31 USD per year

πŸ” Financial Services

🏒 Company: Pathward, N.A.

Requirements:
  • 12+ years of varied experience implementing Data Management including Data Quality, Data Modeling, and Data Integration.
  • 10+ years of enterprise experience in data architecture efforts across various disciplines.
  • 5+ years of experience with end-to-end Data Warehouses using technologies like SSIS, Power BI, and Tableau.
  • 4+ years of experience in the financial industry.
  • 2+ years of leading distributed teams in cloud environments.
  • Strong skills in SQL, performance optimization, data catalogs, and effective communication.
Responsibilities:
  • Drive the architectural effectiveness of data-related business initiatives within the organization.
  • Maintain and enhance the current state data architecture while planning future state architecture.
  • Develop technology roadmaps for smooth transitions.
  • Design data solutions addressing business capabilities.
  • Create solutions to optimize enterprise data infrastructure.
  • Advocate for data architecture process to business leaders.
  • Provide technical leadership and guidance to development teams.

SQL, ETL, Data modeling, Data management

Posted 3 months ago
Apply

πŸ“ Canada, United States, Latin America

πŸ” Cloud services

🏒 Company: Caylent πŸ‘₯ 251-500 πŸ’° Private over 2 years ago | IaaS, DevOps, Cloud Computing, Cloud Infrastructure

Requirements:
  • At least 4 years of experience in the AWS data landscape.
  • 10 years of experience in areas such as relational database design, data modeling, big data processing, machine learning, BI dashboards, and data governance.
  • Experience with infrastructure as code tools, CI/CD pipelines, and Python for analytics.
  • Strong drive towards standardizing and documenting approaches and solutions.
  • Excellent written and verbal communication skills.
Responsibilities:
  • Define strategic roadmaps and targeted outcomes for clients in a consultative capacity.
  • Act as a data engineering SME to define upcoming engagements with pre-sales teams.
  • Perform technical interviews for Architect and Engineer candidates and provide technical guidance.
  • Oversee the development and implementation of data standards and operating procedures.

AWS, Python, Machine Learning, NumPy, Amazon Web Services, Data engineering, NoSQL, Pandas, Spark, Communication Skills, CI/CD, Data modeling

Posted 4 months ago
Apply
πŸ”₯ Data Architect
Posted 4 months ago

πŸ“ GA, TN, SC, NC, FL, TX, PA

πŸ” Healthcare

Requirements:
  • Minimum of 7 years of experience in data architecture, data engineering, or a related role.
  • Proficiency in data modeling, database design, and data warehousing concepts.
  • Experience with data integration tools and ETL processes.
  • Strong knowledge of SQL and Python.
  • Strong knowledge of cloud platforms, especially Snowflake.
  • Expertise in big data architectures, including data lakes and machine learning architectures.
  • Proficiency in integrating business intelligence tools like Power BI and Tableau.
  • Understanding of data governance and data QA practices.
  • Knowledge of medical claims data, including Medicare FFS, Commercial, Medicaid, and Medicare Advantage.
  • Experience working with medical coding systems (ICD10, CPT, HCPCS, MS DRG, NPPES, NPI, etc.)
Responsibilities:
  • Design, implement, and maintain the enterprise data architecture to support current and future data processing and storage needs.
  • Design and implement data integration processes to consolidate data from multiple sources into a unified platform.
  • Create and maintain conceptual, logical, and physical data models.
  • Establish and enforce data governance policies to ensure data quality, security, and compliance.
  • Evaluate and implement data management tools to enhance data architecture.
  • Work closely with data scientists, engineers, and business stakeholders to understand data requirements.
  • Identify and resolve data performance issues.
  • Document data architecture, data flows, and technical specifications.
  • Stay abreast of industry trends and best practices to drive continuous improvement.

Python, SQL, Business Intelligence, ETL, Machine Learning, QA, Snowflake, Tableau, Strategy, Data engineering, Communication Skills, Analytical Skills, Collaboration, Written communication, Documentation, Compliance

Posted 4 months ago
Apply

πŸ“ United States

🧭 Contract

πŸ’Έ 80 - 90 USD per hour

πŸ” Financial information and services

Requirements:
  • Proven experience with Snowflake data warehouse and optimization projects.
  • Strong expertise in analytic engineering techniques within Snowflake.
  • In-depth understanding of Snowflake architecture including virtual warehouses and clustering keys.
  • Proficiency in SQL and data modeling for Snowflake.
  • Familiarity with ETL processes and data pipeline optimization.
  • Excellent problem-solving and communication skills.
  • Self-motivated and proactive in implementing optimizations.
Responsibilities:
  • Evaluate and optimize the reporting layer of the Snowflake data warehouse (see the sketch below).
  • Develop strategies for improved query performance and reduced data processing time.
  • Collaborate with data engineering for optimizations while maintaining data integrity and security.
  • Utilize advanced analytic engineering techniques to optimize data transformations.
  • Design efficient data models and implement best practices for data loading and retrieval.
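
To make the clustering-key language above concrete, here is a hedged sketch using the snowflake-connector-python package; the table reporting.fct_events, its filter columns, and the connection settings are assumptions for illustration only.

```python
# Hypothetical sketch: add a clustering key to a large reporting table and
# inspect how well the table is clustered afterwards.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="REPORTING_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()
    # Cluster on the columns most reporting queries filter by, so Snowflake
    # can prune micro-partitions.
    cur.execute(
        "ALTER TABLE reporting.fct_events CLUSTER BY (event_date, account_id)"
    )
    # Report clustering depth and overlap for those columns.
    cur.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION("
        "'reporting.fct_events', '(event_date, account_id)')"
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```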

SQL, ETL, Snowflake, Strategy, Data engineering, Data Structures, Communication Skills

Posted 5 months ago
Apply