
Data Architect

Posted 20 days ago


πŸ“ Location: United Kingdom

πŸ” Industry: Business Technology Solutions

🏒 Company: Sword Group

πŸ—£οΈ Languages: English

Requirements:
  • Solid knowledge of Data Architecture principles and best practices.
  • Proficiency in SQL, Python, Spark (with either Python, Scala, or Java).
  • Familiarity with Azure DevOps for deployment and version control.
  • Strong background in Enterprise BI and AI concepts.
  • Expertise in dimensional modelling and database optimisation.
  • Experience with data governance processes and frameworks.
  • Strong reporting and data modelling skills.
  • Professional certification in Microsoft data & AI services (preferred).
  • Experience in the following Systems/Platforms: Microsoft Fabric, Azure Synapse Analytics, Oracle, P6 Unifier
Responsibilities:
  • Design, implement, and integrate software solutions using Microsoft Fabric.
  • Develop and maintain logical and dimensional data models, ensuring scalability and efficiency.
  • Install, configure, and optimise information systems to meet business requirements.
  • Lead data migration projects from legacy systems to modern platforms.
  • Work directly with clients to gather requirements and present data-driven solutions.
  • Take ownership of the delivery and execution of core data architecture components.
  • Ensure data governance best practices and compliance with industry standards.

Related Jobs


πŸ’Έ 180,000 - 312,000 USD per year

πŸ” Software Development

🏒 Company: Referrals Only

  • Experience designing and implementing solutions on the Databricks, Snowflake, Spark or EMR platforms.
  • Experience in defining and implementing different types of data architecture, analyzing trade-offs and can define technology stacks for different types of data architecture.
  • Experience in designing application system architecture based on big data, artificial intelligence and related technologies.
  • Rich experience in building, maintaining and tuning data platforms, as well as extensive experience in data warehouse design, data modeling, data monitoring and operations.
  • Experience with common design patterns, application frameworks and foundational/theoretical knowledge (i.e.: distributed systems, data intensive applications, etc.).
  • Proficient in common open-source distributed computing/storage technologies, including but not limited to YARN, Impala, Spark, MapReduce, Kafka and Flink, with practical project architecture experience.
  • Good understanding of business and communication, collaboration skills, strong learning and summarizing abilities.
  • Experience in defining, developing and enabling data-driven techniques, advanced analytics, ML/AI and data mining applications in enterprise.
  • Experience in developing real-time and low-latency data streaming solutions and a differentiated view on the complexities and tradeoffs connected with them.
  • Experience in productionizing machine learning models and applying techniques, tools and processes.
  • Expertise working in cloud environments.
  • Lead complex data programs, navigating architectural concerns and enabling delivery teams to deliver on accepted standards within time and budget.
  • Provide client-facing technical leadership and guidance on topics related to data architecture, engineering, and analytics to advise clients on bringing their data strategy to life.
  • Interact with client counterparts from the enterprise architecture group and lead the delivery, alignment and sign-off of key architectural decisions, trade-offs and ways of working.
  • Communicate both high- and low-level technical details of data architecture to engineers and business stakeholders.
  • Collaborate, influence and guide at the intersection of analytical and operational architecture.
  • Be responsible for the technical design of data governance, data security and data privacy to fulfill compliance requirements.
  • Seamlessly incorporate data quality frameworks and processes to address and fulfill requirements as set out in strategy and acceptance criteria.
  • Collaborate with sales and pre-sales to clarify requirements and design viable solutions.
  • Represent Thoughtworks in various online and offline forums, including events and conferences.
Posted 5 days ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ’Έ 100,000 - 120,000 USD per year

πŸ” Information Technology

🏒 Company: New Era Technology πŸ‘₯ 1001-5000 | Internet, Information Technology, Audio/Visual Equipment

  • 8+ years of experience in data architecture
  • Strong knowledge of Database Design Principles and Data Warehousing
  • Experience with SQL and NoSQL databases
  • Knowledge of Big Data technologies and Cloud Storage solutions
  • Ability to develop and present ideas to management
  • Deep knowledge of Enterprise Data Lake/Lakehouse and related tools
  • Facilitate organizational change through research and technical strategy definition
  • Conduct research and develop data and visualization roadmaps
  • Communicate best practices in data, ETL, BI, and analytics
  • Assist in evaluating business demands for software and technologies
  • Provide technical design and architectural leadership
  • Collaborate with project teams for solution implementation

SQL, Business Intelligence, Data Mining, ETL, NoSQL, Data visualization, Data modeling

Posted 15 days ago

πŸ“ USA

🧭 Full-Time

πŸ” Foreign Exchange payments processing

🏒 Company: Convera πŸ‘₯ 1001-5000 | Financial Services, Payments, Finance, FinTech

  • Bachelor's degree in Computer Science or related field
  • 10+ years of experience in data architecture
  • Strong knowledge of database design and data modeling
  • Hands-on experience with Snowflake and SQL
  • Proficiency with data modeling tools
  • Familiarity with regulatory compliance
  • Experience with Big Data technologies
  • Strong communication and collaboration skills
  • Design and implement scalable data architecture solutions
  • Partner with stakeholders to understand data requirements
  • Collaborate on data integration and governance
  • Develop semantic models for reporting needs
  • Monitor and optimize data systems performance
  • Integrate AI and ML technologies into architecture
  • Document data architecture and best practices

AWS, Python, SQL, ETL, Machine Learning, Snowflake, Tableau, Data modeling

Posted 23 days ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 175,000 - 200,000 USD per year

πŸ” Data Engineering

🏒 Company: Astronomer πŸ‘₯ 251-500 πŸ’° $213,000,000 Series C almost 3 years ago πŸ«‚ Last layoff almost 2 years ago | Cloud Data Services, Big Data, Data Integration, Machine Learning, Analytics, Information Technology, Enterprise Software, Software

  • Experience with Apache Airflow in production environments
  • Experience in designing and implementing ETL, Data Warehousing, and ML/AI use cases
  • Proficiency in Python
  • Knowledge of cloud-native data architecture
  • Demonstrated technical leadership
  • Work directly with Astronomer customers
  • Participate in pre-sales motions
  • Build architecture and operational diagrams
  • Provide reference implementations
  • Collaborate to build reusable assets
  • Interact with Product and Engineering stakeholders
  • Establish strong relationships with key customer stakeholders

Python, Apache Airflow, ETL

Posted 26 days ago

πŸ“ United States

🧭 Full-Time

πŸ’Έ 175,000 - 205,000 USD per year

πŸ” Software Development

🏒 Company: CoreWeave πŸ’° $642,000,000 Secondary Market over 1 year ago | Cloud Computing, Machine Learning, Information Technology, Cloud Infrastructure

  • Hands-on experience applying Kimball Dimensional Data Modeling principles to large datasets.
  • Expertise in working with analytical table/file formats, including Iceberg, Parquet, Avro, and ORC.
  • Proven experience optimizing MPP databases (StarRocks, Snowflake, BigQuery, Redshift).
  • Minimum 5+ years of programming experience in Python or Scala.
  • Advanced SQL skills, with a strong ability to write, optimize, and debug complex queries.
  • Hands-on experience with Airflow for batch orchestration distributed computing frameworks like Spark or Flink.
  • Develop and maintain data models, including star and snowflake schemas, to support analytical needs across the organization.
  • Establish and enforce best practices for dimensional modeling in our Lakehouse.
  • Engineer and optimize data storage using analytical table/file formats (e.g., Iceberg, Parquet, Avro, ORC).
  • Partner with BI, analytics, and data science teams to design datasets that accurately reflect business metrics.
  • Tune and optimize data in MPP databases such as StarRocks, Snowflake, BigQuery, or Redshift.
  • Collaborate on data workflows using Airflow, building and managing pipelines that power our analytical infrastructure.
  • Ensure efficient processing of large datasets through distributed computing frameworks like Spark or Flink.

AWS, Docker, Python, SQL, Cloud Computing, ETL, Kubernetes, Snowflake, Airflow, Algorithms, Apache Kafka, Data engineering, Data Structures, REST API, Spark, Communication Skills, Analytical Skills, Collaboration, CI/CD, RESTful APIs, DevOps, Terraform, Problem-solving skills, JSON, Scala, Data visualization, Ansible, Data modeling, Data analytics, Debugging

Posted 30 days ago
πŸ”₯ Lead/Data Architect

πŸ“ United States

🧭 Full-Time

πŸ’Έ 170,000 - 190,000 USD per year

πŸ” Healthcare technology

🏒 Company: Consensus Cloud Solutions πŸ‘₯ 501-1000 | Internet, Information Services, Information Technology

  • 7+ years professional experience in data architect or similar role.
  • Proficient in programming languages such as Python and experience with frameworks like Apache Spark and Pandas.
  • Solid understanding of data warehousing and dimensional modeling.
  • SQL proficiency and experience optimizing queries on databases like Postgres, Oracle, and Redshift.
  • 4+ years proven experience leading a team.
  • Experience with cloud platforms and their data services.
  • Develop technical architecture of OLTP and Enterprise BI/EDW datastores.
  • Design and implement scalable and efficient data infrastructure including data pipelines.
  • Lead and mentor a team of data engineers.
  • Develop and maintain data governance policies for quality and compliance.
  • Collaborate with teams to identify and address data challenges.
  • Evaluate and select appropriate technologies for scalability.
  • Drive continuous improvement initiatives.
  • Lead code reviews and troubleshoot production issues.
  • Assist department manager in project management and team tasks.

Python, SQL, Apache Airflow, ETL, Oracle, Postgres

Posted about 1 month ago
πŸ”₯ Lead Data Architect

πŸ“ USA

🧭 Regular

πŸ” Technology consultancy

🏒 Company: Referrals Only

  • Experience designing and implementing solutions on Databricks, Snowflake, Spark or EMR platforms.
  • Experience defining and implementing types of data architecture and technology stacks.
  • Exposure to designing application system architecture based on big data and AI technologies.
  • Experience in building, maintaining and tuning data platforms and data warehouse design.
  • Familiarity with common design patterns and foundational knowledge of distributed systems.
  • Proficiency in open-source distributed computing/storage technologies like YARN, Spark, Kafka, etc.
  • Understanding of data-driven techniques, advanced analytics, ML/AI and data mining applications.
  • Exposure to developing real-time and low-latency data streaming solutions.
  • Passionate about data infrastructure and operations, with expertise in cloud environments.
  • Navigate, with support, a data project's architectural concerns, enabling delivery teams to deliver on accepted standards within time and budget.
  • Provide client-facing technical leadership and guidance on topics related to data architecture, engineering, and analytics.
  • Interact with client counterparts from the enterprise architecture group to deliver, share, align and sign-off on key architectural decisions.
  • Communicate both high-level and low-level technical details of data architecture to engineers and business stakeholders.
  • Lead, with support, the technical design of data governance, data security and data privacy to fulfill compliance requirements.
  • Incorporate data quality frameworks and processes to meet requirements.
  • Collaborate with sales and pre-sales to clarify requirements and design viable solutions.

Cloud Computing, Data Mining, ETL, Machine Learning, Snowflake, Data engineering, Spark, Data modeling

Posted about 2 months ago
πŸ”₯ (881) Data Architect

πŸ“ LATAM

🏒 Company: Nearsure πŸ‘₯ 501-1000 | Staffing Agency, Outsourcing, Software

  • Bachelor's Degree in Computer Science, Engineering, or a related field.
  • 5+ Years of experience in designing, configuring, and implementing major data platforms.
  • Hands-on experience with at least two of the following: Salesforce DataCloud, Google BigQuery, AWS RedShift/Aurora, Snowflake, Azure Synapse.
  • 5+ Years of experience working with data migration projects involving mainstream databases (e.g., SQL Server, Oracle, MySQL, DB2).
  • 5+ Years of experience in data modeling for Data Warehouses or Data Lakes.
  • 3+ Years of experience with Python.
  • 3+ Years of experience with data security and privacy compliance.
  • Experience with major cloud technologies (AWS, GCP, Azure).
  • Proven experience in leading a technical team of data engineers.
  • Expert in SQL and data modeling for complex enterprise environments.
  • Experience implementing and managing data catalogs.
  • Proficiency in handling various data types (structured, semi-structured, unstructured, JSON, AVRO).
  • Experience in source code control using Git.
  • Understanding of diverse Data & Analytics architectures.
  • Advanced English Level is required.
  • Lead the selection process of the optimal Data Lake and Analytics solution that aligns with customer business needs.
  • Collaborate to identify top key requirements for platform selection.
  • Develop a comprehensive comparison (battlecard) of potential Data Lake and Analytics platforms.
  • Architect and design the enterprise data platform to ensure seamless integration and interoperability with existing systems.
  • Guide and mentor a team of data developers, ensuring technical excellence and efficient implementation of the selected platform.
  • Participate in the configuration and setup of the chosen cloud environment.

AWS, Python, SQL, ETL, GCP, Git, MySQL, Oracle, Salesforce, Snowflake, Azure, Data engineering, Data modeling

Posted about 2 months ago
πŸ”₯ Data Architect

πŸ“ United States, Canada

🧭 Full-Time

πŸ” Software Development

  • 6+ years of experience in similar positions
  • Strong expertise in AWS with a focus on Redshift
  • Advanced proficiency in SQL scripting
  • Hands-on experience in Python for data processing
  • Practical knowledge of SQL Server
  • Proven expertise in data architecture and modeling
  • Design and develop data models to support new database implementations
  • Create and optimize queries and scripts for established data models
  • Transform existing data models into robust data marts and data lakes
  • Collaborate within a team of data engineers
  • Contribute to ongoing database projects

AWS, Python, SQL, Microsoft SQL Server, Data modeling

Posted about 2 months ago

πŸ“ Eastern time

πŸ” HRTech

🏒 Company: Energage πŸ‘₯ 101-250 πŸ’° 4 months ago | Digital Media, Service Industry, Online Portals, SaaS

  • Experience leading data architecture responsibility in a complex environment for enterprise-class products.
  • Extensive experience with SQL Server and Azure.
  • Experience with cloud-based NoSQL data storage and processing services.
  • Expertise in modern data architecture patterns and approaches.
  • Experience with lean/agile principles.
  • Ability to work with various stakeholders and lead by influence.
  • Create comprehensive conceptual, logical, and physical data models to minimize redundancy and ensure data consistency and performance.
  • Analyze data flow across product suite and integration partners for optimized data pipelines.
  • Evaluate and recommend database technologies based on performance, scalability, and volume.
  • Define data access controls, encryption methods, and protection strategies for sensitive information.
  • Recommend and implement monitoring tools for data quality and performance analysis.

Agile, Cloud Computing, Data Analysis, ETL, Microsoft Azure, Microsoft SQL Server, NoSQL, Data modeling

Posted 2 months ago

Related Articles

Posted 12 days ago

Why is remote work such a great opportunity?

Why is remote work so appealing? Let's take a look!

Posted 7 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 7 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 7 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 7 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.