Principal Data Architect
Inactive
United States, Canada, India · Full-Time · Principal
Salary: 160,555 - 181,000 USD per year
Job Details
- Languages
- English
- Experience
- 5+ years
- Required Skills
- AWS, Python, SQL, ETL, GCP, Hadoop, Java, Kafka, Snowflake, Software Architecture, Azure, Spark, RESTful APIs, Linux, Microservices, Mentoring, Scala, Data modeling
Requirements
- Bachelor’s Degree in Computer Science, Information Technology, Data Science, Engineering, or a related field plus 5 years of experience in related occupations.
- 5+ years of experience in the following: professional development experience architecting and implementing big data solutions; scripting languages: Java, Scala, Python, or shell scripting.
- 4+ years of experience in the following:
  - Cloud: AWS, Azure, or GCP;
  - One or more of the following ETL tools: Informatica, Talend, IBM DataStage, Azure Data Factory, AWS Glue;
  - At least 2 of the following big data tools and technologies: Linux, Hadoop, Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce;
  - Performing complex data migrations to and from disparate data systems/platforms as well as to/from the cloud (AWS, Azure, or GCP).
- 3+ years of experience in the following: data visualization tools: Power BI, Tableau, Looker, or similar.
Responsibilities
- Provide top-quality solution design and implementation for clients.
- Design and execute data abstractions and integration patterns (APIs) to support complex distributed computing problems.
- Ensure that data security, governance, and compliance best practices are embedded into all solutions.
- Support the scoping and estimation of proposed solutions.
- Translate business requirements into scalable, cost-effective technology solutions that optimize data availability, performance, and usability.
- Work with client and engagement leaders to understand their strategic business objectives and align data architecture solutions accordingly.
- Identify gaps in data infrastructure, governance, or integration, and work with clients to resolve them in a timely manner.
- Architect and implement top-quality, scalable data solutions using cloud platforms (AWS, GCP, Azure), big data frameworks (Apache Spark, Kafka, Databricks), and modern data platforms (Snowflake, BigQuery, Redshift).
- Stay up to date with emerging technology trends in cloud data platforms, AI-driven analytics, and data architecture best practices to make recommendations that align with client needs.
- Participate in pre-sales activities, including proposal development, RFI/RFP responses, solution presentations, and shaping solutions from the client's business problem.
- Act as a thought leader in the industry by creating written collateral (white papers or POVs), speaking at events, and creating or participating in internal and external events.
- Mentor and guide data engineers, analysts, and solution architects on data engineering best practices, architecture frameworks, and cloud infrastructure.