Snowflake Job Salaries

Find salary information for remote positions requiring Snowflake skills. Make data-driven decisions about your career path.

Snowflake

Median high-range salary for jobs requiring Snowflake:

$180,000

This analysis is based on salary ranges collected from 93 job descriptions that match the search and allow working remotely. Choose a country to narrow down the search and view statistics exclusively for remote jobs available in that location.

The Median Salary Range is $144,000 - $180,000

  • 25% of job descriptions advertised a maximum salary above $210,750.
  • 5% of job descriptions advertised a maximum salary above $299,550.
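Summary statistics like the ones above can be reproduced from a list of advertised salary ranges in a few lines of Python. This is a minimal sketch of the presumed method (median of advertised minimums and maximums, plus upper percentiles of the maximums); the sample figures are illustrative, not the actual dataset:

```python
# Sketch: computing the summary statistics reported above from advertised
# salary ranges. The sample data is illustrative, not the real dataset.
import statistics

# Each job posting advertises a (min, max) salary range.
ranges = [(120_000, 150_000), (144_000, 180_000), (160_000, 210_750),
          (150_000, 299_550), (130_000, 175_000)]

lows = sorted(r[0] for r in ranges)
highs = sorted(r[1] for r in ranges)

# "Median Salary Range" = median of the advertised minimums and maximums.
median_range = (statistics.median(lows), statistics.median(highs))

def percentile(sorted_vals, p):
    """Linear-interpolation percentile over a sorted list (0 <= p <= 1)."""
    k = (len(sorted_vals) - 1) * p
    f = int(k)
    c = min(f + 1, len(sorted_vals) - 1)
    return sorted_vals[f] + (sorted_vals[c] - sorted_vals[f]) * (k - f)

# "25% of jobs advertised a maximum above X" = 75th percentile of maximums;
# "5% above Y" = 95th percentile.
top_quartile = percentile(highs, 0.75)
top_5pct = percentile(highs, 0.95)
```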

Skills and Salary

Specific skills can substantially affect salary ranges for jobs matching this search. The most sought-after skills that correlate with higher salaries here include Machine Learning, AWS, and Data engineering. Employers often prioritize candidates who possess these skills, since they contribute directly to the organization's success, and demonstrating them can increase both earning potential and competitiveness in the job market.

  1. Machine Learning

    28% of jobs mention Machine Learning as a required skill. The Median Salary Range for these jobs is $162,310.50 - $206,838.

    • 25% of job descriptions advertised a maximum salary above $232,300.
    • 5% of job descriptions advertised a maximum salary above $324,620.
  2. AWS

    44% of jobs mention AWS as a required skill. The Median Salary Range for these jobs is $150,000 - $185,000.

    • 25% of job descriptions advertised a maximum salary above $230,575.
    • 5% of job descriptions advertised a maximum salary above $319,395.
  3. Data engineering

    53% of jobs mention Data engineering as a required skill. The Median Salary Range for these jobs is $150,000 - $183,764.31.

    • 25% of job descriptions advertised a maximum salary above $207,965.06.
    • 5% of job descriptions advertised a maximum salary above $315,250.
  4. Python

    77% of jobs mention Python as a required skill. The Median Salary Range for these jobs is $147,500 - $180,000.

    • 25% of job descriptions advertised a maximum salary above $207,965.
    • 5% of job descriptions advertised a maximum salary above $274,500.
  5. SQL

    85% of jobs mention SQL as a required skill. The Median Salary Range for these jobs is $140,000 - $170,000.

    • 25% of job descriptions advertised a maximum salary above $207,948.75.
    • 5% of job descriptions advertised a maximum salary above $275,000.
  6. ETL

    53% of jobs mention ETL as a required skill. The Median Salary Range for these jobs is $131,414 - $170,000.

    • 25% of job descriptions advertised a maximum salary above $206,323.25.
    • 5% of job descriptions advertised a maximum salary above $241,500.
  7. Data visualization

    29% of jobs mention Data visualization as a required skill. The Median Salary Range for these jobs is $140,000 - $170,000.

    • 25% of job descriptions advertised a maximum salary above $202,500.
    • 5% of job descriptions advertised a maximum salary above $286,250.
  8. Data modeling

    42% of jobs mention Data modeling as a required skill. The Median Salary Range for these jobs is $130,000 - $168,150.

    • 25% of job descriptions advertised a maximum salary above $204,750.
    • 5% of job descriptions advertised a maximum salary above $297,000.
  9. Data Analysis

    26% of jobs mention Data Analysis as a required skill. The Median Salary Range for these jobs is $137,057.50 - $160,000.

    • 25% of job descriptions advertised a maximum salary above $219,000.
    • 5% of job descriptions advertised a maximum salary above $345,170.
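The per-skill breakdown above presumably works by taking, for each skill, the subset of postings that mention it and then computing that subset's share and median range. A hedged sketch of that grouping step; the records and skill names are illustrative, not the real dataset:

```python
# Sketch of the per-skill breakdown: for each skill, take the subset of
# postings that mention it, then report its share of all postings and its
# median (min, max) salary range. Illustrative data only.
import statistics

jobs = [
    {"skills": {"Python", "SQL"}, "min": 147_500, "max": 180_000},
    {"skills": {"Python", "SQL", "AWS"}, "min": 150_000, "max": 185_000},
    {"skills": {"SQL"}, "min": 140_000, "max": 170_000},
    {"skills": {"Python", "Machine Learning"}, "min": 162_000, "max": 207_000},
]

def skill_stats(jobs, skill):
    """Share of postings mentioning `skill`, and their median salary range."""
    subset = [j for j in jobs if skill in j["skills"]]
    share = len(subset) / len(jobs)
    med = (statistics.median(j["min"] for j in subset),
           statistics.median(j["max"] for j in subset))
    return share, med

share, med = skill_stats(jobs, "Python")
```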

Industries and Salary

Industry plays a significant role in determining salary ranges for jobs matching this search. Industries known for competitive salaries in these roles include Fintech, Mental Health, and Software Development; these sectors have strong demand for skilled professionals and are willing to invest in talent to meet their growth objectives. Factors such as industry size, profitability, and market trends influence salary levels, so consider industry-specific factors when evaluating potential career paths and salary expectations.

  1. Fintech

    5% of jobs are in the Fintech industry. The Median Salary Range for these jobs is $176,000 - $230,000.

    • 25% of job descriptions advertised a maximum salary above $308,525.
    • 5% of job descriptions advertised a maximum salary above $343,100.
  2. Mental Health

    3% of jobs are in the Mental Health industry. The Median Salary Range for these jobs is $131,414 - $197,100.

    • 25% of job descriptions advertised a maximum salary above $210,525.
    • 5% of job descriptions advertised a maximum salary above $215,000.
  3. Software Development

    16% of jobs are in the Software Development industry. The Median Salary Range for these jobs is $160,000 - $190,000.

    • 25% of job descriptions advertised a maximum salary above $207,175.
    • 5% of job descriptions advertised a maximum salary above $237,000.
  4. AI and machine learning

    2% of jobs are in the AI and machine learning industry. The Median Salary Range for these jobs is $160,000 - $185,000.

    • 25% of job descriptions advertised a maximum salary above $210,000.
  5. Financial Services

    2% of jobs are in the Financial Services industry. The Median Salary Range for these jobs is $94,550.95 - $175,882.16.

    • 25% of job descriptions advertised a maximum salary above $183,764.31.
  6. Data Engineering

    3% of jobs are in the Data Engineering industry. The Median Salary Range for these jobs is $130,000 - $170,000.

    • 25% of job descriptions advertised a maximum salary above $198,473.75.
    • 5% of job descriptions advertised a maximum salary above $207,965.
  7. Healthcare

    6% of jobs are in the Healthcare industry. The Median Salary Range for these jobs is $135,000 - $157,500.

    • 25% of job descriptions advertised a maximum salary above $185,000.
    • 5% of job descriptions advertised a maximum salary above $220,000.
  8. Legal Services

    2% of jobs are in the Legal Services industry. The Median Salary Range for these jobs is $102,500 - $135,000.

    • 25% of job descriptions advertised a maximum salary above $139,000.
  9. Data Analytics

    2% of jobs are in the Data Analytics industry. The Median Salary Range for these jobs is $89,000 - $116,500.

    • 25% of job descriptions advertised a maximum salary above $133,000.
  10. AI legal tech for private markets

    1% of jobs are in the AI legal tech for private markets industry. The Median Salary Range for these jobs is $68,000 - $102,000.

Disclaimer: This analysis is based on salary ranges advertised in job descriptions found on Remoote.app. While it provides valuable insights into potential compensation, it's important to understand that advertised salary ranges may not always reflect the actual salaries paid to employees. Furthermore, not all companies disclose salary ranges, which can impact the accuracy of this analysis. Several factors can influence the final compensation package, including:

  • Negotiation: Salary ranges often serve as a starting point for negotiation. Your experience, skills, and qualifications can influence the final offer you receive.
  • Benefits: Salaries are just one component of total compensation. Some companies may offer competitive benefits packages that include health insurance, paid time off, retirement plans, and other perks. The value of these benefits can significantly affect your overall compensation.
  • Cost of Living: The cost of living in a particular location can impact salary expectations. Some areas may require higher salaries to maintain a similar standard of living compared to others.

Jobs

107 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

🔥 Lead Partner Sales Engineer
Posted about 7 hours ago

📍 United States

🧭 Full-Time

💸 166372.19 - 207965.24 USD per year

🔍 Software Development

  • Deep understanding of GSI and Reseller pre-sales motions, including how solution architects, pre-sales technical specialists and technical teams participate in discovery workshops, opportunity qualification, proof-of-concepts, and technical validation sessions. This includes understanding how technology choices impact project economics and delivery timelines.
  • Experience with GSI and Reseller sales methodologies, particularly how they identify and qualify opportunities through both client-direct and vendor-partner channels and how they manage complex, multi-stakeholder, multi-vendor, multi-tech option sales cycles.
  • Knowledge of how GSIs and Resellers approach technical enablement and skills development, including their internal training programs and certification paths, and how they maintain technical excellence across delivery teams while managing utilization targets.
  • Cloud data platforms and data lakes: Snowflake, Databricks, Redshift, BigQuery, Synapse, S3, ADLS, OneLake, Hadoop - Hands-on experience with modern cloud data platforms and data lake architectures
  • Data workloads including data warehousing, data lakes, AI, ML, Gen AI, data applications, data engineering
  • Open table formats: Iceberg, Delta Lake, Hudi
  • Data movement technologies: Fivetran, HVR, Matillion, Airbyte, Informatica, Talend, Datastage, ADF, AWS Glue. Hands-on experience with traditional (or more modern) ETL/ELT solutions
  • Databases: Oracle, SQL Server, PostgreSQL, MySQL, MongoDB, CosmosDB, SAP HANA.
  • Applications: Salesforce, Google Analytics, ServiceNow, Hubspot, etc.
  • ERP solutions: SAP, Oracle, Workday, etc.
  • Data transformation solutions: dbt Core, dbt Cloud, Coalesce, Informatica
  • Gen AI: approaches, concepts, and the Gen AI technology ecosystem for hyperscalers, cloud data platforms, and solution-specific ISVs
  • REST APIs: Experience programmatically interacting with REST APIs
  • Data Infrastructure, Data Security, Data Governance and Application Development: strong understanding of technical architectures and approaches
  • Programming languages: Python, SQL
  • Create, develop, build, and take ownership of ongoing relationships with GSI and Reseller pre-sales technical team members, lead architects, and other technical recommenders to create multiple Fivetran technical evangelists and champions within each GSI and Reseller partner.
  • Interact daily with Fivetran’s top GSI and Reseller partners while tracking, measuring, and driving three primary KPIs: partners enabled, GTM activities supported, and technical content created.
  • Represent Fivetran as the partner technical liaison plus product, data and solution expert. Collaborate with the strategic GSI and Reseller Partner Sales Managers to enable new and existing GSI and Reseller partners to understand Fivetran's value, differentiation, messaging, and technical/business value.
  • Assist Fivetran Partner Sales Managers in communicating and demonstrating Fivetran’s value prop to existing and prospective partners to get them excited about partnering with Fivetran
  • Get hands-on with all aspects of Fivetran and the broader data ecosystem technologies listed above in building out live demos, hands-on labs, solution prototypes, and Fivetran/GSI/Reseller joint solutions.
  • Communicate customer requirements and competitor solutions. Provide a strong technical response to partner requests. Engage the customer SE team and communicate client requirements for individual client opportunities.

PostgreSQL, Python, SQL, Business Intelligence, Cloud Computing, ETL, Hadoop, MongoDB, MySQL, Oracle, Salesforce, SAP, Snowflake, Google Analytics, Data engineering, Analytical Skills, Mentoring, Presentation skills, Excellent communication skills, Relationship building, Strong communication skills

🔥 Account Executive, Enterprise
Posted about 8 hours ago

📍 United States

🧭 Full-Time

💸 144144.0 - 180180.0 USD per year

🔍 Software Development

  • 8+ years of large enterprise software sales experience and well-developed pattern recognition for navigating complex organizations.
  • Excellent written and verbal communication skills, with the ability to hold multiple stakeholders accountable throughout a complex value-driven sales cycle.
  • In-depth familiarity with the modern data technology industry and key players.
  • You are familiar with a solution-based approach to selling, have experience managing a complex sales process and possess excellent presentation and listening skills, organization and contact management capabilities.
  • You thrive in an extremely fast-paced, ever-changing work environment. You’re able to keep up with a highly motivated team in a market that is growing extremely fast.
  • You are extremely organized. You are able to juggle lots of things at once while not letting anything drop.
  • You are a strategic thinker. You are able to see and communicate the big picture in an inspiring way.
  • You are a problem solver. You are resilient and creative, able to be resourceful to proactively seek out a solution to a problem.
  • You are enthusiastic! You exhibit passion and excitement for your work and you have a can-do attitude.
  • Collaborate cross-functionally with marketing, customer success, alliances, operations, and analytics to drive pipeline generation and exceed revenue goals.
  • Accelerate the growth & adoption of Fivetran in the Enterprise Market through value-driven sales cycles.
  • Lead in-depth discovery and demonstrate a deep interest in our Enterprise customers’ data challenges, identify required capabilities and positive business outcomes to drive towards valuable long term customer engagements.
  • Speak comfortably about Fivetran’s vision to a broad range of audiences from C-level executives to individual contributors.
  • Seek out and land deals with new Enterprise target accounts, then look to grow their footprint with Fivetran through new use cases, cross-sells and expansion.
  • Build strategic relationships with partners in order to identify new opportunities and accelerate deal cycles.
  • Forecast accurately and provide clear visibility on sales and revenue performance by actively managing and progressing opportunities.

Business Intelligence, ETL, Salesforce, Snowflake, Cross-functional Team Leadership, REST API, Communication Skills, Customer service, Presentation skills, Relationship building, Account Management, Negotiation skills, Sales experience, Market Research, Data visualization, Lead Generation, Strategic thinking, CRM, Financial analysis, Data modeling, Data analytics, Data management, Customer Success, SaaS, Budget management


📍 United States

💸 64000.0 - 120000.0 USD per year

  • Strong PL/SQL, SQL development skills
  • Proficient in multiple languages used in data engineering such as Python, Java
  • Minimum 3-5 years of experience in Data engineering working with Oracle and MS SQL
  • Experience with data warehousing concepts and technologies including cloud-based services (e.g. Snowflake)
  • Experience with cloud platforms like Azure and knowledge of infrastructure
  • Experience with data orchestration tools (e.g. Azure Data Factory, Databricks workflows)
  • Understanding of data privacy regulations and best practices
  • Experience working with remote teams
  • Experience working on a team with a CI/CD process
  • Familiarity using tools like Git, Jira
  • Bachelor's degree in Computer Science or Computer Engineering
  • Design, implement and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA.
  • Identify and resolve performance bottlenecks in pipelines due to data, queries and processing workflows to ensure efficient and timely data delivery.
  • Implement data quality checks and validation processes to ensure accuracy, completeness and consistency of data delivery.
  • Work with Data Architect and implement best practices for data governance, quality and security.
  • Collaborate with cross-functional teams to identify and address data needs.
  • Ensure technology solutions support the needs of the customer and/or organization.
  • Define and document technical requirements.

Python, SQL, ETL, Git, Java, Oracle, Snowflake, Azure, Data engineering, CI/CD, RESTful APIs

Posted about 19 hours ago
🔥 Senior Growth Engineer
Posted about 20 hours ago

📍 United States

🧭 Full-Time

💸 140000.0 - 160000.0 USD per year

🔍 Email Security

🏢 Company: Valimail

  • 4+ years of experience with Python, SQL, or similar languages, with hands-on experience in automation and data flow management in platforms like Snowflake, Segment, Zapier, and Planhat.
  • 2+ years in an operations role supporting Sales, Marketing, Customer Success, or Finance, with a demonstrated ability to align data operations with strategic business goals.
  • Required proficiency in Snowflake, Segment (or other CDP solutions), and BI tools (e.g., Sigma, PowerBI).
  • Familiarity with Zapier, Planhat (or similar tools such as Gainsight or ChurnZero), Salesforce, and Atlassian are a plus.
  • Oversee and optimize data flow across core platforms, including Snowflake, Sigma, Segment, Salesforce, and Planhat, ensuring seamless integration and reliable data access for cross-functional teams.
  • Build strong, collaborative relationships with Marketing, Sales, Finance, Product, and Engineering, facilitating data-driven insights and project alignment across departments.
  • Analyze customer data to derive insights that inform strategic decision-making, delivering actionable reports that support growth objectives.
  • Spearhead automation and optimization projects, implementing solutions that improve data flow and system efficiency.
  • Lead key cross-functional initiatives, ensuring smooth execution and alignment across multiple departments.
  • Provide valuable insights and recommendations to leadership, aligning data-driven findings with broader business objectives.
  • Tackle a wide range of tasks, from technical troubleshooting to strategic planning and cross-departmental collaboration.

Project Management, Python, SQL, Business Intelligence, Data Analysis, ETL, Salesforce, Snowflake, Operations Management, API testing, Data engineering, REST API, Communication Skills, Analytical Skills, CI/CD, Problem Solving, DevOps, Cross-functional collaboration, Data visualization, Strategic thinking, Data modeling, Data management


🧭 Full-Time

💸 185000.0 - 315000.0 USD per year

🔍 Financial Operations

🏢 Company: Ramp (👥 501-1000; 💰 $150,000,000 Series D, 11 months ago; Financial Services, Finance, FinTech)

  • Bachelor’s degree or above in Math, Economics, Bioinformatics, Statistics, Engineering, Computer Science, or other quantitative fields with a minimum of 5 years of industry experience as a Machine Learning Engineer, Applied Scientist or Data Scientist
  • Strong python experience (numpy, pandas, sklearn, pytorch etc.) across ML techniques and back end engineering
  • Prior experience deploying Machine Learning models to production and making meaningful contribution to backend systems
  • Strong knowledge of SQL (preferably Snowflake, BigQuery)
  • Ability to thrive in a fast-paced, constantly improving, start-up environment that focuses on solving problems with iterative technical solutions
  • Employ statistical and machine learning on large datasets to discover patterns of account takeovers and identity theft
  • Prototype and productionalize machine learning models and rules-based systems to protect user accounts
  • Partner closely with Identity Engineering and Data Platform teams to augment and leverage data across first and third party sources, ensuring we’ve added as much context as possible to every decision we make
  • Contribute to the culture of Ramp’s machine learning team by influencing processes, tools, and systems that will allow us to make better decisions in a scalable way

Backend Development, Python, SQL, Machine Learning, Numpy, PyTorch, Snowflake, Data engineering, Pandas, Data modeling, Software Engineering

Posted 2 days ago
🔥 Senior Data Engineer
Posted 2 days ago

📍 United States

💸 144000.0 - 180000.0 USD per year

🔍 Software Development

🏢 Company: Hungryroot (👥 101-250; 💰 $40,000,000 Series C, almost 4 years ago; Artificial Intelligence (AI), Food and Beverage, E-Commerce, Retail, Consumer Goods, Software)

  • 5+ years of experience in ETL development and data modeling
  • 5+ years of experience in both Scala and Python
  • 5+ years of experience in Spark
  • Excellent problem-solving skills and the ability to translate business problems into practical solutions
  • 2+ years of experience working with the Databricks Platform
  • Develop pipelines in Spark (Python + Scala) in the Databricks Platform
  • Build cross-functional working relationships with business partners in Food Analytics, Operations, Marketing, and Web/App Development teams to power pipeline development for the business
  • Ensure system reliability and performance
  • Deploy and maintain data pipelines in production
  • Set an example of code quality, data quality, and best practices
  • Work with Analysts and Data Engineers to enable high quality self-service analytics for all of Hungryroot
  • Investigate datasets to answer business questions, ensuring data quality and business assumptions are understood before deploying a pipeline

AWS, Python, SQL, Apache Airflow, Data Mining, ETL, Snowflake, Algorithms, Amazon Web Services, Data engineering, Data Structures, Spark, CI/CD, RESTful APIs, Microservices, JSON, Scala, Data visualization, Data modeling, Data analytics, Data management


📍 CA, CO, CT, FL, IL, MA, MD, NC, NJ, NY, OR, VA, VT, WA, United Kingdom

🧭 Full-Time

💸 175000.0 - 191300.0 USD per year

🔍 Crowdfunding

🏢 Company: Kickstarter PBC

  • 8+ years of experience in data engineering, analytics engineering, or related fields.
  • Strong experience with cloud-based data warehouses (Redshift, Snowflake, or BigQuery) and query performance optimization.
  • Expertise in SQL, Python, and data transformation frameworks like dbt.
  • Experience building scalable data pipelines with modern orchestration tools (Airflow, MWAA, Dagster, etc.).
  • Knowledge of real-time streaming architectures (Kafka, Kinesis, etc.) and event-based telemetry best practices.
  • Experience working with business intelligence tools (e.g. Looker) and enabling self-serve analytics.
  • Ability to drive cost-efficient and scalable data solutions, balancing performance with resource management.
  • Familiarity with machine learning operations (MLOps) and experimentation tooling is a plus.
  • Develop, own and improve Kickstarter’s data architecture—optimize our Redshift warehouse, implement best practices for data storage, processing, and orchestration.
  • Design and build scalable ETL/ELT pipelines to transform raw data into clean, usable datasets for analytics, product insights, and machine learning applications.
  • Enhance data accessibility and self-service analytics by improving Looker models and enabling better organizational data literacy.
  • Support real-time data needs by optimizing event-based telemetry and integrating new data streams to fuel new products, personalization, recommendations, and fraud detection.
  • Lead cost optimization efforts—identify and implement more efficient processes and tools to lower costs.
  • Drive data governance and security best practices—ensure data integrity, access controls, and proper lineage tracking.
  • Collaborate across teams to ensure data solutions align with product, growth, and business intelligence needs.

Python, SQL, ETL, Kafka, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 3 days ago

📍 United States

🧭 Full-Time

💸 160000.0 - 240000.0 USD per year

🔍 Data Operations, Analytics, BI

🏢 Company: dbt Labs (👥 251-500; 💰 $222,000,000 Series D, about 3 years ago; 🫂 last layoff over 1 year ago; Artificial Intelligence (AI), Open Source, Big Data, Analytics, Information Technology, Software)

  • 4+ years of sales engineering, solutions architecture, or consulting in data operations, analytics, or BI
  • A solid technical background, with experience in SQL, ETL, and data modeling
  • A firm understanding of and ability to discuss modern data warehousing architectures
  • Ability to gather and translate technical and business requirements into concrete solutions
  • Excellent verbal, written, and in-person communication skills to discuss complex topics with both technical and non-technical audiences
  • Ability to operate in an ambiguous and fast-paced work environment
  • A passion for being an inclusive teammate and involved member of the community
  • Become an expert in designing enterprise-grade data pipelines with dbt Cloud
  • Work hand-in-hand with sales to communicate the technical value of the dbt Labs suite of products
  • Guide customers through proof-of-concept implementations and facilitate trials of dbt Cloud
  • Work with engineering to deploy dbt Cloud in private clouds for enterprise clients
  • Own the full customer lifecycle from pre-sale evaluation to product adoption and expansion
  • Work with product to build and maintain the dbt Cloud enterprise roadmap
  • Build close relationships with our Partners (including companies like Snowflake, AWS, and Databricks) and enable them to adopt and implement dbt for their clients
  • Be an active member of the dbt community

AWS, Backend Development, SQL, Business Intelligence, Cloud Computing, ETL, Snowflake, Software Architecture, Data engineering, Communication Skills, Analytical Skills, Problem Solving, RESTful APIs, DevOps, Sales experience, Data visualization, Data modeling, Data analytics, Data management

Posted 3 days ago

📍 Poland

🧭 Contract

💸 165000.0 - 195000.0 PLN per year

🔍 Consumer Finance

🏢 Company: Affirm (👥 1001-5000; 💰 Post-IPO Equity, about 4 years ago; 🫂 last layoff about 2 years ago; Lending, Financial Services, Payments, FinTech)

  • 3+ years of professional work experience in BSA/AML, Sanctions, or Fraud experience preferred.
  • Exceptional judgment, organizational and analytical skills and keen attention to detail, coupled with excellent verbal and written communication abilities
  • Strong working knowledge of analytical tools, such as Snowflake, Excel, and visualization software (e.g. Looker, Tableau, etc.)
  • Strong background in suspicious activity identification and customer/merchant investigations
  • Ability to identify current manual processes and suggest/implement automation to drive efficiency
  • Strong understanding of BSA/AML and Sanctions Program requirements
  • Proven experience in enhancing or building a financial crimes program (risk assessment, training, development of governance/reporting metrics)
  • Ability to collaborate cross-functionally with, and communicate complex regulations to, business partners
  • Passion for technology, consumer finance, and improving consumer credit products
  • Follow company policies/standards to determine if activities or transactions are non-compliant/potentially suspicious and action as appropriate
  • Assist our originating bank partners with generating and filing reports
  • Undertake project work and Financial Crime program enhancements
  • Work closely with our Financial Crimes team to stay up to date on current industry standards and regulatory requirements
  • Ability to work and investigate alerts in a fast-paced environment while maintaining high quality standards
  • Work in collaboration with cross functional teams across the organization

SQL, Snowflake, Compliance, Risk Management, Data visualization, Financial analysis

Posted 6 days ago
🔥 Data Engineer II
Posted 6 days ago

📍 Canada

🧭 Full-Time

💸 110500.0 - 130000.0 CAD per year

🔍 Software Engineering

🏢 Company: HashiCorp (👥 1001-5000; 💰 Secondary Market, about 4 years ago; 🫂 last layoff almost 2 years ago; Private Cloud, DevOps, Information Technology, Cyber Security, Software, Cloud Infrastructure)

  • Minimum 2 years of experience with Snowflake: Snowflake SQL, Snowpipe, Streams, Stored Procedures, Tasks, Hashing, Row-Level Security, Time Travel, etc.
  • Hands-on experience with Snowpark and app development with Snowpark and Streamlit.
  • Proficient in ETL or ELT data pipelines and related concepts in pure SQL, such as SCD dimensions and delta processing.
  • Working with AWS cloud services: S3, Lambda, Glue, Athena, IAM, CloudWatch.
  • Hands-on experience in RESTful API development and maintenance with cloud technologies (e.g., AWS API Gateway, AWS Lambda).
  • Experience in creating pipelines for real-time and near-real-time integration, working with different data sources: flat files, XML, JSON, Avro files, and databases.
  • Fluent in Python or Go, able to write maintainable, reusable, and complex functions for backend data processing.
  • Front-end development with Python is good to have but not necessary.
  • Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.
  • Develop and maintain scalable data pipelines and builds out new API integrations to support continuing increases in data volume and complexity.
  • Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
  • Implement processes and systems to monitor data quality, ensuring production data is always reliable and available for key stakeholders and business processes that depend on it.
  • Write unit/integration tests, contribute to the engineering wiki, and document work.
  • Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
  • Design data integrations and a data quality framework.
  • Design and evaluate open source and vendor tools for data lineage.
  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
  • Develop best practices for data structure to ensure consistency within the system

AWS, Backend Development, Python, SQL, Cloud Computing, Data Analysis, ETL, Snowflake, Apache Kafka, Data engineering, Go, REST API, CI/CD, RESTful APIs, Microservices, JSON, Data modeling

Shown 10 out of 107