Snowflake Jobs

Find remote positions requiring Snowflake skills. Browse opportunities where you can apply your expertise and grow your career.

Snowflake
331 jobs found.

Set alerts to receive daily emails with new job openings that match your preferences.

Apply

📍 Canada

🔍 Artificial Intelligence

  • Strong background in AWS DevOps and data engineering.
  • Expertise with AWS and SageMaker is essential.
  • Experience with Snowflake for analytics and data warehousing is highly desirable.

  • Manage and optimize the data infrastructure.
  • Focus on both data engineering and DevOps responsibilities.
  • Deploy machine learning models to AWS using SageMaker.

AWS · Machine Learning · Snowflake · Data engineering · DevOps

Posted 2024-11-21
Apply
🔥 Senior Data Engineer
Posted 2024-11-21

📍 Poland

🧭 Full-Time

🔍 Software development

🏢 Company: Sunscrapers sp. z o.o.

  • At least 5 years of professional experience as a data engineer.
  • Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar.
  • Excellent command of spoken and written English (C1 level or higher).
  • Strong professional experience with Python and SQL.
  • Hands-on experience with DBT and Snowflake.
  • Experience in building data pipelines with Airflow or alternative solutions.
  • Strong understanding of various data modeling techniques like Kimball Star Schema.
  • Great analytical skills and attention to detail.
  • Creative problem-solving skills.
  • Great customer service and troubleshooting skills.

  • Model datasets and schemas for consistency and ease of access.
  • Design and implement data transformations and data marts.
  • Integrate third-party systems and external data sources into the data warehouse.
  • Build data flows for fetching, aggregating, and modeling data using batch pipelines.

Python · SQL · Snowflake · Airflow · Analytical Skills · Customer service · DevOps · Attention to detail

Apply

📍 United States, Latin America, India

🔍 Data Analytics and Consulting

  • 8+ years of relevant experience in data analytics and data visualization platforms.
  • 3+ years of advanced Alteryx experience.
  • Alteryx certification is highly preferred.
  • Previous team lead experience coaching junior analytics consultants.
  • Proven experience creating scalable analytics solutions, including Power BI, Tableau, Alteryx, SQL, Snowflake, Python, R.
  • Experience with cloud data platforms like Snowflake, AWS, and Azure.
  • Strong data fluency and communication skills.
  • Knowledge of SQL and experience with data warehouses.
  • Exceptional customer-facing skills and project management abilities.

  • Gather client requirements through prototyping and visualization, refining them as needed during the engagement.
  • Execute consulting engagements successfully, delivering high-quality project outcomes.
  • Manage client expectations through weekly status updates and proactive feedback sessions.
  • Collaborate with clients to develop scalable, durable analytics solutions using tools like Alteryx and SQL.
  • Create standard workflows, analytical apps, reusable macros, and automated reports.
  • Manage team time effectively and provide clear project timelines.
  • Partner with the sales team to identify and assist in account growth.
  • Lead capability stand-ups, team meetings, and/or workshops.
  • Mentor and manage consultants, ensuring alignment with job descriptions.
  • Drive organic growth of accounts with existing clients.

AWS · Leadership · Project Management · Python · SQL · SAP · SharePoint · Snowflake · Tableau · Azure · Data science · Problem Solving · Mentoring · Coaching

Posted 2024-11-21
Apply
🔥 Data Scientist
Posted 2024-11-21

📍 United States

🧭 Full-Time

🔍 Cybersecurity

🏢 Company: GreyNoise Intelligence

  • 5+ years of data science experience and/or an advanced degree in a relevant discipline (Data Science, Machine Learning, Operations Research, etc.).
  • Experience implementing machine learning techniques with real-world data, preferably in computer networking or cybersecurity, particularly clustering and anomaly detection.
  • Proficiency with machine learning frameworks and libraries such as PyTorch, scikit-learn, and numpy.
  • Experience with natural language processing (NLP) techniques and working with LLMs.
  • Strong programming skills in Python and familiarity with statistical analysis.
  • Experience in data visualization with a variety of tools and working with front-end developers to bring visualizations to production.
  • Familiarity with database and big data technologies (Elasticsearch, SQL, Snowflake, etc.).
  • Knowledge of cloud-based hosting and ML services, particularly AWS.
  • Understanding of containerization and deployment technologies like Docker and Kubernetes.
  • Ability to communicate technical concepts effectively, both to teammates and external audiences.
  • Excellent problem-solving skills and adaptability in a dynamic environment.

  • Develop and deploy machine learning models for real-time anomaly detection and threat identification.
  • Automate the discovery of interesting and anomalous data from our global honeypot network.
  • Research and implement new LLM technologies to help read and understand complex internet traffic patterns.
  • Integrate new visualizations and statistical models into our product to enhance user experience and data interpretation.
  • Ensure data quality by collaborating with infrastructure engineers to develop tests and alerts for detecting defects and determining their origin.
  • Optimize data pipelines in collaboration with data engineers for efficient data processing.
  • Interface directly with customers to capture analytical requests and translate them into actionable engineering requirements.
  • Present findings through social media, blogs, and conferences to engage with the broader community.
  • Stay current with the latest AI/ML research and cybersecurity trends to continuously improve our solutions.
  • Monitor and tune ML models in production environments to ensure scalability and reliability.

AWS · Docker · Python · SQL · Elasticsearch · Kubernetes · Machine Learning · NumPy · PyTorch · Snowflake · Product Development · Data science · Collaboration · Documentation

Apply

📍 India

🧭 Full-Time

🔍 Data & AI

🏢 Company: ProArch

  • Bachelor’s degree in Computer Science, Engineering, or related field (Master’s preferred).
  • 8+ years of experience in AI, Data Engineering, and full-stack development.
  • Ability to work with ambiguity and deliver consultative solutions.
  • Familiarity with Agile methodologies (Scrum, Kanban).
  • Excellent communication and interpersonal skills.

  • Build and maintain a list of innovative ideas and evaluate them for feasibility.
  • Partner with presales teams to create tailored solutions for clients.
  • Provide technical expertise to resolve delivery challenges.
  • Conduct workshops to educate sales and marketing teams.
  • Share insights from sales calls with solution teams.

AWS · Docker · GraphQL · Leadership · Node.js · PostgreSQL · Python · SQL · Agile · Blockchain · Django · Flask · GCP · IoT · Java · Jenkins · Kafka · Kubernetes · Machine Learning · MongoDB · PyTorch · SCRUM · Snowflake · Spring · Spring Boot · Vue.js · Azure · Data engineering · .NET · Angular · Serverless · React · Spark · TensorFlow · CI/CD · Agile methodologies · DevOps · Microservices

Posted 2024-11-21
Apply

📍 United States, Latin America, India

🔍 Data Services, Cloud Data Solutions

🏢 Company: phData

  • 10+ years of Technical Program Management experience.
  • Experience managing customer relationships, timelines, scopes, budgets, and resources.
  • Proven ability to manage multiple customers with billable utilization of 85+%.
  • Strong background in managing data, AI, and analytics projects.
  • Direct client-facing experience in a consulting or professional services organization.
  • Hands-on experience as a Portfolio/Program Manager with Agile and Waterfall methodologies.
  • Proficiency in Jira, Azure DevOps, MS Project, Excel, and technical documentation.
  • Exceptional written and verbal communication skills.

  • Own and drive the success of projects across phData’s largest and most strategic customer.
  • Build and maintain strong relationships with the internal sales and delivery organization.
  • Identify opportunities for expanding phData services within existing customers.
  • Manage project risks and customer escalations, ensuring resolution.
  • Mentor and develop a team of world-class consultants.
  • Facilitate innovation and onsite visits.
  • Provide account health readouts and manage complex financials across a project portfolio.

AWS · Leadership · Project Management · Agile · GCP · Snowflake · Jira · Azure · Communication Skills · Microsoft Office · Agile methodologies · Time Management · Documentation

Posted 2024-11-21
Apply

📍 United States, Latin America, India

🔍 Data and Analytics Consulting

🏢 Company: phData

  • 5+ years of relevant experience and/or demonstrated expertise in data analytics and data visualization platforms.
  • Expertise in Tableau Desktop and Tableau Prep, along with some experience managing workbooks on Tableau Server or managing a server environment.
  • Demonstrated expertise in data literacy.
  • Exceptional customer-facing skills, including but not limited to communication skills and project management skills.
  • Strong problem-solving skills with a passion for learning and mastering new technologies, techniques, and procedures.

  • Deliver on project-based consulting engagements.
  • Help clients develop analytical requirements through prototyping and visualization.
  • Create reporting and data visualization solutions.
  • Analyze, troubleshoot and/or tune product performance or deployment issues when required.
  • Support or assist with backlog support and development.

Project Management · SQL · Snowflake · Tableau · Data science · Communication Skills · Problem Solving

Posted 2024-11-21
Apply

📍 US

🔍 Consumer insights

  • Strong PL/SQL and SQL development skills.
  • Proficient in multiple data engineering languages such as Python and Java.
  • Minimum 3-5 years of experience in Data engineering with Oracle and MS SQL.
  • Experience with data warehousing concepts and cloud-based technologies like Snowflake.
  • Experience with cloud platforms such as Azure.
  • Knowledge of data orchestration tools like Azure Data Factory and Databricks workflows.
  • Understanding of data privacy regulations and best practices.
  • Experience working with remote teams.
  • Familiarity with tools like Git, Jira.
  • Bachelor's degree in Computer Science or Computer Engineering.

  • Design, implement and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA.
  • Identify and resolve performance bottlenecks in pipelines to ensure efficient data delivery.
  • Implement data quality checks and validation processes.
  • Work with Data Architect to implement best practices for data governance, quality, and security.
  • Collaborate with cross-functional teams to address data needs.

Python · SQL · ETL · Git · Java · Oracle · QA · Snowflake · Jira · Azure · Data engineering · CI/CD

Posted 2024-11-21
Apply
🔥 Staff Engineer
Posted 2024-11-20

📍 Cambridge, MA

🧭 Full-Time

💸 128,000 - 139,000 USD per year

🔍 Marketing and advertising

🏢 Company: Known

  • Must possess at least a Master’s degree in Computer Science or a closely related field plus 3 years of experience as a full-stack software engineer.
  • Alternatively, a Bachelor’s degree in Computer Science or a closely related field plus 5 years of progressive experience is acceptable.
  • Must have at least 1 year of experience in the advertising or marketing industry.
  • Professional experience must include: TypeScript, PostgreSQL, Google Cloud, Snowflake, AWS, CircleCI, Docker, Airflow, Linux, macOS, Node.js, Vue.js.
  • Must have experience with digital and television platforms.
  • Highly proficient in Python and experience with SQL databases is required.

  • Manage complex projects related to the development of data applications for marketing and advertising across television and digital platforms.
  • Serve as the subject-matter expert and drive the development of data applications.
  • Collaborate with stakeholders to translate business requirements into technical requirements and workflows.
  • Set project milestones, manage releases, and lead implementation.
  • Oversee continuous integration/delivery, testing, and tooling design.
  • Research and present new technologies to improve performance.

AWS · Docker · Node.js · PostgreSQL · Python · SQL · Snowflake · TypeScript · Vue.js · Airflow · Linux

Apply

📍 United States of America

💸 80,000 - 135,000 USD per year

🏢 Company: VSPVisionCareers

  • Bachelor’s degree in computer science, data science, statistics, economics, or related functional area; or equivalent experience.
  • Excellent written and verbal communication skills.
  • 6 years of experience working with end users in development of analytical capabilities.
  • 6 years of hands-on experience in data modeling, SQL-based database management systems, ETL/data pipeline design, and data visualization.
  • Expert-level SQL coding experience.

  • Work with business stakeholders to design data and analytics capabilities supporting business strategies.
  • Develop data models and structures for data-driven solutions.
  • Collaborate in an agile, multi-disciplinary team to deliver data solutions.
  • Research, promote, and develop data architecture best practices.

Python · SQL · Agile · Business Intelligence · ETL · SCRUM · Snowflake · Data science · Data Structures · Communication Skills

Posted 2024-11-20
Apply
Showing 10 of 331 jobs.