Sr. Data Analyst, Pricing

India · Full-Time · Senior
Salary not disclosed

Job Details

Experience
Minimum 3 years
Required Skills
Python, SQL, Cloud Computing, Snowflake, Data Engineering, GitHub, MLOps, PySpark

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
  • Minimum 3 years of experience in data analytics, business analytics, or data engineering roles.
  • Strong expertise in SQL, including advanced joins, CTEs, window functions, and query optimization techniques.
  • Hands-on experience with Snowflake, including data warehousing, orchestration, query tuning, and Snowpark functionalities.
  • Proficiency in Python for data processing, automation, and analytical workflows.
  • Experience with distributed data processing tools such as PySpark or Snowpark.
  • Understanding of MLOps concepts, CI/CD pipelines, and model lifecycle management.
  • Familiarity with GitHub workflows, branching strategies, and collaborative development practices.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
  • Strong analytical thinking, problem-solving skills, and the ability to work independently with minimal supervision.
  • Excellent communication and stakeholder collaboration skills in cross-functional and remote work environments.

Responsibilities

  • Collect, transform, validate, and analyze large volumes of structured and unstructured data to support pricing and business analytics initiatives.
  • Design, develop, and maintain scalable ETL/ELT pipelines using SQL, Python, and distributed processing frameworks such as PySpark or Snowpark.
  • Build and optimize cloud-based data solutions on Snowflake, ensuring strong performance, reliability, and scalability.
  • Collaborate with business stakeholders, IT teams, and product groups to translate analytical requirements into efficient technical solutions.
  • Support MLOps workflows, including automation, deployment, monitoring, and reproducibility of analytical models and processes.
  • Develop reusable workflows and exploratory analyses in Jupyter Lab while maintaining clean documentation and version control practices using GitHub.
  • Troubleshoot complex data quality and integration issues to ensure accurate, accessible, and reliable reporting environments.
  • Manage multiple projects independently in a fast-paced and highly collaborative global environment.