
Analytics Engineer

Posted 2024-10-30


๐Ÿ“ Location: United States, Canada

๐Ÿ” Industry: SaaS for restaurants and local service-based businesses

๐Ÿช„ Skills: SQLSnowflakeAnalytical Skills

Requirements:
  • Proficiency in dbt and Snowflake.
  • Experience with building and maintaining data pipelines.
  • Ability to enhance data infrastructure.
Responsibilities:
  • Play a pivotal role in shaping the data landscape and driving the strategic use of data.
  • Build Single Source of Truth (SSOT) models using dbt and Snowflake.
  • Maintain robust and scalable data pipelines.
  • Enhance data infrastructure to support analytics and data-driven decision-making.
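As a rough illustration of the SSOT-modeling responsibility above: a single-source-of-truth model consolidates raw source tables into one cleaned, aggregated table that all downstream reporting reads from. The posting names dbt and Snowflake; this sketch substitutes Python's built-in sqlite3, and every table and column name is hypothetical.

```python
import sqlite3

# Illustrative sketch only: table and column names are invented, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, restaurant_id INTEGER, amount REAL, ordered_at TEXT);
    CREATE TABLE raw_restaurants (restaurant_id INTEGER, name TEXT);
    INSERT INTO raw_orders VALUES
        (1, 10, 25.0, '2024-10-01'),
        (2, 10, 40.0, '2024-10-01'),
        (3, 11, 15.0, '2024-10-02');
    INSERT INTO raw_restaurants VALUES (10, 'Alpha Diner'), (11, 'Beta Cafe');
""")

# The SSOT model: one row per restaurant per day, joined and aggregated,
# so every report computes "revenue" the same way.
conn.executescript("""
    CREATE TABLE ssot_daily_revenue AS
    SELECT r.name AS restaurant, o.ordered_at AS day, SUM(o.amount) AS revenue
    FROM raw_orders o
    JOIN raw_restaurants r USING (restaurant_id)
    GROUP BY r.name, o.ordered_at;
""")

rows = conn.execute(
    "SELECT * FROM ssot_daily_revenue ORDER BY day, restaurant"
).fetchall()
print(rows)  # [('Alpha Diner', '2024-10-01', 65.0), ('Beta Cafe', '2024-10-02', 15.0)]
```

In an actual dbt project, the `CREATE TABLE ... AS SELECT` would live in a model file, with dbt managing materialization and dependencies in Snowflake.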

Related Jobs


๐Ÿ“ Salt Lake City, Utah

๐Ÿงญ Internship

๐Ÿ” Visual collaboration

๐Ÿข Company: Lucid Software

Requirements:
  • Currently pursuing a graduate or undergraduate degree in a technical or quantitative field.
  • Desire to develop technical skills for working with large data sets, including SQL and Python or R.
  • Comfortable using SQL for data transformations.
  • Willingness to become familiar with version control workflows and modern programming languages.
  • Ability to communicate clearly about data to technical and non-technical audiences.
  • Detail-oriented, organized, and a good team player.
  • Passion for structure, organization, and efficiency.

Responsibilities:
  • Write complex, production-quality data transformation code.
  • Implement effective data tests for accuracy and reliability.
  • Coach analysts on data modeling and SQL query optimization.
  • Design and maintain the architecture of the data warehouse.
  • Collaborate with data engineers on infrastructure projects.
  • Troubleshoot and resolve data issues.
  • Ensure documentation of data, systems, and metrics.
  • Maintain the quality of the analytics codebase.

🪄 Skills: SQL, Snowflake, Airflow, Collaboration, Documentation, Coaching

Posted 2024-11-19

๐Ÿ“ U.S.

๐Ÿ” Restaurant industry

Requirements: Not stated.

Responsibilities:
  • Report to the Sr. Director of Business Analytics.
  • Develop and maintain data models.
  • Support reporting and analysis.
  • Focus on the go-to-market function.
  • Drive performance management and key decisions around a complex implementation process.
  • Establish robust baseline data models that answer a range of business questions and drive growth.

🪄 Skills: SQL, Data Analysis, Communication Skills, Analytical Skills, Collaboration, Microsoft Excel, Problem Solving, Microsoft Office, Attention to detail, Organizational skills, Presentation skills, Time Management, Written communication, Multitasking, Documentation, Microsoft Office Suite

Posted 2024-11-19

๐Ÿ“ Kenya, Philippines, Mexico, India, United States

๐Ÿ” Financial services, fintech

Requirements:
  • Ability to collaborate effectively with various teams.
  • Passion for improving data accessibility and infrastructure.

Responsibilities:
  • Build and maintain the presentation layer to provide aggregated, clean, and trustworthy data.
  • Collaborate with other teams to seek ways to improve our data infrastructure.
  • Provide self-service and training to enable data analysts to explore insights using our data infrastructure.
  • Enhance data accessibility across the company.

🪄 Skills: PostgreSQL, Python, SQL, Data Analysis, Pandas, Communication Skills, Analytical Skills, Collaboration, Microsoft Excel, Problem Solving, Attention to detail, Organizational skills, Presentation skills, Time Management, Written communication, Documentation

Posted 2024-11-14

๐Ÿ“ US, UK, Philippines, Poland, South Africa

๐Ÿงญ Full-Time

๐Ÿ” Financial services / Remittance

๐Ÿข Company: Zepz

Requirements:
  • Comfortable with daily use of SQL in a modern cloud data warehouse environment.
  • Able to automate processes and deploy applications in Python, developing production standard scripts.
  • Confidence working with command line, version control, testing, and code reviews.
  • Problem-solver who understands business issues and communicates commercial impact.
  • An advocate for data-driven decision-making who strives to improve processes.
  • Familiarity with dbt for designing and implementing data models is a nice-to-have.
  • Open-minded with respect to diversity and inclusivity.

Responsibilities:
  • Building and maintaining data models to expose reliable data for analysis and reporting.
  • Communicating with analysts and business stakeholders to understand commercial requirements and translating them into technical solutions.
  • Developing standards and best practices for data consumption, including educating data consumers on data quality.
  • Identifying opportunities to reduce complexity and increase efficiency across data models and the data warehouse.
  • Ensuring data quality is high, including testing, automation, scalability, and documentation.
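A minimal sketch of the data-quality testing responsibility above, written by analogy with dbt's built-in `unique` and `not_null` tests. It runs against an in-memory SQLite table; the table and column names are invented for illustration and are not from the posting.

```python
import sqlite3

# Hypothetical table standing in for a warehouse model (not Zepz's actual schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER, country TEXT);
    INSERT INTO dim_customer VALUES (1, 'US'), (2, 'UK'), (3, 'PH');
""")

def assert_unique(conn, table, column):
    """Fail if any value in `column` appears more than once (analogous to dbt's unique test)."""
    dupes = conn.execute(
        f"SELECT {column}, COUNT(*) FROM {table} GROUP BY {column} HAVING COUNT(*) > 1"
    ).fetchall()
    assert not dupes, f"duplicate {column} values in {table}: {dupes}"

def assert_not_null(conn, table, column):
    """Fail if `column` contains NULLs (analogous to dbt's not_null test)."""
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    assert nulls == 0, f"{nulls} NULL {column} values in {table}"

assert_unique(conn, "dim_customer", "customer_id")
assert_not_null(conn, "dim_customer", "country")
print("all data-quality tests passed")
```

In a dbt project the same checks would be declared in a model's YAML file and run automatically as part of the pipeline, rather than hand-coded.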

🪄 Skills: AWS, Python, SQL, Kubernetes, Airflow

Posted 2024-11-09

๐Ÿ“ AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

๐Ÿงญ Full-Time

๐Ÿ’ธ 162500 - 202500 USD per year

๐Ÿ” Analytics and Postal Services

๐Ÿข Company: Lob

Requirements:
  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience with big data warehouse systems like Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating production systems using dbt and Python.
  • 3+ years of BI Software experience with analytics platforms like Looker, Power BI, or Tableau.
  • Empathy and effective communication skills to convey complex analytical issues.
  • Strong interpretive skills for deconstructing complex data into usable models.
  • Product mindset to build long-lasting data systems for insights.

Responsibilities:
  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control.
  • Foster a curiosity mindset and guide other teams to improve the quality of their metrics.
  • Champion data governance, security, privacy, and retention policies.
  • Support and mentor fellow engineers and data team members through various means.

🪄 Skills: Project Management, Python, SQL, DynamoDB, Elasticsearch, ETL, Snowflake, Tableau, Airflow, NoSQL, Communication Skills

Posted 2024-11-07

๐Ÿ“ United States

๐Ÿ” Financial technology

๐Ÿข Company: Forward Financing

Requirements:
  • 4+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence.
  • 2+ years of hands-on experience with dbt or equivalent experience using Python for data transformations.
  • 2+ years of experience with a cloud-based data warehouse such as Snowflake or Redshift.
  • Advanced proficiency in SQL and data modeling.
  • Strong ability to understand business requirements and translate them into technical solutions.
  • Experience with business intelligence tools such as Looker, Tableau, or similar.
  • Preferred: Experience with Python.

Responsibilities:
  • Design, build, and maintain scalable data models and marts using dbt.
  • Partner with the Data Science team to create variables for predictive models.
  • Collaborate with Data Engineering on deploying variables for production models.
  • Work with the Technology team on schema changes and data migrations.
  • Collaborate with DevOps to monitor our streaming dbt project for real-time use.
  • Ensure code quality by reviewing analysts’ pull requests.
  • Assist in evaluating and integrating third-party data sources.
  • Adhere to data governance standards including data quality checks and security compliance.

🪄 Skills: SQL, Business Intelligence, Snowflake, Tableau, Data engineering, Data science, DevOps, Compliance

Posted 2024-11-07

๐Ÿ“ US

๐Ÿ’ธ 144500 - 171000 USD per year

๐Ÿ” Password and identity management

๐Ÿข Company: LastPass

Requirements:
  • Experience in data engineering, analytics engineering, or a similar role, preferably in the cybersecurity or tech industry.
  • Proficiency with SQL and experience working with relational databases and data warehouses.
  • Familiarity with Python for data processing.
  • Experience with Kimball data modeling and designing scalable data architectures.
  • Excellent communication skills and the ability to collaborate with cross-functional teams.
  • Strong problem-solving skills, attention to detail, and a proactive mindset.
  • Strong proficiency with AWS Cloud Services (S3, Redshift, EC2, Lambda, RDS, etc.).
  • Experience with Terraform or CloudFormation for infrastructure as code in AWS.
  • Familiarity with CI/CD pipelines for automated data workflow deployment.
  • Experience working in an Agile environment.

Responsibilities:
  • Design and build scalable, reliable, and efficient ETL pipelines to process and transform data from various sources.
  • Collaborate with data scientists and analysts to ensure data is easily accessible, clean, and enriched for downstream analysis.
  • Develop and maintain the company's data warehouse and analytics infrastructure using best practices.
  • Partner with engineering teams to create and maintain data models that power key business metrics.
  • Contribute to data strategy focusing on quality, governance, and compliance.
  • Lead monitoring and alerting on data pipelines.
  • Optimize data system performance, reducing query times and ensuring data availability.
  • Mentor and provide technical guidance to junior analytics engineers.

🪄 Skills: SQL, ETL, Strategy, Communication Skills, Collaboration, Problem Solving, Attention to detail, Compliance

Posted 2024-11-07

๐Ÿ“ United States

๐Ÿงญ Internship

๐Ÿ” Visual collaboration software

๐Ÿข Company: Lucid Software - Extra Job Board

Requirements:
  • Currently pursuing a graduate or undergraduate degree, ideally in a technical or quantitative field.
  • Ability and desire to develop technical skills to work with large data sets using SQL, Python, or R.
  • Comfortable using SQL for data transformations.
  • Ability to communicate clearly about data to both technical and non-technical audiences.
  • Detail-oriented, organized, and a team player.
  • Passion for structure and efficiency, with attention to naming conventions and coding style.

Responsibilities:
  • Write complex, production-quality data transformation code to meet the needs of analysts, data scientists, and business stakeholders.
  • Implement effective data tests to ensure the accuracy and reliability of data and ELT pipelines.
  • Assist in coaching analysts on data modeling, SQL query structure and optimization, and software engineering best practices.
  • Collaborate with data engineers on implementing new systems/tools/processes and ingesting and modeling data from new sources.
  • Troubleshoot and resolve data issues as they arise.
  • Ensure thorough documentation of data, systems, business logic, and metrics.
  • Maintain the quality of the analytics codebase by managing old code and addressing tech debt.

🪄 Skills: SQL, Snowflake, Airflow, Collaboration, Documentation, Coaching

Posted 2024-11-07

๐Ÿ“ US

๐Ÿงญ Full-Time

๐Ÿ’ธ 140000 - 170000 USD per year

๐Ÿ” Healthcare

๐Ÿข Company: SmarterDx

Requirements:
  • 3+ years of analytics engineering experience in the healthcare industry, with clinical or billing/claims data.
  • Proficiency in SQL and ETL processes, with significant experience in dbt.
  • Experience in a general programming language (Python, Java, Scala, Ruby, etc.).
  • Strong experience in data modeling and implementing production data pipelines.
  • Familiarity with data orchestration essentials.

Responsibilities:
  • Designing, developing, and maintaining dbt data models that support healthcare analytics products.
  • Integrating and transforming customer data to meet data specifications and pipeline requirements.
  • Collaborating with cross-functional teams to translate data requirements into effective data models.
  • Configuring and improving data pipelines to ensure seamless integration.
  • Conducting QA and testing on data models for accuracy and reliability.
  • Applying industry standards and best practices in data modeling and testing.
  • Participating in issue resolution with other engineers.

🪄 Skills: AWS, Python, SQL, DynamoDB, ETL, QA, Snowflake, Airflow, Data engineering

Posted 2024-11-07

๐Ÿ“ United States

๐Ÿ’ธ 140000 - 170000 USD per year

๐Ÿ” Event ticketing

๐Ÿข Company: Gametime United

Requirements:
  • Bachelor’s degree in Data Engineering, Computer Science, Information Systems, or a related field.
  • 3+ years of experience in data engineering, analytics engineering, or business intelligence.
  • Proficiency with SQL, Snowflake, Python, and BI tools like Sigma or Tableau.
  • Hands-on experience with ETL tools such as Airflow or dbt.
  • Experience building self-service analytics capabilities is preferred.

Responsibilities:
  • Develop and enhance the Analytics Data Layer (ADL) for scalable data pipelines.
  • Build and maintain operational reports using Sigma and other BI tools.
  • Develop a deep understanding of business operations for effective modeling.
  • Collaborate with data engineers and analysts for reporting solutions.
  • Create intuitive data models and dashboards for self-service analytics.
  • Implement best practices for data modeling, governance, and ETL processes.

🪄 Skills: Python, SQL, Business Intelligence, ETL, Snowflake, Tableau, Airflow, Data engineering, Collaboration, Documentation

Posted 2024-11-07