Analytics Engineer

Posted 2024-11-07

πŸ’Ž Seniority level: Mid-level, 3+ years

πŸ“ Location: United States

πŸ’Έ Salary: 132813 - 179688 USD per year

πŸ” Industry: Gaming

🏒 Company: Rec Room

⏳ Experience: 3+ years

πŸͺ„ Skills: Python, Software Development, SQL, Data Analysis, Java, JavaScript

Requirements:
  • 3+ years of SQL development experience; experience with Databricks SQL Warehouse preferred.
  • Solid proficiency in SQL, including query optimization skills.
  • Experience with pipeline orchestration tools, preferably dbt and Prefect.
  • Experience with dashboarding tools; experience with Sigma and Hex is a plus.
  • Familiarity with event-based data handling processes is advantageous.
  • Experience with Python and JavaScript is a bonus.
Responsibilities:
  • Partner with PMs, designers, engineers, and analysts/data scientists to develop the data layer for analysis and product features.
  • Maintain ELT pipelines using tools like RudderStack and dbt.
  • Develop data warehouse models on Databricks SQL Warehouse (a brief sketch follows this list).
  • Apply software engineering practices for dataset quality.
  • Design dashboards for company performance.
  • Build experimentation datasets for A/B testing.
  • Collaborate with ML teams for data model deployment.
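To give a concrete flavor of the modeling work described above, here is a minimal sketch of a dbt-style incremental model targeting Databricks SQL Warehouse. It is illustrative only: the source, table, and column names (rudderstack, raw_events, occurred_at, user_id, event_name) are hypothetical and not taken from the posting.

```sql
-- models/marts/fct_events_daily.sql
-- Hypothetical dbt model: rolls raw event data up to one row per user,
-- event, and day. All source, table, and column names are illustrative.
{{ config(
    materialized = 'incremental',
    unique_key = ['event_date', 'user_id', 'event_name']
) }}

select
    to_date(occurred_at) as event_date,
    user_id,
    event_name,
    count(*)             as event_count
from {{ source('rudderstack', 'raw_events') }}
{% if is_incremental() %}
  -- only reprocess events newer than what the table already contains
  where to_date(occurred_at) > (select max(event_date) from {{ this }})
{% endif %}
group by 1, 2, 3
```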

Related Jobs

πŸ“ Salt Lake City, UT

🧭 Internship

πŸ” Visual Collaboration

🏒 Company: Lucid Software

  • Currently pursuing a graduate or undergraduate degree in a technical or quantitative field.
  • Ability and desire to develop technical skills for handling large data sets (SQL, Python, R).
  • Comfort with using SQL for data transformations.
  • Ability to learn version control workflows and modern programming languages.
  • Clear communication skills for both technical and non-technical audiences.
  • Detail-oriented, organized, and a good team player.
  • Passionate about structure, organization, and coding efficiency.

  • Write complex, production-quality data transformation code to meet the needs of analysts and stakeholders.
  • Implement effective data tests to ensure the accuracy of data and ELT pipelines (see the sketch after this list).
  • Coach analysts on data modeling, SQL optimization, and best practices.
  • Maintain the architecture of the data warehouse.
  • Collaborate with data engineers on ingesting new data sources and processing data between systems.
  • Troubleshoot and resolve data issues.
  • Document data, systems, business logic, and metrics.
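As a rough illustration of the data-testing responsibility above, the sketch below shows a dbt-style singular test written in plain SQL. The model name fct_daily_signups and its columns are hypothetical; the test fails whenever the query returns any rows.

```sql
-- tests/assert_daily_signups_valid.sql
-- Hypothetical dbt singular test: selects rows that violate expectations,
-- so the test passes only when zero rows are returned.
select
    signup_date,
    signup_count
from {{ ref('fct_daily_signups') }}
where signup_count < 0
   or signup_date is null
```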

SQL, Snowflake, Airflow, Collaboration, Documentation, Coaching

Posted 2024-11-19

πŸ“ U.S.

πŸ” Restaurant industry

Requirements not stated.

  • Developing and maintaining data models to support reporting and analysis.
  • Focusing initially on the go-to-market function.
  • Helping drive performance management and key decisions regarding complex implementations.
  • Establishing baselines for robust data models that can answer a wide range of questions and support growth at Olo.

SQL, Data Analysis, Communication Skills, Analytical Skills, Collaboration, Microsoft Excel, Problem Solving, Microsoft Office, Attention to Detail, Organizational Skills, Presentation Skills, Time Management, Written Communication, Multitasking, Documentation, Microsoft Office Suite

Posted 2024-11-19

πŸ“ Kenya, Philippines, Mexico, India, United States

πŸ” Financial services, fintech

  • Ability to collaborate effectively with various teams.
  • Passion for improving data accessibility and infrastructure.

  • Build and maintain the presentation layer to provide aggregated, clean, and trustworthy data.
  • Collaborate with other teams to seek ways to improve our data infrastructure.
  • Provide self-service and training to enable data analysts to explore insights using our data infrastructure.
  • Enhance data accessibility across the company.

PostgreSQL, Python, SQL, Data Analysis, Pandas, Communication Skills, Analytical Skills, Collaboration, Microsoft Excel, Problem Solving, Attention to Detail, Organizational Skills, Presentation Skills, Time Management, Written Communication, Documentation

Posted 2024-11-14

πŸ“ US, UK, Philippines, Poland, South Africa

🧭 Full-Time

πŸ” Financial services / Remittance

🏒 Company: Zepz

  • Comfortable with daily use of SQL in a modern cloud data warehouse environment.
  • Able to automate processes and deploy applications in Python, developing production standard scripts.
  • Confidence working with command line, version control, testing, and code reviews.
  • Problem-solver who understands business issues and communicates commercial impact.
  • Advocates for data-driven decision-making and strives to improve processes.
  • Familiarity with dbt for designing and implementing data models is a nice-to-have.
  • Open-minded with respect to diversity and inclusivity.

  • Building and maintaining data models to expose reliable data for analysis and reporting.
  • Communicating with analysts and business stakeholders to understand commercial requirements and translating them into technical solutions.
  • Developing standards and best practices for data consumption, including educating data consumers on data quality.
  • Identifying opportunities to reduce complexity and increase efficiency across data models and the data warehouse.
  • Ensuring data quality is high, including testing, automation, scalability, and documentation.

AWS, Python, SQL, Kubernetes, Airflow

Posted 2024-11-09

πŸ“ AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

🧭 Full-Time

πŸ’Έ 162500 - 202500 USD per year

πŸ” Analytics and Postal Services

🏒 Company: Lob

  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience with big data warehouse systems like Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating production systems using dbt and Python.
  • 3+ years of BI Software experience with analytics platforms like Looker, Power BI, or Tableau.
  • Empathy and effective communication skills to convey complex analytical issues.
  • Strong interpretive skills for deconstructing complex data into usable models.
  • Product mindset to build long-lasting data systems for insights.

  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control.
  • Foster a curiosity mindset and guide other teams to improve the quality of their metrics.
  • Champion data governance, security, privacy, and retention policies.
  • Support and mentor fellow engineers and data team members.

Project Management, Python, SQL, DynamoDB, Elasticsearch, ETL, Snowflake, Tableau, Airflow, NoSQL, Communication Skills

Posted 2024-11-07

πŸ“ United States

πŸ” Financial technology

🏒 Company: Forward Financing

  • 4+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence.
  • 2+ years of hands-on experience with dbt or equivalent experience using Python for data transformations.
  • 2+ years of experience with a cloud-based data warehouse such as Snowflake or Redshift.
  • Advanced proficiency in SQL and data modeling.
  • Strong ability to understand business requirements and translate them into technical solutions.
  • Experience with business intelligence tools such as Looker, Tableau, or similar.
  • Preferred: Experience with Python.

  • Design, build, and maintain scalable data models and marts using dbt.
  • Partner with the Data Science team to create variables for predictive models.
  • Collaborate with Data Engineering on deploying variables for production models.
  • Work with the Technology team on schema changes and data migrations.
  • Collaborate with DevOps to monitor our streaming dbt project for real-time use.
  • Ensure code quality by reviewing analysts’ pull requests.
  • Assist in evaluating and integrating third-party data sources.
  • Adhere to data governance standards including data quality checks and security compliance.

SQL, Business Intelligence, Snowflake, Tableau, Data Engineering, Data Science, DevOps, Compliance

Posted 2024-11-07

πŸ“ US

πŸ’Έ 144500 - 171000 USD per year

πŸ” Password and identity management

🏒 Company: LastPass

  • Experience in data engineering, analytics engineering, or a similar role, preferably in the cybersecurity or tech industry.
  • Proficiency with SQL and experience working with relational databases and data warehouses.
  • Familiarity with Python for data processing.
  • Experience with Kimball data modeling and designing scalable data architectures.
  • Excellent communication skills and the ability to collaborate with cross-functional teams.
  • Strong problem-solving skills, attention to detail, and a proactive mindset.
  • Strong proficiency with AWS Cloud Services (S3, Redshift, EC2, Lambda, RDS, etc.).
  • Experience with Terraform or CloudFormation for infrastructure as code in AWS.
  • Familiarity with CI/CD pipelines for automated data workflow deployment.
  • Experience working in an Agile environment.

  • Design and build scalable, reliable, and efficient ETL pipelines to process and transform data from various sources.
  • Collaborate with data scientists and analysts to ensure data is easily accessible, clean, and enriched for downstream analysis.
  • Develop and maintain the company's data warehouse and analytics infrastructure using best practices.
  • Partner with engineering teams to create and maintain data models that power key business metrics.
  • Contribute to data strategy focusing on quality, governance, and compliance.
  • Lead monitoring and alerting on data pipelines.
  • Optimize data system performance, reducing query times and ensuring data availability (a rough sketch follows this list).
  • Mentor and provide technical guidance to junior analytics engineers.
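One common way to reduce query times, in the spirit of the optimization responsibility above, is to pre-aggregate frequently queried event data into a summary table. The sketch below is Redshift-flavored and purely illustrative; the schema, table, and column names (raw.login_events, logged_in_at, analytics.daily_logins_summary) are hypothetical.

```sql
-- Hypothetical Redshift-style pre-aggregation: roll raw login events up to
-- one row per user per day so dashboards scan far less data than the raw table.
create table analytics.daily_logins_summary
  distkey (user_id)
  sortkey (login_date)
as
select
    user_id,
    date_trunc('day', logged_in_at)::date as login_date,
    count(*)                               as login_count
from raw.login_events
group by 1, 2;
```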

SQL, ETL, Strategy, Communication Skills, Collaboration, Problem Solving, Attention to Detail, Compliance

Posted 2024-11-07

πŸ“ United States

🧭 Internship

πŸ” Visual collaboration software

🏒 Company: Lucid Software - Extra Job Board

  • Currently pursuing graduate or undergraduate degree, ideally in a technical or quantitative field.
  • Ability and desire to develop technical skills to work with large data sets using SQL, Python, or R.
  • Comfortable using SQL for data transformations.
  • Ability to communicate clearly about data to both technical and non-technical audiences.
  • Detail-oriented, organized, and a team player.
  • Passion for structure and efficiency, with attention to naming conventions and coding style.

  • Write complex, production-quality data transformation code to meet the needs of analysts, data scientists, and business stakeholders.
  • Implement effective data tests to ensure the accuracy and reliability of data and ELT pipelines.
  • Assist in coaching analysts on data modeling, SQL query structure and optimization, and software engineering best practices.
  • Collaborate with data engineers on implementing new systems/tools/processes and ingesting and modeling data from new sources.
  • Troubleshoot and resolve data issues as they arise.
  • Ensure thorough documentation of data, systems, business logic, and metrics.
  • Maintain the quality of the analytics codebase by managing old code and addressing tech debt.

SQL, Snowflake, Airflow, Collaboration, Documentation, Coaching

Posted 2024-11-07

πŸ“ United States

πŸ’Έ 140000 - 170000 USD per year

πŸ” Event ticketing

🏒 Company: Gametime United

  • Bachelor’s degree in Data Engineering, Computer Science, Information Systems, or related field.
  • 3+ years of experience in data engineering, analytics engineering, or business intelligence.
  • Proficiency with SQL, Snowflake, Python, and BI tools like Sigma or Tableau.
  • Hands-on experience with ETL tools such as Airflow or dbt.
  • Experience building self-service analytics capabilities is preferred.

  • Develop and enhance the Analytics Data Layer (ADL) for scalable data pipelines.
  • Build and maintain operational reports using Sigma and other BI tools.
  • Develop a deep understanding of business operations for effective modeling.
  • Collaborate with data engineers and analysts for reporting solutions.
  • Create intuitive data models and dashboards for self-service analytics.
  • Implement best practices for data modeling, governance, and ETL processes.

Python, SQL, Business Intelligence, ETL, Snowflake, Tableau, Airflow, Data Engineering, Collaboration, Documentation

Posted 2024-11-07

πŸ“ United States, Canada

πŸ” SaaS for restaurants and local service-based businesses

  • Proficiency in dbt and Snowflake.
  • Experience with building and maintaining data pipelines.
  • Ability to enhance data infrastructure.

  • Play a pivotal role in shaping the data landscape and driving strategic use of data.
  • Be instrumental in building Single Source of Truth (SSOT) models using dbt and Snowflake.
  • Maintain robust and scalable data pipelines.
  • Enhance data infrastructure to support analytics and data-driven decision-making.

SQL, Snowflake, Analytical Skills

Posted 2024-10-30