Senior Analytics Engineer

Posted 2024-11-07

💎 Seniority level: Senior

📍 Location: AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

💸 Salary: 162,500 - 202,500 USD per year

🔍 Industry: Analytics and Postal Services

🏢 Company: Lob

⏳ Experience: 5+ years of Analytics Engineering experience, 5+ years of SQL experience, 3+ years of experience with dbt and Python, 3+ years of BI Software experience

🪄 Skills: Project Management, Python, SQL, DynamoDB, Elasticsearch, ETL, Snowflake, Tableau, Airflow, NoSQL, Communication Skills

Requirements:
  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience with big data warehouse systems like Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating production systems using dbt and Python.
  • 3+ years of BI Software experience with analytics platforms like Looker, Power BI, or Tableau.
  • Empathy and effective communication skills to convey complex analytical issues.
  • Strong interpretive skills for deconstructing complex data into usable models.
  • Product mindset to build long-lasting data systems for insights.
Responsibilities:
  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control (see the sketch after this list).
  • Foster a curiosity mindset and guide other teams to improve the quality of their metrics.
  • Champion data governance, security, privacy, and retention policies.
  • Support and mentor fellow engineers and data team members through various means.
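
The best practices named above have a fairly standard shape in a dbt project. Below is a minimal, hypothetical sketch (model and column names are invented, not Lob's): modularity comes from building marts on staged models via ref(), and the resulting lineage is what cataloging and version control operate on.

```sql
-- models/marts/fct_orders.sql -- hypothetical dbt mart model
-- Modularity: depends only on upstream staging models via ref(),
-- so dbt derives the DAG and rebuilds dependencies in order.
{{ config(materialized='table') }}

select
    o.order_id,
    o.customer_id,
    o.ordered_at,
    sum(li.amount_usd) as order_total_usd
from {{ ref('stg_orders') }} as o
left join {{ ref('stg_order_line_items') }} as li
    on o.order_id = li.order_id
group by 1, 2, 3
```

Tests (e.g., unique and not_null on order_id) and catalog descriptions would live in an accompanying schema.yml, with the whole project under version control.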
Related Jobs

📍 United States

🔍 Financial technology

🏢 Company: Forward Financing

Requirements:
  • 4+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence.
  • 2+ years of hands-on experience with dbt or equivalent experience using Python for data transformations.
  • 2+ years of experience with a cloud-based data warehouse such as Snowflake or Redshift.
  • Advanced proficiency in SQL and data modeling.
  • Strong ability to understand business requirements and translate them into technical solutions.
  • Experience with business intelligence tools such as Looker, Tableau, or similar.
  • Preferred: Experience with Python.

Responsibilities:
  • Design, build, and maintain scalable data models and marts using dbt.
  • Partner with the Data Science team to create variables for predictive models.
  • Collaborate with Data Engineering on deploying variables for production models.
  • Work with the Technology team on schema changes and data migrations.
  • Collaborate with DevOps to monitor our streaming dbt project for real-time use.
  • Ensure code quality by reviewing analysts' pull requests.
  • Assist in evaluating and integrating third-party data sources.
  • Adhere to data governance standards including data quality checks and security compliance.
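
One concrete form the data quality checks above can take is a singular dbt test: a SQL file that selects violating rows, failing the build if any are returned. A sketch with invented table and column names:

```sql
-- tests/assert_no_negative_funded_amounts.sql -- hypothetical singular dbt test
-- dbt executes this query; the test fails if it returns any rows.
select
    deal_id,
    funded_amount_usd
from {{ ref('fct_deals') }}
where funded_amount_usd < 0
```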

🪄 Skills: SQL, Business Intelligence, Snowflake, Tableau, Data Engineering, Data Science, DevOps, Compliance

Posted 2024-11-07

📍 United States

🧭 Full-Time

🔍 Technology/Analytics

🏢 Company: Fetch

Requirements:
  • 3+ years of professional experience in a technical role requiring advanced knowledge of SQL.
  • Understand the difference between SQL that works and SQL that performs (see the sketch after this list).
  • Experience with data modeling and orchestration tools.
  • Experience with relational (SQL) and non-relational (NoSQL) databases.
  • Experience with a range of data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB).
  • Understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools.
  • Experience clearly communicating about data with internal and external stakeholders from both technical and nontechnical backgrounds.
  • Ability to thrive in a highly autonomous, matrixed organization and manage multiple, concurrent work streams.
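
To make the works-versus-performs distinction concrete, here is one illustrative pattern over a hypothetical events table: a correlated subquery that re-aggregates per row, versus a single windowed pass. Both return each user's latest event, but on large tables the first form often degrades badly.

```sql
-- Works, but slow: the inner query re-scans events for every outer row.
select e.user_id, e.event_at
from events e
where e.event_at = (
    select max(e2.event_at)
    from events e2
    where e2.user_id = e.user_id
);

-- Performs: one pass with a window function, then a filter.
select user_id, event_at
from (
    select user_id,
           event_at,
           row_number() over (partition by user_id order by event_at desc) as rn
    from events
) latest
where rn = 1;
```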

Responsibilities:
  • Model and analyze data utilizing SQL best practices for OLAP/OLTP query and database performance.
  • Leverage Data Build Tool (DBT), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices.
  • Generate innovative approaches to datasets with millions of daily active users and terabytes of data.
  • Translate business requirements for near-real-time actionable insights into data models and artifacts (see the incremental-model sketch after this list).
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders.
  • Perform administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure.
  • Test, monitor, and report on data health and data quality.
  • Lead the charge on data documentation and data discovery initiatives.
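
Near-real-time models like those above are often expressed as dbt incremental models, so each run processes only new rows. A minimal sketch with invented names:

```sql
-- models/marts/fct_user_events.sql -- hypothetical incremental model
{{ config(materialized='incremental', unique_key='event_id') }}

select
    event_id,
    user_id,
    event_type,
    event_at
from {{ ref('stg_events') }}

{% if is_incremental() %}
-- On incremental runs, only pull events newer than what is already loaded.
where event_at > (select max(event_at) from {{ this }})
{% endif %}
```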

🪄 Skills: AWS, Python, SQL, Business Intelligence, DynamoDB, ETL, GCP, MongoDB, Snowflake, Tableau, Airflow, Azure, Postgres, Redis, NoSQL, CI/CD

Posted 2024-10-24

📍 United States

🧭 Full-Time

🔍 SaaS

🏢 Company: Apollo.io

Requirements:
  • Experience with relevant tools in data engineering and analytics.
  • Sound understanding of the SaaS industry, especially LTV:CAC, activation levers, and conversion rates (see the sketch after this list).
  • Ability to approach new projects objectively without bias.
  • Great time management skills.
  • Excellent written and oral communication skills for summarizing insights and designing data assets.
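
For illustration, LTV:CAC reduces to a ratio of two aggregates. The sketch below uses invented tables and a deliberately simple LTV proxy (lifetime revenue per customer); real definitions vary by company.

```sql
-- Hypothetical LTV:CAC by acquisition month.
with ltv as (
    select c.signup_month,
           avg(r.lifetime_revenue_usd) as avg_ltv
    from customers c
    join customer_revenue r on r.customer_id = c.customer_id
    group by 1
),
cac as (
    select month as signup_month,
           sum(spend_usd) / nullif(sum(customers_acquired), 0) as avg_cac
    from marketing_spend
    group by 1
)
select l.signup_month,
       l.avg_ltv,
       c.avg_cac,
       l.avg_ltv / nullif(c.avg_cac, 0) as ltv_to_cac
from ltv l
join cac c using (signup_month)
order by 1;
```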

Responsibilities:
  • Design, develop, and maintain essential data models for various business functions to ensure consistency and accuracy.
  • Define and implement data engineering standards and best practices, providing guidance to other teams.
  • Collaborate with multiple business areas to understand their requirements and deliver scalable data solutions.
  • Influence the future direction of analytics infrastructure, offering strategic insights to drive business impact.

🪄 Skills: Business Intelligence, Snowflake, Strategy, Data Engineering, Communication Skills, Collaboration

Posted 2024-10-21

📍 United States

🧭 Full-Time

🔍 Software Development

🏢 Company: Appfire Technologies, LLC.

Requirements:
  • 5+ years of relevant industry experience.
  • Strong SQL skills for data modeling and query optimization.
  • 3+ years of experience with Python and DBT.
  • Passion for high data quality and impactful analytics engineering work.
  • Strong understanding of data warehousing concepts and best practices.
  • Efficient problem-solving and analytical skills.
  • Ability to work in both collaborative and independent environments.
  • Experience managing concurrent complex data projects.
  • Excellent communication and documentation skills.

Responsibilities:
  • Understand decision needs by interfacing with Analytics Engineers, Data Engineers, Data Analysts, and Business Partners.
  • Architect, build, and launch efficient & reliable data models and pipelines with Data Engineering, using DBT and Snowflake.
  • Design and implement metrics and dimensions for self-service analysis and structured KPI views.
  • Define and share best practices for metrics, dimension, and data model development.

🪄 Skills: Python, SQL, Snowflake, Data Engineering, Analytical Skills, Collaboration

Posted 2024-10-19

📍 USA, UK, Philippines, Poland, South Africa

🧭 Permanent

🔍 Finance and technology (remittances)

🏢 Company: Zepz

Requirements:
  • Experience with DBT to design and implement data models.
  • Experience using a modern data warehouse (e.g., Databricks, Snowflake, BigQuery) for scalable and efficient data processing.
  • Proficiency in SQL and experience with database technologies (e.g., PostgreSQL, MySQL, Redshift).
  • Ability to work confidently with the tools of modern software engineering: the command line, version control, testing, and code reviews.
  • Previous use of orchestration tools such as Airflow, DBT Cloud, or Fivetran.
  • Familiarity with data visualization tools (e.g., Mode, Tableau, Looker).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
  • You’re comfortable, or have an interest in, reading and extracting data transformation processes from Python scripts.
  • You see yourself as a problem-solver who wants to understand the business problem and communicate the commercial impact alongside the technical solution.
  • Strong communication skills and the ability to work effectively in a distributed team environment.
  • You are comfortable collaborating across multiple time zones.
  • You have an open mind with respect to diversity and inclusivity.

Responsibilities:
  • Developing, testing, and implementing data models to ensure data integrity and performance using DBT.
  • Collaborating with cross-functional teams to understand data needs and develop solutions to meet those needs.
  • Contributing to and following best practices for data consumption, including educating data consumers on data quality, availability and interpretation.
  • Optimizing existing data processes and pipelines to improve efficiency and reduce latency (see the rewrite sketch after this list).
  • Identifying opportunities to reduce complexity and increase efficiency across data models and our data warehouse.
  • Troubleshooting and resolving data issues, ensuring data quality and reliability.
  • Ensuring the data and output are of high quality: tested, automated, scalable, and documented.
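
On the latency point above, one common optimization is pruning columns and pre-aggregating the large fact table before a join rather than after. An illustrative rewrite with invented tables (Snowflake-style date function):

```sql
-- Before: join the full fact table, then aggregate (wide scan, large join).
select c.region, sum(t.amount_usd) as total_sent
from transactions t
join customers c on c.customer_id = t.customer_id
where t.sent_at >= dateadd('day', -7, current_date)
group by 1;

-- After: aggregate first, then join the much smaller result.
with recent as (
    select customer_id, sum(amount_usd) as total_sent
    from transactions
    where sent_at >= dateadd('day', -7, current_date)
    group by 1
)
select c.region, sum(recent.total_sent) as total_sent
from recent
join customers c on c.customer_id = recent.customer_id
group by 1;
```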

🪄 Skills: AWS, PostgreSQL, Python, SQL, GCP, MySQL, Snowflake, Tableau, Airflow, Azure, Communication Skills, Problem Solving

Posted 2024-10-17