Senior Analytics Engineer

Posted 2024-10-24

💎 Seniority level: Senior, 3+ years

📍 Location: United States

🔍 Industry: Technology/Analytics

🏢 Company: Fetch

🗣️ Languages: English

⏳ Experience: 3+ years

🪄 Skills: AWS, Python, SQL, Business Intelligence, DynamoDB, ETL, GCP, MongoDB, Snowflake, Tableau, Airflow, Azure, Postgres, Redis, NoSQL, CI/CD

Requirements:
  • 3+ years of professional experience in a technical role requiring advanced knowledge of SQL.
  • Understand the difference between SQL that works and SQL that performs (see the sketch after this list).
  • Experience with data modeling and orchestration tools.
  • Experience with relational (SQL) and non-relational (NoSQL) databases.
  • Experience with a range of data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB).
  • Understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools.
  • Experience clearly communicating about data with internal and external stakeholders from both technical and nontechnical backgrounds.
  • Ability to thrive in a highly autonomous, matrixed organization and manage multiple, concurrent work streams.
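
By way of illustration (table and column names here are hypothetical, not from the posting): the difference between SQL that works and SQL that performs often comes down to replacing a per-row correlated subquery with a single window pass.

```sql
-- Hypothetical table: orders(order_id, user_id, total, created_at).

-- Works: correct results, but the subquery re-scans orders for every row.
SELECT o.user_id,
       o.total,
       (SELECT MAX(o2.created_at)
          FROM orders o2
         WHERE o2.user_id = o.user_id) AS last_order_at
FROM orders o;

-- Performs: the same result in one pass, using a window function.
SELECT user_id,
       total,
       MAX(created_at) OVER (PARTITION BY user_id) AS last_order_at
FROM orders;
```
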
Responsibilities:
  • Model and analyze data utilizing SQL best practices for OLAP / OLTP query and database performance.
  • Leverage Data Build Tool (DBT), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices (see the sketch after this list).
  • Generate innovative approaches to datasets with millions of daily active users and terabytes of data.
  • Translate business requirements for near-real-time actionable insights into data models and artifacts.
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders.
  • Perform administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure.
  • Test, monitor, and report on data health and data quality.
  • Lead the charge on data documentation and data discovery initiatives.
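
A minimal sketch of the DBT-on-Snowflake workflow these responsibilities describe; the model, test, and column names are hypothetical.

```sql
-- models/marts/fct_daily_active_users.sql (hypothetical model name).
-- {{ ref() }} lets DBT build the dependency graph and compile to Snowflake SQL.
SELECT
    event_date,
    COUNT(DISTINCT user_id) AS daily_active_users
FROM {{ ref('stg_events') }}
GROUP BY event_date

-- tests/assert_dau_positive.sql (hypothetical singular test).
-- DBT marks the test as failed if this query returns any rows.
SELECT event_date, daily_active_users
FROM {{ ref('fct_daily_active_users') }}
WHERE daily_active_users <= 0
```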

Related Jobs

📍 AZ, CA, CO, DC, FL, GA, IA, IL, MA, MD, MI, MN, NE, NC, NH, NJ, NV, NY, OH, OR, PA, RI, TN, TX, UT, WA

🧭 Full-Time

💸 162,500 - 202,500 USD per year

🔍 Analytics and Postal Services

🏢 Company: Lob

  • 5+ years of Analytics Engineering experience: ETL/ELT, OLAP modeling, data visualization, and data governance.
  • 5+ years of SQL experience with big data warehouse systems like Redshift, Snowflake, or BigQuery.
  • 3+ years of experience operating production systems using dbt and Python.
  • 3+ years of BI Software experience with analytics platforms like Looker, Power BI, or Tableau.
  • Empathy and effective communication skills to convey complex analytical issues.
  • Strong interpretive skills for deconstructing complex data into usable models.
  • Product mindset to build long-lasting data systems for insights.

  • Partner with stakeholders to identify problems, create insights, and develop durable data solutions.
  • Exemplify analytics engineering best practices, such as modularity, testing, cataloging, and version control (see the sketch after this list).
  • Foster a curiosity mindset and guide other teams to improve the quality of their metrics.
  • Champion data governance, security, privacy, and retention policies.
  • Support and mentor fellow engineers and data team members through various means.
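
As one concrete (and hypothetical) example of the modularity practice above: in dbt projects, each raw source table typically gets its own version-controlled staging model that does nothing but rename and cast.

```sql
-- models/staging/stg_orders.sql (hypothetical): one staging model per source
-- table keeps renames and casts in a single reviewable, testable place.
SELECT
    id                            AS order_id,
    customer_id,
    status,
    CAST(created_at AS TIMESTAMP) AS created_at
FROM {{ source('app_db', 'orders') }}
```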

Project Management, Python, SQL, DynamoDB, Elasticsearch, ETL, Snowflake, Tableau, Airflow, NoSQL, Communication Skills

Posted 2024-11-07

📍 United States

🔍 Financial technology

🏢 Company: Forward Financing

  • 4+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence.
  • 2+ years of hands-on experience with dbt or equivalent experience using Python for data transformations.
  • 2+ years of experience with a cloud-based data warehouse such as Snowflake or Redshift.
  • Advanced proficiency in SQL and data modeling.
  • Strong ability to understand business requirements and translate them into technical solutions.
  • Experience with business intelligence tools such as Looker, Tableau, or similar.
  • Preferred: Experience with Python.

  • Design, build, and maintain scalable data models and marts using dbt.
  • Partner with the Data Science team to create variables for predictive models.
  • Collaborate with Data Engineering on deploying variables for production models.
  • Work with the Technology team on schema changes and data migrations.
  • Collaborate with DevOps to monitor our streaming dbt project for real-time use.
  • Ensure code quality by reviewing analysts’ pull requests.
  • Assist in evaluating and integrating third-party data sources.
  • Adhere to data governance standards, including data quality checks and security compliance (see the sketch below).
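
A sketch of one such data quality check, with hypothetical table names: reconciling daily row counts between a raw source and the mart built from it.

```sql
-- Hypothetical tables: raw_payments (source) and fct_payments (mart).
-- Any rows returned signal a load gap that review or CI should catch.
SELECT s.load_date, s.src_rows, m.mart_rows
FROM (
    SELECT CAST(created_at AS DATE) AS load_date, COUNT(*) AS src_rows
    FROM raw_payments
    GROUP BY 1
) s
LEFT JOIN (
    SELECT CAST(created_at AS DATE) AS load_date, COUNT(*) AS mart_rows
    FROM fct_payments
    GROUP BY 1
) m ON m.load_date = s.load_date
WHERE COALESCE(m.mart_rows, 0) <> s.src_rows;
```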

SQL, Business Intelligence, Snowflake, Tableau, Data Engineering, Data Science, DevOps, Compliance

Posted 2024-11-07

📍 United States

🧭 Full-Time

🔍 SaaS

🏢 Company: Apollo.io

  • Experience with relevant tools in data engineering and analytics.
  • Sound understanding of the SaaS industry, especially regarding LTV:CAC, activation levers, and conversion rates (see the sketch after this list).
  • Ability to approach new projects objectively without bias.
  • Great time management skills.
  • Excellent written and oral communication skills for summarizing insights and designing data assets.
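
For reference, LTV:CAC compares average customer lifetime value to average acquisition cost. A rough sketch in SQL, with entirely hypothetical tables:

```sql
-- Hypothetical tables: customer_revenue(customer_id, cohort_month, lifetime_revenue)
-- and marketing_spend(cohort_month, spend, customers_acquired).
SELECT
    r.cohort_month,
    AVG(r.lifetime_revenue)                   AS avg_ltv,
    MAX(s.spend) / MAX(s.customers_acquired)  AS avg_cac,
    AVG(r.lifetime_revenue)
      / (MAX(s.spend) / MAX(s.customers_acquired)) AS ltv_to_cac
FROM customer_revenue r
JOIN marketing_spend s ON s.cohort_month = r.cohort_month
GROUP BY r.cohort_month;
```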

  • Design, develop, and maintain essential data models for various business functions to ensure consistency and accuracy.
  • Define and implement data engineering standards and best practices, providing guidance to other teams.
  • Collaborate with multiple business areas to understand their requirements and deliver scalable data solutions.
  • Influence the future direction of analytics infrastructure, offering strategic insights to drive business impact.

Business Intelligence, Snowflake, Strategy, Data Engineering, Communication Skills, Collaboration

Posted 2024-10-21

📍 USA, UK, Philippines, Poland, South Africa

🧭 Permanent

🔍 Finance and technology (remittances)

🏢 Company: Zepz

  • Experience with DBT to design and implement data models.
  • Experience using a modern data warehouse (e.g., Databricks, Snowflake, BigQuery) for scalable and efficient data processing.
  • Proficiency in SQL and experience with database technologies (e.g., PostgreSQL, MySQL, Redshift).
  • You work confidently with the tools of modern software engineering: the command line, version control, testing, and code reviews.
  • You have prior experience with orchestration tools such as Airflow, DBT Cloud, or Fivetran.
  • Familiarity with data visualization tools (e.g., Mode, Tableau, Looker).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
  • You’re comfortable with, or interested in, reading and extracting data transformation processes from Python scripts.
  • You see yourself as a problem-solver who wants to understand the business problem and communicate the commercial impact alongside the technical solution.
  • Strong communication skills and the ability to work effectively in a distributed team environment.
  • You are comfortable collaborating across multiple time zones.
  • You have an open mind with respect to diversity and inclusivity.

  • Developing, testing, and implementing data models to ensure data integrity and performance using DBT.
  • Collaborating with cross-functional teams to understand data needs and develop solutions to meet those needs.
  • Contributing to and following best practices for data consumption, including educating data consumers on data quality, availability, and interpretation.
  • Optimizing existing data processes and pipelines to improve efficiency and reduce latency (see the sketch after this list).
  • Identifying opportunities to reduce complexity and increase efficiency across data models and our data warehouse.
  • Troubleshooting and resolving data issues, ensuring data quality and reliability.
  • Ensuring the data and output are of high quality: tested, automated, scalable, and documented.
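
One common latency and efficiency lever, sketched with hypothetical names: switching a fully rebuilt DBT model to incremental materialization so each run processes only new rows.

```sql
-- models/marts/fct_transfers.sql (hypothetical incremental DBT model).
{{ config(materialized='incremental', unique_key='transfer_id') }}

SELECT
    transfer_id,
    user_id,
    amount,
    created_at
FROM {{ ref('stg_transfers') }}
{% if is_incremental() %}
  -- On incremental runs, pick up only rows newer than what is already loaded.
  WHERE created_at > (SELECT MAX(created_at) FROM {{ this }})
{% endif %}
```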

AWS, PostgreSQL, Python, SQL, GCP, MySQL, Snowflake, Tableau, Airflow, Azure, Communication Skills, Problem Solving

Posted 2024-10-17