📍 United States
💸 $115,000 - $130,000 per year
🔍 Advertising management
🏢 Company: Mediavine
- 3+ years of experience in a data engineering role.
- Strong Python skills (understanding of tradeoffs, optimization, etc.).
- Strong SQL skills (CTEs, window functions, optimization).
- Experience working in cloud environments (AWS preferred; GCP, Azure).
- An understanding of how best to structure data to enable internal- and external-facing analytics.
- Familiarity with calling APIs to retrieve data (authentication flows, filters, limits, pagination).
- Experience working with DevOps to deploy, scale and monitor data infrastructure.
- Scheduler experience, either traditional or DAG-based.
- Comfortable working with multi-TB cloud data warehouses (Snowflake preferred; Redshift, BigQuery).
- Experience with other database systems (Postgres in particular).
- Nice to have: experience with web analytics, such as creating data structures that support product funnels, user behavior, and decision-path analysis.
- Understanding of Snowflake external stages, file formats, and Snowpipe.
- Experience with orchestration tools particularly across different technologies and stacks.
- Experience with dbt.
- Knowledge of Ad Tech, Google Ad Manager and all its fun quirks.
- Familiarity with event tracking systems (New Relic, Snowplow, etc.).
- Experience with one or more major BI tools (Domo, Looker, PowerBI, etc).
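The API-pagination requirement above can be sketched as a small loop; the offset/limit scheme, page size, and injected fetch function are illustrative assumptions, since real APIs vary (cursor tokens, Link headers, rate limits, auth flows):

```python
def fetch_all_pages(fetch_page, page_size=100):
    """Collect every record from a paginated API.

    fetch_page(offset, limit) must return the list of records for
    that page. A short (or empty) page signals the final page.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:  # short page => no more data
            break
        offset += page_size
    return records

# Simulated API backed by an in-memory dataset (no network needed)
data = list(range(250))

def fake_fetch(offset, limit):
    return data[offset:offset + limit]

print(len(fetch_all_pages(fake_fetch)))  # prints 250
```

Injecting the fetch function keeps the pagination logic testable without a live endpoint; a real client would wrap an HTTP call with authentication headers.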
- Create data pipelines that make data available for analytic and application use cases.
- Develop self-healing, resilient processes that do not require constant care and feeding to run smoothly.
- Create meaningful data-quality notifications with clear actions for interested parties, including other internal teams and members of the data and analytics team.
- Lead projects from a technical standpoint, creating project Technical Design Documents.
- Support data analysts' and analytics engineers' ability to meet the needs of the organization.
- Participate in code reviews: understand coding standards, ensure test coverage, and stay aware of best practices.
- Build or implement tooling around data quality, governance, and lineage, within the dbt framework and Snowflake, and outside them as needed.
- Provide next-level support when data issues are discovered and communicated by the data analysts.
- Work with data analysts and analytics engineers to standardize transformation logic in the dbt layer for consistency and ease of exploration by end users.
- Enable analytics engineers and data analysts by providing data modeling guidance, query optimization and aggregation advice.
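The CTE and window-function skills the query-optimization and aggregation duties above rely on can be illustrated with a small self-contained query; the `ad_revenue` table and its columns are hypothetical, not Mediavine's actual schema:

```python
import sqlite3

# In-memory database with a toy revenue table
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ad_revenue (site_id TEXT, day TEXT, revenue REAL);
    INSERT INTO ad_revenue VALUES
        ('site_a', '2024-10-01', 100.0),
        ('site_a', '2024-10-02', 150.0),
        ('site_b', '2024-10-01', 80.0);
""")

# CTE aggregates to daily grain; the window function then computes a
# per-site running total without collapsing the rows.
query = """
    WITH daily AS (
        SELECT site_id, day, SUM(revenue) AS revenue
        FROM ad_revenue
        GROUP BY site_id, day
    )
    SELECT site_id, day, revenue,
           SUM(revenue) OVER (
               PARTITION BY site_id ORDER BY day
           ) AS running_total
    FROM daily
    ORDER BY site_id, day
"""
for row in conn.execute(query):
    print(row)
```

The same pattern translates directly to Snowflake or BigQuery; only the connection layer and dialect details change.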
AWS · Python · SQL · Snowflake · Data engineering · Postgres · DevOps
Posted 2024-10-13