Data Engineer II (Remote - US)

Posted 2 months ago

💎 Seniority level: Middle, 4+ years

📍 Location: United States

💸 Salary: 110,000 - 130,000 USD per year

🔍 Industry: Energy solutions

🗣️ Languages: English

⏳ Experience: 4+ years

🪄 Skills: AWS, Python, SQL, ETL, Data engineering, Communication Skills, Analytical Skills, Documentation

Requirements:
  • Bachelor's degree in Computer Science, Physics/Engineering, Business, or Mathematics.
  • Experience building ETL pipelines; real-time pipelines are a plus.
  • Proficiency in Python and SQL.
  • At least 4 years of experience in data warehouse development with a strong foundation in dimensional data modeling.
  • 4+ years of experience in SQL query creation and ETL design, implementation, and maintenance.
  • 4+ years of experience developing data pipelines in Python and with AWS services like S3, Redshift, and RDS.
  • Strong analytical skills with experience handling diverse datasets.
  • Excellent oral, written, and interpersonal communication skills.
  • Detail-oriented with the ability to prioritize and work independently.
Responsibilities:
  • Collaborate with stakeholders to understand data requirements and deliver data solutions aligned with business objectives.
  • Analyze data sources and design scalable data pipelines and ETL processes using Python, SQL, and AWS technologies (a minimal sketch follows this list).
  • Develop and maintain data warehouses, optimizing data storage and retrieval.
  • Build and populate schemas, automate reporting processes, and document technical specifications, ETL processes, data mappings, and data dictionaries.
  • Support the Data Science Center of Excellence (DSCOE) in data engineering initiatives.
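By way of illustration of the stack named above (Python, SQL, S3, Redshift), here is a minimal sketch of one S3-to-Redshift ETL step, assuming boto3 and psycopg2 are available; the bucket, table, cluster, and credential names are hypothetical:

```python
# Minimal ETL sketch: read a raw CSV from S3, clean it, stage it back to S3,
# then load it into Redshift with COPY. All names below are hypothetical.
import csv
import io
import os

import boto3
import psycopg2


def extract_transform(raw_rows):
    """Drop rows with missing ids and trim the date column to YYYY-MM-DD."""
    for row in raw_rows:
        if row.get("id"):
            row["event_date"] = row["event_date"][:10]
            yield row


def load_to_redshift(bucket, key, iam_role):
    """Issue a Redshift COPY from the staged CSV."""
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",  # hypothetical cluster
        port=5439,
        dbname="analytics",
        user="etl_user",
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    with conn, conn.cursor() as cur:
        cur.execute(
            f"""
            COPY analytics.fact_events
            FROM 's3://{bucket}/{key}'
            IAM_ROLE '{iam_role}'
            FORMAT AS CSV IGNOREHEADER 1;
            """
        )


def run(bucket, raw_key, staged_key, iam_role):
    s3 = boto3.client("s3")
    raw = s3.get_object(Bucket=bucket, Key=raw_key)["Body"].read().decode("utf-8")
    rows = list(extract_transform(csv.DictReader(io.StringIO(raw))))

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    s3.put_object(Bucket=bucket, Key=staged_key, Body=out.getvalue().encode("utf-8"))

    load_to_redshift(bucket, staged_key, iam_role)
```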

Related Jobs

📍 United States

💸 $115,000 - $130,000 per year

🔍 Advertising management

🏢 Company: Mediavine (👥 101-250; Internet, Advertising, Information Technology)

  • 3+ years of experience in a data engineering role.
  • Strong Python skills (understands tradeoffs, optimization, etc.).
  • Strong SQL skills (CTEs, window functions, optimization).
  • Experience working in cloud environments (AWS preferred, GCS, Azure).
  • An understanding of how to best structure data to enable internal- and external-facing analytics.
  • Familiarity with calling APIs to retrieve data (authentication flows, filters, limits, pagination; see the sketch after this list).
  • Experience working with DevOps to deploy, scale and monitor data infrastructure.
  • Scheduler experience, either traditional or DAG-based.
  • Comfortable working with multi-TB cloud data warehouses (Snowflake preferred, Redshift, BigQuery).
  • Experience with other DBMS systems (Postgres in particular).
  • Nice to haves: experience with web analysis, such as creating data structures that support product funnels, user behavior, and decision-path analysis.
  • Understanding of Snowflake external stages, file formats, and Snowpipe.
  • Experience with orchestration tools particularly across different technologies and stacks.
  • Experience with dbt.
  • Knowledge of Ad Tech, Google Ad Manager and all its fun quirks.
  • Familiarity with event tracking systems (New Relic, Snowplow, etc.).
  • Experience with one or more major BI tools (Domo, Looker, Power BI, etc.).
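
One of the requirements above mentions calling paginated APIs to retrieve data. Here is a minimal cursor-pagination sketch using the requests library; the endpoint, auth scheme, and parameter names are hypothetical assumptions:

```python
# Cursor-based pagination sketch with the requests library.
# The /records endpoint, bearer-token auth, and "cursor"/"limit" parameters
# are illustrative assumptions, not a specific vendor's API.
import requests


def fetch_all(base_url, token, page_size=100):
    """Yield every record from a paginated JSON API, page by page."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {token}"
    cursor = None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(f"{base_url}/records", params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        yield from payload["data"]
        cursor = payload.get("next_cursor")
        if not cursor:  # no more pages to fetch
            break
```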

  • Create data pipelines that make data available for analytic and application use cases.
  • Develop self-healing, resilient processes that do not require constant care and feeding to run smoothly (see the sketch after this list).
  • Create meaningful data quality notifications with clear actions for interested parties, including other internal teams and other members of the data and analytics team.
  • Lead projects from a technical standpoint, creating project Technical Design Documents.
  • Support data analysts' and analytics engineers' ability to meet the needs of the organization.
  • Participate in code reviews, understanding coding standards, ensuring test coverage and being aware of best practices.
  • Build or implement tooling around data quality, governance, and lineage, within the dbt framework and Snowflake but external to them as needed.
  • Provide next-level support when data issues are discovered and communicated by the data analysts.
  • Work with data analysts and analytics engineers to standardize transformation logic in the dbt layer for consistency and ease of exploration by end users.
  • Enable analytics engineers and data analysts by providing data modeling guidance, query optimization and aggregation advice.
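
The responsibilities above call for self-healing, resilient processes. A minimal retry-with-backoff sketch in Python follows; the attempt count, delays, and exception types are illustrative assumptions:

```python
# Retry-with-exponential-backoff wrapper for flaky pipeline steps, in the
# spirit of "self-healing, resilient processes". Values are illustrative.
import functools
import logging
import time

log = logging.getLogger(__name__)


def retry(max_attempts=5, base_delay=2.0, retry_on=(ConnectionError, TimeoutError)):
    """Retry a pipeline step with exponential backoff before giving up."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except retry_on as exc:
                    if attempt == max_attempts:
                        log.error("%s failed after %d attempts", func.__name__, attempt)
                        raise
                    delay = base_delay * 2 ** (attempt - 1)
                    log.warning("%s failed (%s); retrying in %.0fs",
                                func.__name__, exc, delay)
                    time.sleep(delay)
        return wrapper
    return decorator


@retry(max_attempts=3)
def load_partition(partition_date):
    """Hypothetical step: load one day's data into the warehouse."""
    ...
```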

AWS, Python, SQL, Snowflake, Data engineering, Postgres, DevOps

Posted 3 months ago