Data Engineer II

Posted about 1 month ago

πŸ’Ž Seniority level: Middle, 2+ years

πŸ“ Location: United States

πŸ” Industry: Healthcare

🏒 Company: PatientIQ β€’ πŸ‘₯ 51-100 β€’ πŸ’° $20,000,000 Series B over 2 years ago β€’ Big Data, Predictive Analytics, SaaS, Information Technology, Health Care

πŸ—£οΈ Languages: English

⏳ Experience: 2+ years

πŸͺ„ Skills: AWS, Python, SQL, Apache Airflow, ETL, GCP, Git

Requirements:
  • BS/MS in Computer Science, Engineering, Mathematics, or a related field.
  • 2+ years of experience as a Data Engineer.
  • Experience designing, building, and maintaining ETL infrastructure in a production setting.
  • Experience working with large, complex datasets.
  • Deep knowledge of SQL and at least one programming language (e.g., Python, Java, Ruby).
  • Experience with Data Warehousing (Redshift, BigQuery, Snowflake).
  • Experience with cloud technologies such as AWS, Google Cloud Platform, or Azure.
  • Experience with version control systems (e.g., Git) and writing reusable and extensible code.
  • Highly self-motivated with strong analytical problem-solving skills and attention to detail.
Responsibilities:
  • Clean and process large, complex datasets using tools such as SQL and Python.
  • Migrate client data into PatientIQ's platform per established service level agreements (SLA).
  • Develop rigorous data quality assurance checks throughout the ETL process.
  • Design, develop, maintain, and streamline scalable data pipelines to support healthcare data analysis.
  • Work closely with data analysts to understand their data needs and develop solutions to meet those needs.
  • Monitor and optimize the performance of data pipelines and systems.
  • Collaborate with other teams to integrate data from multiple sources.

Related Jobs

πŸ“ United States

πŸ’Έ $115,000 - $130,000 per year

πŸ” Advertising management

🏒 Company: Mediavine β€’ πŸ‘₯ 101-250 β€’ Internet, Advertising, Information Technology

  • 3+ years of experience in a data engineering role.
  • Strong Python skills (Understands tradeoffs, optimization, etc).
  • Strong SQL skills (CTEs, window functions, optimization).
  • Experience working in cloud environments (AWS preferred, GCS, Azure).
  • An understanding of how to best structure data to enable internal and external facing analytics.
  • Familiarity with calling APIs to retrieve data (Authentication flows, filters, limits, pagination).
  • Experience working with DevOps to deploy, scale and monitor data infrastructure.
  • Scheduler experience either traditional or DAG based.
  • Comfortable working with multi-TB cloud data warehouses (Snowflake preferred, Redshift, Big Query).
  • Experience with other DBMS systems (Postgres in particular).
  • Nice to haves: experience with web analysis such as creating data structure that supports product funnels, user behavior, and decision path analysis.
  • Understanding of Snowflake external stages, file formats and snowpipe.
  • Experience with orchestration tools particularly across different technologies and stacks.
  • Experience with dbt.
  • Knowledge of Ad Tech, Google Ad Manager and all its fun quirks.
  • Familiarity with event tracking systems (NewRelic, Snowplow, etc).
  • Experience with one or more major BI tools (Domo, Looker, PowerBI, etc).

  • Create data pipelines that make data available for analytic and application use cases.
  • Develop self-healing, resilient processes that do not require constant care and feeding to run smoothly.
  • Create meaningful data quality notifications with clear actions for interested parties including other internal teams and other members of the data and analytics team.
  • Leading projects from a technical standpoint, creating project Technical Design Documents.
  • Support data analysts and analytics engineers ability to meet the needs of the organization.
  • Participate in code reviews, understanding coding standards, ensuring test coverage and being aware of best practices.
  • Build or implement tooling around data quality, governance and lineage, in the dbt framework and Snowflake but external to that as needed.
  • Provide next level support when data issues are discovered and communicated by the data analysts.
  • Work with data analysts and analytics engineers to standardize transformation logic in the dbt layer for consistency and ease of exploration by end users.
  • Enable analytics engineers and data analysts by providing data modeling guidance, query optimization and aggregation advice.

AWS, Python, SQL, Snowflake, Data engineering, Postgres, DevOps

Posted 3 months ago