
Staff Data Engineer (Remote)

Posted about 1 month ago


💎 Seniority level: Staff, 7+ years

📍 Location: United States

🔍 Industry: Medical Technology

🏢 Company: external_career_site_usa

🗣️ Languages: English

⏳ Experience: 7+ years

🪄 Skills: Python, SQL, ETL, Java, Kafka, Snowflake, C#, Tableau, Azure, Data engineering, CI/CD, RESTful APIs, Terraform, Microservices, Data visualization, Data modeling

Requirements:
  • 7+ years of experience programming in SQL
  • 5+ years programming in Snowflake
  • 3+ years working with Fivetran
  • 3+ years of experience in ETL/data movement including both batch and real-time data transmission using dbt
  • 5+ years of experience in data modeling, including normalization and dimensional modeling
  • 5+ years of experience optimizing performance in ETL and reporting layers
  • 3+ years of hands-on development and deployment experience with Azure cloud using .NET, T-SQL, Azure SQL, Azure Storage, Azure Data Factory, Cosmos DB, GitHub, Azure DevOps, and CI/CD pipelines
  • 3+ years of experience with APIs and microservices
  • 3+ years of experience programming in Python/Java/C#
Responsibilities:
  • Build next-generation distributed streaming and data pipelines and analytics data stores using streaming frameworks (Kafka, Spark Streaming, etc.), programming languages such as Python, and ELT tools such as Fivetran and dbt (see the sketch after this list)
  • Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance.
  • Develop and deploy storage models for structured, semi-structured, and unstructured data, such as data lakes and dimensional models
  • Evaluate and define functional requirements for data and analytical solutions
  • Monitor system performance, identify issues related to performance, scalability, and reliability, and design solutions to remediate
  • Partner with IT, Data Governance, and subject matter experts to create, automate, and deploy effective data quality monitors.
  • Develop and maintain enterprise data dictionary of data warehouse transformations and business rules
  • Ensure healthcare information security best-practice controls are in place and adhere to HIPAA, using a common control framework (e.g., NIST, HITRUST)
  • Implement master data management (MDM) solutions to collect, maintain, and leverage data for common business entities
  • Implement metadata management solutions to collect, maintain, and leverage application and system metadata
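
For illustration only: a minimal sketch of the kind of streaming ingest described in the first responsibility, assuming the kafka-python client and a hypothetical "events" topic; the topic name, broker address, and load step are placeholders rather than details from this posting.

# Illustrative only: minimal streaming-ingest sketch using the kafka-python client.
# The topic, broker address, and load step are hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                          # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # In a real pipeline this is where records would be validated and
    # written to the analytics store (e.g. a staging table in the warehouse).
    print(record)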

Related Jobs


📍 United States

🧭 Full-Time

💸 160,000 - 210,000 USD per year

🔍 Software Development

🏢 Company: Jobgether (👥 11-50, 💰 $1,493,585 Seed round about 2 years ago, Internet)

Requirements:
  • 6–10 years of experience in data engineering, data analysis, or software development roles.
  • Proficient in SQL and NoSQL databases; strong hands-on experience with Spark/Databricks.
  • Skilled in modern data orchestration tools such as Airflow and cloud services like AWS (EC2, S3, Lambda, RDS, IAM).
  • Solid experience with web frameworks (Vue.js, React) and RESTful service development.
  • Familiarity with serverless architectures and messaging systems (SQS, SNS).
  • Bachelor’s degree in Computer Science or related discipline.
  • Demonstrated ability to manage complex codebases and deliver projects from design to deployment.
Responsibilities:
  • Design and implement scalable data pipelines to support integration across internal systems and third-party platforms (a minimal orchestration sketch follows this list).
  • Build and maintain data lake solutions using technologies such as Spark and Databricks.
  • Develop and manage cloud-based services and web applications to support data access and processing.
  • Ensure high data quality and availability standards across the enterprise.
  • Collaborate with engineering, analytics, and AI teams to understand data needs and deliver performant solutions.
  • Lead or contribute to architectural decisions and technical mentorship within the data team.
  • Implement DevOps and agile best practices across the development lifecycle.
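
For illustration only: a minimal Airflow DAG sketching the kind of orchestrated extract-transform-load pipeline described above; the dag_id, schedule, and task bodies are hypothetical placeholders, not part of this posting.

# Illustrative only: minimal Airflow DAG with a daily extract -> transform -> load flow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from a source system")


def transform():
    print("clean and reshape the extracted data")


def load():
    print("write results to the data lake / warehouse")


with DAG(
    dag_id="example_daily_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) \
        >> PythonOperator(task_id="transform", python_callable=transform) \
        >> PythonOperator(task_id="load", python_callable=load)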

AWS, SQL, Cloud Computing, Data Analysis, ETL, Vue.js, Airflow, Data engineering, Serverless, NoSQL, React, Spark, RESTful APIs, DevOps

Posted 24 days ago

📍 Paris, New York, San Francisco, Sydney, Madrid, London, Berlin

🔍 Communication technology

Requirements:
  • Passionate about data engineering.
  • Experience in designing and developing data infrastructure.
  • Technical skills to solve complex challenges.
Responsibilities:
  • Play a crucial role in designing, developing, and maintaining data infrastructure.
  • Collaborate with teams across the company to solve complex challenges.
  • Improve operational efficiency and lead business towards strategic goals.
  • Contribute to engineering efforts that enhance customer journey.

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Data engineering

Posted 6 months ago