
Principal Data Engineer

Posted 6 months ago


💎 Seniority level: Principal, 10+ years

📍 Location: United States

💸 Salary: 210,000 - 220,000 USD per year

🔍 Industry: Healthcare

🗣️ Languages: English

⏳ Experience: 10+ years

🪄 Skills: AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data engineering, Data science, Spark, Communication Skills, C (Programming language)

Requirements:
  • Experienced: 10+ years of experience in data engineering.
  • Technical Expertise: Advanced working knowledge of SQL, relational databases, and big data tools.
  • Architectural Visionary: Experience in service-oriented and event-based architecture.
  • Problem Solver: Ability to manage and optimize processes for data transformation.
  • Collaborative Leader: Strong communication skills and ability to lead cross-functional teams.
  • Project Management: Strong project management and organizational skills.
Responsibilities:
  • Lead the Design and Implementation: Architect and implement cutting-edge data processing platforms and enterprise-wide data solutions.
  • Scale Data Platform: Develop a scalable platform for optimal data extraction, transformation, and loading.
  • AI / ML platform: Design and build scalable AI and ML platforms to support Transcarent use cases.
  • Collaborate Across Teams: Partner with Executive, Product, Clinical, Data, and Design teams.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for performance and reliability (see the streaming sketch after this list).
  • Innovate and Automate: Create and maintain data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide technical leadership and mentorship to the data engineering team.
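
As an illustration of the pipeline work described above (not part of the posting), here is a minimal sketch of a PySpark Structured Streaming job that reads events from Kafka and lands them as Parquet; the broker address, topic, schema, and storage paths are hypothetical.

  # Illustrative sketch only: stream JSON events from a hypothetical Kafka topic
  # into Parquet. Broker, topic, schema, and paths are all made up.
  from pyspark.sql import SparkSession, functions as F
  from pyspark.sql.types import StructType, StructField, StringType, TimestampType

  spark = SparkSession.builder.appName("member-events-stream").getOrCreate()

  event_schema = StructType([
      StructField("member_id", StringType()),
      StructField("event_type", StringType()),
      StructField("occurred_at", TimestampType()),
  ])

  events = (
      spark.readStream.format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
      .option("subscribe", "member-events")              # hypothetical topic
      .load()
      .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
      .select("e.*")
  )

  query = (
      events.writeStream.format("parquet")
      .option("path", "s3a://example-bucket/member_events/")
      .option("checkpointLocation", "s3a://example-bucket/checkpoints/member_events/")
      .outputMode("append")
      .start()
  )
  query.awaitTermination()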

Related Jobs


๐Ÿ“ AL, AR, AZ, CA (exempt only), CO, CT, FL, GA, ID, IL, IN, IA, KS, KY, MA, ME, MD, MI, MN, MO, MT, NC, NE, NJ, NM, NV, NY, OH, OK, OR, PA, SC, SD, TN, TX, UT, VT, VA, WA, and WI

๐Ÿงญ Full-Time

๐Ÿ” Insurance

๐Ÿข Company: Kin Insurance

  • 10+ years of experience in designing & architecting data systems, warehousing and/or ML Ops platforms
  • Proven experience in the design and architecture of large-scale data systems, including lakehouse and machine learning platforms, using modern cloud-based tools
  • Ability to communicate effectively with executives and team members
  • Architect and implement solutions using Databricks for advanced analytics, data processing, and machine learning workloads
  • Expertise in data architecture and design, for both structured and unstructured data. Expertise in data modeling across transactional, BI and DS usage models
  • Fluency with modern cloud data stacks, SaaS solutions, and evolutionary architecture, enabling flexible, scalable, and cost-effective solutions
  • Expertise with all aspects of data management: data governance, data mastering, metadata management, data taxonomies and ontologies
  • Expertise in architecting and delivering highly scalable, flexible, and cost-effective cloud-based enterprise data solutions
  • Proven ability to develop and implement data architecture roadmaps and comprehensive implementation plans that align with business strategies
  • Experience working with data integration tools and Kafka for real-time data streaming to ensure seamless data flow across the organization (see the Kafka consumer sketch after this list)
  • Expertise in dbt for transformations, pipelines, and data modeling
  • Experience with data analytics, BI, data reporting and data visualization
  • Experience with the insurance domain is a plus
  • Lead the overall data architecture design, including ingestion, storage, management, and machine learning engineering platforms.
  • Implement the architecture and create a vision for how data will flow through the organization using a federated approach.
  • Manage data governance across multiple systems: data mastering, metadata management, data definitions, semantic-layer design, data taxonomies, and ontologies.
  • Architect and deliver highly scalable, flexible, and cost-effective enterprise data solutions that support the development of architecture patterns and standards.
  • Help define the technology strategy and roadmaps for the portfolio of data platforms and services across the organization.
  • Ensure data security and compliance, working within industry regulations.
  • Design and document data architecture at multiple levels across conceptual, logical, and physical views.
  • Provide "hands-on" architectural guidance and leadership throughout the entire lifecycle of development projects.
  • Translate business requirements into conceptual and detailed technology solutions that meet organizational goals.
  • Collaborate with other architects, engineering directors, and product managers to align data solutions with business strategy.
  • Lead proof-of-concept projects to test and validate new tools or architectural approaches.
  • Stay current with industry trends, vendor product offerings, and evolving data technologies to ensure the organization leverages the best available tools.
  • Cross-train peers and mentor team members to share knowledge and build internal capabilities.
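
A minimal sketch of the kind of Kafka-based integration mentioned in the requirements above, using the kafka-python client; the topic, broker, and record fields are hypothetical, and the downstream load is only hinted at.

  # Illustrative sketch only (kafka-python): consume events from a hypothetical
  # topic and hand each record to a downstream loader.
  import json
  from kafka import KafkaConsumer

  consumer = KafkaConsumer(
      "policy-events",                        # hypothetical topic
      bootstrap_servers=["broker:9092"],      # hypothetical broker
      group_id="data-platform-loader",
      auto_offset_reset="earliest",
      value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
  )

  for message in consumer:
      record = message.value
      # A real pipeline would write to the lakehouse / warehouse here.
      print(record.get("policy_id"), record.get("event_type"))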

AWS, Leadership, Python, SQL, Apache Airflow, Cloud Computing, ETL, Kafka, Kubernetes, Machine Learning, Snowflake, Algorithms, Data engineering, Data science, Data Structures, REST API, Pandas, Spark, Tensorflow, Communication Skills, Analytical Skills, Collaboration, CI/CD, Problem Solving, Mentoring, Terraform, Microservices, JSON, Data visualization, Data modeling, Data analytics, Data management

Posted 4 days ago

๐Ÿ“ United States, Canada

๐Ÿ” Energy Storage

๐Ÿข Company: Plus Power๐Ÿ‘ฅ 101-250๐Ÿ’ฐ $98,000,000 Debt Financing about 1 year agoRenewable EnergyBatteryEnergy

  • 8 years of professional experience in SQL and Python programming, along with experience in data engineering tools such as Pandas, Polars, PyArrow, and similar tabular data interfaces
  • Proven experience designing and implementing scalable distributed systems for critical applications, with a strong emphasis on data integrity and transactional resilience
  • Demonstrated leadership abilities in technical initiatives, from conception to execution, including effective time management and stakeholder engagement
  • Excellent written and verbal communication skills, with the ability to collaborate effectively with diverse teams, including distributed teams
  • Hands-on experience designing, maintaining, and developing applications using relational database systems (RDBMS), preferably PostgreSQL or its variants such as TimescaleDB
  • Experience designing and supporting business intelligence layers that cater to a wide range of end-users
  • Experience working in an agile environment. Familiarity with tools such as JIRA and Confluence
  • Collaborate with other technical leaders and business stakeholders at Plus Power to architect and implement critical system components and features
  • Learn the data landscape of Plus Power (SCADA site sensor data, operational performance data, data from ISO/RTOs, fundamental research and weather data, other commodities, etc.) and understand their relationships, utility to the business, and potential across internal contexts and teams (see the time-series sketch after this list)
  • Collaboratively build a high-quality, expressive data model that maps our institutional knowledge onto the raw data we collect, giving it comprehensible and actionable meaning
  • Drive architectural design and decision making, partnering with client leadership for knowledge sharing and prioritization
  • Be an implementation champion: hands-on build of critical components or broad-pattern templates; establish reusable solutions to common patterns, guide junior engineers in their application, and evangelize their usage among our more technical clients
  • Establish, implement, and evangelize best practices in data governance, long-term data asset evolution, and platform usage resource tracking
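
A minimal sketch of working with the site time-series data described above, assuming a PostgreSQL/TimescaleDB source and pandas; the connection string, table, and column names are hypothetical.

  # Illustrative sketch only: pull hypothetical SCADA readings from Postgres and
  # resample them to hourly averages per site with pandas.
  import pandas as pd
  from sqlalchemy import create_engine

  engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/plus_power")

  readings = pd.read_sql(
      "SELECT site_id, measured_at, state_of_charge_mwh "
      "FROM scada_readings "
      "WHERE measured_at >= NOW() - INTERVAL '7 days'",
      engine,
      parse_dates=["measured_at"],
  )

  hourly = (
      readings.set_index("measured_at")
      .groupby("site_id")["state_of_charge_mwh"]
      .resample("1h")
      .mean()
      .reset_index()
  )
  print(hourly.head())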

AWS, Leadership, PostgreSQL, Python, SQL, Agile, Apache Airflow, Jira, Algorithms, Data engineering, Data Structures, RDBMS, REST API, Pandas, Communication Skills, CI/CD, Problem Solving, Terraform, Teamwork, Data visualization, Data modeling, Data analytics, Data management, Confluence

Posted 12 days ago

๐Ÿ“ United States, Canada

๐Ÿงญ Full-Time

๐Ÿ” Healthcare

  • 8+ years of experience in data engineering
  • Experience with Python and NoSQL databases
  • Knowledge of AWS and cloud-based data pipelines
  • Deep understanding of data governance and security
  • Experience with Spark and Kafka
  • Architect and develop scalable data pipelines
  • Collaborate with data scientists and product teams
  • Design and optimize data models and databases
  • Implement best practices for data quality and security
  • Mentor junior team members

AWS, PostgreSQL, Python, ETL, Kafka, NoSQL, Spark, Terraform

Posted 14 days ago

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 142771.0 - 225000.0 USD per year

๐Ÿ” Media and Analytics

  • Master's degree in Computer Science, Data Science, engineering, mathematics, or a related quantitative field plus 3 years of experience in analytics software solutions.
  • Bachelor's degree in similar fields plus 5 years of experience is also acceptable.
  • 3 years of experience with Python and associated technologies, including Spark, AWS, S3, Java, JavaScript, and Adobe Analytics.
  • Proficiency in SQL for querying and managing data.
  • Experience in analytics programming languages such as Python (with Pandas).
  • Experience in handling large volumes of data and code management tools like Git.
  • 2 years of experience managing computer program orchestrations and using open-source management platforms like Airflow.
  • Develop, test, and orchestrate econometric, statistical, and machine learning modules.
  • Conduct unit, integration, and regression testing.
  • Create data processing systems for analytic research and development.
  • Design, document, and present process flows for analytical systems.
  • Partner with Software Engineering for cloud-based solutions.
  • Orchestrate modules via directed acyclic graphs using workflow management systems (see the DAG sketch after this list).
  • Work in an agile development environment.
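
A minimal sketch of the DAG-based orchestration referenced above, assuming Airflow 2.x; the DAG name, schedule, and task bodies are hypothetical placeholders.

  # Illustrative sketch only: two hypothetical pipeline steps orchestrated as an
  # Airflow DAG, with the model fit depending on the feature extraction.
  from datetime import datetime
  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract_features():
      print("pulling model inputs")        # placeholder for real extraction logic

  def fit_model():
      print("fitting statistical model")   # placeholder for real estimation logic

  with DAG(
      dag_id="analytics_model_pipeline",   # hypothetical DAG name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      extract = PythonOperator(task_id="extract_features", python_callable=extract_features)
      fit = PythonOperator(task_id="fit_model", python_callable=fit_model)
      extract >> fit   # fit_model runs only after extract_features succeeds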

AWS, Python, SQL, Apache Airflow, Git, Machine Learning, Data engineering, Regression testing, Pandas, Spark

Posted about 1 month ago

๐Ÿ“ Texas, Maryland, Pennsylvania, Minnesota, Florida, Georgia, Illinois

๐Ÿ” Ecommerce, collectible card games

๐Ÿข Company: TCGPlayer_External_Career

  • Bachelorโ€™s degree in computer science, information technology, or related field, or equivalent experience.
  • 12 or more years of experience designing scalable and reliable datastores.
  • Mastery of MongoDB data modeling and query design, with significant experience in RDBMS technologies, preferably PostgreSQL (see the MongoDB sketch after this list).
  • Experience designing datastores for microservices and event-driven applications.
  • Experience with data governance support in medium-to-large organizations.
  • Strong written and verbal communication skills for collaboration across roles.
  • Act as a subject matter expert for MongoDB, providing guidance and materials to improve proficiency.
  • Guide selection of datastore technologies for applications to meet data needs.
  • Consult on database design for performance and scalability.
  • Write effective code for data management.
  • Support engineers with database interface advice.
  • Develop data flow strategies and define storage requirements for microservices.
  • Troubleshoot and enhance existing database designs.
  • Collaborate to ensure data architectures are efficient and scalable.
  • Lead cross-application datastore projects related to security and data governance.
  • Research emerging datastore capabilities for strategic planning.
  • Define and implement data storage strategies for microservices.
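
A minimal sketch of the MongoDB data-modeling work described above, using PyMongo; the collection, document shape, and index are hypothetical and simply show one embedded-document design matched to a query pattern.

  # Illustrative sketch only (PyMongo): an embedded-document model for a
  # hypothetical listings collection, plus an index for a common lookup.
  from pymongo import MongoClient, ASCENDING

  client = MongoClient("mongodb://localhost:27017")   # hypothetical connection
  listings = client["marketplace"]["listings"]

  # Price history is embedded in the listing document instead of a separate table.
  listings.insert_one({
      "card_id": "sw-001",
      "seller_id": "seller-42",
      "condition": "near_mint",
      "price_history": [
          {"price": 4.99, "changed_at": "2024-01-01"},
          {"price": 5.49, "changed_at": "2024-02-01"},
      ],
  })

  # Compound index matching the dominant access pattern: card + condition.
  listings.create_index([("card_id", ASCENDING), ("condition", ASCENDING)])

  for doc in listings.find({"card_id": "sw-001", "condition": "near_mint"}):
      print(doc["seller_id"], doc["price_history"][-1]["price"])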

PostgreSQL, MongoDB, Data engineering, Microservices, Data modeling

Posted about 2 months ago

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 185000.0 - 215000.0 USD per year

๐Ÿ” Media & Entertainment

  • 7+ years of experience in data engineering or software development
  • Expertise in AWS and distributed data technologies (e.g., Spark, Hadoop)
  • Proficiency with programming languages such as Python, PySpark, and JavaScript
  • Experience with large-scale data pipeline design and streaming data
  • Hands-on experience with AWS services like Lambda and S3 (see the Lambda sketch after this list)
  • Strong SQL and data querying skills
  • Architect, design, and oversee development of data pipelines
  • Design and build analytics pipelines and business intelligence solutions
  • Work closely with cross-functional teams on data integration
  • Provide leadership and mentorship to data engineering team
  • Ensure compliance with best practices in data governance and quality
  • Drive data scalability and performance across cloud platforms
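
A minimal sketch of the Lambda-and-S3 style pipeline step mentioned above, using boto3; the bucket names, key layout, and transformation are hypothetical.

  # Illustrative sketch only: a Lambda handler that reacts to an S3 upload,
  # applies a trivial filter, and writes the result to a curated bucket.
  import json
  import boto3

  s3 = boto3.client("s3")

  def handler(event, context):
      for record in event["Records"]:
          bucket = record["s3"]["bucket"]["name"]
          key = record["s3"]["object"]["key"]

          rows = json.loads(s3.get_object(Bucket=bucket, Key=key)["Body"].read())

          # Placeholder transformation: keep only rows with a positive view count.
          cleaned = [row for row in rows if row.get("views", 0) > 0]

          s3.put_object(
              Bucket="example-analytics-curated",   # hypothetical target bucket
              Key=f"curated/{key}",
              Body=json.dumps(cleaned).encode("utf-8"),
          )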

AWS, Python, SQL, Apache Airflow, Cloud Computing, Apache Kafka, Data engineering

Posted 2 months ago

๐Ÿ“ Sioux Falls, SD, Scottsdale, AZ, Troy, MI, Franklin, TN, Dallas, TX

๐Ÿงญ Full-Time

๐Ÿ’ธ 99501.91 - 183764.31 USD per year

๐Ÿ” Financial Services

๐Ÿข Company: Pathward, N.A.

  • Bachelorโ€™s degree or equivalent experience.
  • 10+ years delivering scalable, secure, and highly available technical data solutions.
  • 5+ years of experience designing and building Data Engineering pipelines with industry leading technologies such as Talend, Informatica, etc.
  • Extensive SQL experience.
  • Experience with ELT processes for transformation push-down and data replication using tools such as Matillion (see the Snowflake ELT sketch after this list).
  • Experience with Data Visualization tools such as PowerBI, ThoughtSpot, or others.
  • Experience with Python.
  • Experience with SQL Server, SSIS, and NoSQL databases preferred.
  • 3+ years of complex enterprise experience leading distributed Data Engineering teams.
  • 3+ years of experience in cloud data environments leveraging technologies such as Snowflake and AWS.
  • 3+ years of banking or financial services and products domain experience.
  • Strong data architecture, critical thinking, and problem-solving abilities.
  • Ability to communicate effectively with all levels in the company.
  • Leads a Data Engineering team with responsibility for planning, prioritization, architectural & business alignment, quality, and value delivery.
  • Develops flexible, maintainable, and reusable Data Engineering solutions using standards, best practices, and frameworks with a focus on efficiency and innovation.
  • Solves complex development problems leveraging good design and practical experience.
  • Continually looks at ways to improve existing systems, processes, and performance.
  • Leads and participates in planning and feature/user story analysis by providing feedback and demonstrating an understanding of business needs.
  • Solves business problems by implementing technical solutions based on solid design principles and best practices.
  • Identifies and contributes to opportunities for departmental & team improvements.
  • Documents software, best practices, standards, and frameworks.
  • Mentors staff by providing technical advice, helping with issue resolution, and providing feedback.
  • Contributes as a subject matter expert in technical areas.
  • Keeps up to date with current and future changes in tools, technology, best practices, and industry standards through training and development opportunities.
  • Identifies and leads learning and training opportunities for other team members and staff.
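
A minimal sketch of the ELT push-down pattern referenced above, using the Snowflake Python connector; the account, warehouse, and table names are hypothetical, and the point is simply that the aggregation runs inside Snowflake rather than in the client.

  # Illustrative sketch only: push a daily aggregation down to Snowflake (ELT)
  # instead of pulling rows into the client. All identifiers are made up.
  import snowflake.connector

  conn = snowflake.connector.connect(
      account="example_account",
      user="ETL_SERVICE",
      password="********",
      warehouse="TRANSFORM_WH",
      database="ANALYTICS",
      schema="CURATED",
  )

  cur = conn.cursor()
  try:
      cur.execute("""
          CREATE OR REPLACE TABLE daily_account_balances AS
          SELECT account_id,
                 DATE_TRUNC('day', posted_at) AS balance_date,
                 SUM(amount)                  AS net_amount
          FROM RAW.TRANSACTIONS
          GROUP BY account_id, DATE_TRUNC('day', posted_at)
      """)
  finally:
      cur.close()
      conn.close()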

AWS, Python, SQL, Snowflake, Data engineering

Posted 2 months ago

๐Ÿ“ United States

๐Ÿ’ธ 210000 - 220000 USD per year

๐Ÿ” Healthcare

๐Ÿข Company: Transcarent๐Ÿ‘ฅ 251-500๐Ÿ’ฐ $126,000,000 Series D 10 months agoPersonal HealthHealth CareSoftware

  • Experienced: 10+ years in data engineering with a strong background in building and scaling data architectures.
  • Technical Expertise: Advanced working knowledge of SQL, relational databases, big data tools (e.g., Spark, Kafka), and cloud-based data warehousing (e.g., Snowflake).
  • Architectural Visionary: Experience in service-oriented and event-based architecture with strong API development skills.
  • Problem Solver: Ability to manage and optimize processes for data transformation, metadata management, and workload management.
  • Collaborative Leader: Strong communication skills to present ideas clearly and lead cross-functional teams.
  • Project Management: Strong organizational skills, capable of leading multiple projects simultaneously.
  • Lead the Design and Implementation: Architect and implement cutting-edge data processing platforms and enterprise-wide data solutions using modern data architecture principles.
  • Scale Data Platform: Develop a scalable platform for data extraction, transformation, and loading from various sources, ensuring data integrity and accessibility.
  • AI / ML platform: Design and build scalable AI and ML platforms for Transcarent use cases.
  • Collaborate Across Teams: Work with Executive, Product, Clinical, Data, and Design teams to meet their data infrastructure needs.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for high performance and reliability.
  • Innovate and Automate: Create and maintain data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide leadership and mentorship to the data engineering team.

AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data engineering, Spark, Communication Skills, Problem Solving, Organizational skills

Posted 3 months ago