Principal Data Engineer

Posted 2024-09-02

💎 Seniority level: Principal, 10+ years

📍 Location: United States

💸 Salary: $210,000 - $220,000 per year

🔍 Industry: Healthcare

🗣️ Languages: English

⏳ Experience: 10+ years

🪄 Skills: AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data engineering, Data science, Spark, Communication Skills, C (Programming language)

Requirements:
  • Experienced: 10+ years of experience in data engineering.
  • Technical Expertise: Advanced working knowledge of SQL, relational databases, and big data tools.
  • Architectural Visionary: Experience in service-oriented and event-based architecture.
  • Problem Solver: Ability to manage and optimize processes for data transformation.
  • Collaborative Leader: Strong communication skills and ability to lead cross-functional teams.
  • Project Management: Strong project management and organizational skills.
Responsibilities:
  • Lead the Design and Implementation: Architect and implement cutting-edge data processing platforms and enterprise-wide data solutions.
  • Scale Data Platform: Develop a scalable platform for optimal data extraction, transformation, and loading.
  • AI/ML Platform: Design and build scalable AI and ML platforms to support Transcarent's use cases.
  • Collaborate Across Teams: Partner with Executive, Product, Clinical, Data, and Design teams.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for performance and reliability.
  • Innovate and Automate: Create and maintain data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide technical leadership and mentorship to the data engineering team.

Related Jobs

📍 United States

🧭 Full-Time

💸 $210,000 - $220,000 per year

🔍 Healthcare

  • 10+ years of experience in data engineering with a strong background in building and scaling data architectures.
  • Advanced knowledge of SQL, relational databases, and big data tools like Spark and Kafka.
  • Proficient in cloud-based data warehousing and services, especially Snowflake and AWS.
  • Understanding of AI/ML workflows.
  • Experience in service-oriented and event-based architecture with strong API development skills.
  • Strong communication and project management skills.

  • Lead the design and implementation of data platforms using modern data architecture principles.
  • Scale the data platform for optimal data extraction, transformation, and loading.
  • Design and build scalable AI and ML platforms.
  • Collaborate with Executive, Product, Clinical, Data, and Design teams.
  • Build and optimize complex data pipelines.
  • Create and maintain data tools and pipelines for analytics and data innovation.
  • Provide technical leadership and mentorship to the data engineering team.

AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data engineering, Spark, Communication Skills, Problem Solving, Organizational skills

Posted 2024-11-15

📍 AZ, CA, CO, FL, GA, KS, KY, IA, ID, IL, IN, MA, ME, MI, MN, MO, NC, NH, NJ, NV, NY, OH, OR, PA, RI, SC, SD, TN, TX, UT, VA, VT, WA, WI

🧭 Full-Time

💸 $201,601 - $336,001 per year

🔍 Logistics and E-commerce

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 12+ years of experience in data architecture, data engineering, or similar roles, with a focus on cloud technologies and data platforms.
  • Proven experience in designing and scaling highly available data platforms, including data warehouses, data lakes, operational reporting, data science, and integrated data stores.
  • Strong expertise in the modern data stack: cloud data technologies (Databricks, Snowflake), data storage formats (Parquet, etc.), and query engines.
  • Extensive experience in data modeling, ETL processes, and both batch and real-time analytics frameworks.
  • Familiarity with modernizing data architectures and managing complex data migrations.
  • Proficiency in SQL and proven experience with programming languages such as Python, Java, and Scala, as well as tools like Airflow, dbt, and CI/CD tooling (Git, Jenkins, Terraform).
  • Experience with streaming platforms like Kafka and integrating with BI tools like PowerBI, Tableau, or Looker.
  • Advanced analytical and problem-solving skills, with the ability to conduct complex data analysis and present findings in a clear, actionable way.
  • Strong communication and collaboration skills, with the ability to explain complex technical concepts to non-technical stakeholders.
  • Strong management experience, particularly in managing and mentoring technical teams, and driving large-scale, cross-functional data initiatives.
  • Ability to work in a fast-paced, dynamic environment, managing multiple projects effectively and meeting deadlines.
  • A mindset focused on innovation and continuous improvement, with a track record of initiating and running projects that have significantly improved data processes or outcomes.

  • Play a key role in defining and executing the data strategy for the organization, driving innovation, and influencing business outcomes through advanced data solutions.
  • Architect, design, and implement scalable data solutions that support diverse business needs across the organization.
  • Take full ownership of the entire data pipeline, from data ingestion and processing to storage, management, and delivery, ensuring seamless integration and high data quality.
  • Stay ahead of industry trends and emerging technologies, researching and experimenting with new tools, methodologies, and platforms to continually enhance the data infrastructure.
  • Drive the adoption of cutting-edge technologies that align with the company’s strategic goals.
  • Leverage cloud platforms to ensure the scalability, performance, and reliability of our data infrastructure.
  • Optimize databases through indexing, query optimization, and schema design, ensuring high availability and minimal downtime.
  • Lead the modernization of existing data architectures, implementing advanced technologies and methodologies.
  • Plan and execute complex migrations to cutting-edge platforms, ensuring minimal disruption and high data integrity.
  • Work closely with various departments, including Data Science, Analytics, Product, Compliance, and Engineering, to design cohesive data models, schemas, and storage solutions.
  • Translate business needs into technical specifications, driving projects that enhance customer experiences and outcomes.
  • Create and set best practices for data ingestion, integration, and access patterns to support both real-time and batch-based consumer data needs.
  • Implement and maintain robust data governance practices to ensure compliance with data security and privacy regulations.
  • Establish and enforce standards for data ingestion, integration, processing, and delivery to ensure data quality and consistency.
  • Mentor and guide junior team members, fostering a culture of continuous improvement and technical excellence.
  • Develop detailed documentation to capture design specifications and processes, ensuring ongoing maintenance is supported.
  • Manage design reviews to verify that proposed solutions effectively resolve platform and stakeholder obstacles while adhering to business and technical requirements.
  • Craft and deliver clear communications that articulate the architectural strategy and its alignment with the company's goals.

Python, SQL, Data Analysis, ETL, Git, Java, Jenkins, Kafka, Snowflake, Tableau, Strategy, Airflow, Data engineering, Data science, Collaboration, CI/CD, Problem Solving, Mentoring, Terraform

Posted 2024-09-28

📍 Sioux Falls, SD; Scottsdale, AZ; Troy, MI; Franklin, TN; Dallas, TX

💸 $99,501.91 - $183,764.31 per year

🔍 Financial services

🏢 Company: Pathward, N.A.

  • Bachelor’s degree or equivalent experience
  • 10+ years delivering scalable, secure, and highly available technical data solutions
  • 5+ years of experience designing and building Data Engineering pipelines with industry-leading technologies such as Talend and Informatica
  • Extensive SQL experience
  • Experience with ELT processes for transformation push-down and data replication using tools such as Matillion
  • Experience with Data Visualization tools such as PowerBI, ThoughtSpot, or others
  • Experience with Python
  • Experience with SQL Server, SSIS, and NoSQL databases preferred
  • 3+ years of complex enterprise experience leading distributed Data Engineering teams
  • 3+ years of experience in cloud data environments leveraging technologies such as Snowflake and AWS
  • 3+ years of banking or financial services and products domain experience.
  • Multiple successes driving Data Engineering in an agile environment (preferably with Kanban)
  • Prior experience designing and/or optimizing processes to ensure operational excellence
  • Experience with automated testing capabilities with Data Engineering preferred
  • Ability to communicate effectively with all levels in the company; this includes written and verbal communications as well as visualizations
  • Ability to collaborate in a team setting and work towards a common goal
  • Ability to influence stakeholders to consider alternate viewpoints
  • Ability to mentor and coach technical resources on Data Engineering guidelines and principles
  • Strong data architecture, critical thinking, and problem-solving abilities, along with an ability to handle ambiguous and evolving requirements
  • Ability to navigate and work effectively across the organization to influence, inspire, and lead others around decisions that propel the organization towards set vision and goals

  • Leads a Data Engineering team with responsibility for planning, prioritization, architectural & business alignment, quality, and value delivery.
  • Develops flexible, maintainable, and reusable Data Engineering solutions using standards, best practices, and frameworks with a focus on efficiency and innovation.
  • Solves complex development problems leveraging good design and practical experience.
  • Continually looks at ways to improve existing systems, processes, and performance.
  • Leads and participates in planning and feature/user story analysis by providing feedback and demonstrating an understanding of business needs.
  • Solves business problems by implementing technical solutions based on solid design principles and best practices.
  • Identifies and contributes to opportunities for departmental & team improvements.
  • Documents software, best practices, standards, and frameworks.
  • Mentors staff by providing technical advice, helping with issue resolution, and providing feedback.
  • Contributes as a subject matter expert in technical areas.
  • Keeps up to date with current and future changes in tools, technology, best practices, and industry standards through training and development opportunities.
  • Identifies and leads learning and training opportunities for other team members and staff.

AWS, Leadership, Python, SQL, Agile, Snowflake, Data engineering

Posted 2024-09-18

📍 United States

🧭 Full-Time

💸 $163,282 - $192,262 per year

🔍 Software/Data Visualization

🏢 Company: Hypergiant — 💰 Corporate round on 2019-06-05 — Artificial Intelligence (AI), Machine Learning, Information Technology, Military

  • 15+ years of professional software development or data engineering experience (12+ with a STEM B.S. or 10+ with a relevant Master's degree).
  • Strong proficiency in Python and familiarity with Java and Bash scripting.
  • Hands-on experience implementing database technologies, messaging systems, and stream computing software (e.g., PostgreSQL, PostGIS, MongoDB, DuckDB, ksqlDB, RabbitMQ).
  • Experience with data fabric development using publish-subscribe models (e.g., Apache NiFi, Apache Pulsar, Apache Kafka, and Kafka-based data service architectures).
  • Proficiency with containerization technologies (e.g., Docker, Docker Compose, RKE2, Kubernetes, and MicroK8s).
  • Experience with version control systems (e.g., Git), CI/CD tools (e.g., Jenkins), and collaborative development workflows.
  • Strong knowledge of data modeling and database optimization techniques.
  • Familiarity with data serialization languages (e.g., JSON, GeoJSON, YAML, XML).
  • Excellent problem-solving and analytical skills, applied to high-visibility, important data engineering projects.
  • Strong communication skills and ability to lead the work of other engineers in a collaborative environment.
  • Demonstrated experience in coordinating team activities, setting priorities, and managing tasks to ensure balanced workloads and effective team performance.
  • Experience managing and mentoring development teams in an Agile environment.
  • Ability to make effective architecture decisions and document them clearly.
  • Must be a US Citizen and eligible to obtain and maintain a US Security Clearance.

  • Develop and continuously improve a data service that underpins cloud-based applications.
  • Support data and database modeling efforts.
  • Contribute to the development and maintenance of reusable component libraries and shared codebase.
  • Participate in the entire software development lifecycle, including requirement gathering, design, development, testing, and deployment, using an agile, iterative process.
  • Collaborate with developers, designers, testers, project managers, product owners, and project sponsors to integrate the data service with end-user applications.
  • Communicate tasking estimation and progress regularly to a development lead and product owner through appropriate tools.
  • Ensure seamless integration between database and messaging systems and the frontend/UI they support.
  • Ensure data quality, reliability, and performance through code reviews and effective testing strategies.
  • Write high-quality code, applying best practices, coding standards, and design patterns.
  • Team with other developers, fostering a culture of continuous learning and professional growth.

Docker, Leadership, PostgreSQL, Python, Software Development, Agile, Bash, Design Patterns, Git, Java, Jenkins, Kafka, Kubernetes, MongoDB, RabbitMQ, Apache Kafka, Data engineering, Data science, Communication Skills, Analytical Skills, CI/CD

Posted 2024-09-11