Principal Data Engineer

Posted 5 months ago

💎 Seniority level: Principal, 12+ years

📍 Location: AZ, CA, CO, FL, GA, KS, KY, IA, ID, IL, IN, MA, ME, MI, MN, MO, NC, NH, NJ, NV, NY, OH, OR, PA, RI, SC, SD, TN, TX, UT, VA, VT, WA, WI

💸 Salary: $201,601 - $336,001 per year

🔍 Industry: Logistics and E-commerce

🗣️ Languages: English

⏳ Experience: 12+ years

🪄 Skills: Python, SQL, Data Analysis, ETL, Git, Java, Jenkins, Kafka, Snowflake, Tableau, Strategy, Airflow, Data engineering, Data science, Collaboration, CI/CD, Problem Solving, Mentoring, Terraform, Scala, Data modeling

Requirements:
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 12+ years of experience in data architecture, data engineering, or similar roles, with a focus on cloud technologies and data platforms.
  • Proven experience in designing and scaling highly available data platforms, including data warehouses, data lakes, operational reporting, data science, and integrated data stores.
  • Strong expertise in modern data stack: Cloud data technologies (Databricks, Snowflake), data storage formats (Parquet, etc.), and query engines.
  • Extensive experience in data modeling, ETL processes, and both batch and real-time analytics frameworks.
  • Familiarity with modernizing data architectures and managing complex data migrations.
  • Proficiency in SQL, proven experience with programming languages such as Python, Java, and Scala, and hands-on use of tools like Airflow and dbt as well as CI/CD tools (Git, Jenkins, Terraform); a minimal Airflow sketch follows this list.
  • Experience with streaming platforms like Kafka and integrating with BI tools like PowerBI, Tableau, or Looker.
  • Advanced analytical and problem-solving skills, with the ability to conduct complex data analysis and present findings in a clear, actionable way.
  • Strong communication and collaboration skills, with the ability to explain complex technical concepts to non-technical stakeholders.
  • Strong management experience, particularly in managing and mentoring technical teams, and driving large-scale, cross-functional data initiatives.
  • Ability to work in a fast-paced, dynamic environment, managing multiple projects effectively and meeting deadlines.
  • A mindset focused on innovation and continuous improvement, with a track record of initiating and running projects that have significantly improved data processes or outcomes.
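For orientation, here is a minimal sketch of the kind of Airflow-plus-dbt orchestration the tooling requirement above refers to. The DAG id, schedule, ingestion script, and dbt project path are all illustrative assumptions, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative only: the DAG id, schedule, and paths below are assumptions.
with DAG(
    dag_id="daily_warehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="python /opt/pipelines/ingest.py",  # hypothetical ingestion script
    )
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt_project",  # hypothetical dbt project
    )
    ingest >> transform  # run dbt transformations only after ingestion succeeds
```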
Responsibilities:
  • Play a key role in defining and executing the data strategy for the organization, driving innovation, and influencing business outcomes through advanced data solutions.
  • Architect, design, and implement scalable data solutions that support diverse business needs across the organization.
  • Take full ownership of the entire data pipeline, from data ingestion and processing to storage, management, and delivery, ensuring seamless integration and high data quality.
  • Stay ahead of industry trends and emerging technologies, researching and experimenting with new tools, methodologies, and platforms to continually enhance the data infrastructure.
  • Drive the adoption of cutting-edge technologies that align with the company’s strategic goals.
  • Leverage cloud platforms to ensure the scalability, performance, and reliability of our data infrastructure.
  • Optimize databases through indexing, query optimization, and schema design, ensuring high availability and minimal downtime (see the indexing sketch after this list).
  • Lead the modernization of existing data architectures, implementing advanced technologies and methodologies.
  • Plan and execute complex migrations to cutting-edge platforms, ensuring minimal disruption and high data integrity.
  • Work closely with various departments, including Data Science, Analytics, Product, Compliance, and Engineering, to design cohesive data models, schemas, and storage solutions.
  • Translate business needs into technical specifications, driving projects that enhance customer experiences and outcomes.
  • Create and set best practices for data ingestion, integration, and access patterns to support both real-time and batch-based consumer data needs.
  • Implement and maintain robust data governance practices to ensure compliance with data security and privacy regulations.
  • Establish and enforce standards for data ingestion, integration, processing, and delivery to ensure data quality and consistency.
  • Mentor and guide junior team members, fostering a culture of continuous improvement and technical excellence.
  • Develop detailed documentation to capture design specifications and processes, ensuring ongoing maintenance is supported.
  • Manage design reviews to verify that proposed solutions effectively resolve platform and stakeholder obstacles while adhering to business and technical requirements.
  • Craft and deliver clear communications that articulate the architectural strategy and its alignment with the company's goals.
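To make the database-optimization responsibility concrete, here is a toy indexing sketch. SQLite stands in for a production database, and the table, columns, and data are invented.

```python
import sqlite3

# In-memory SQLite database as a stand-in for a production system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(10_000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index the planner has to scan the whole table.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())

# An index on the filter column lets the planner switch to an index lookup.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())
```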

Related Jobs

🔥 Principal Data Engineer
Posted about 1 hour ago

📍 AL, AR, AZ, CA (exempt only), CO, CT, FL, GA, ID, IL, IN, IA, KS, KY, MA, ME, MD, MI, MN, MO, MT, NC, NE, NJ, NM, NV, NY, OH, OK, OR, PA, SC, SD, TN, TX, UT, VT, VA, WA, and WI

🧭 Full-Time

🔍 Insurance

🏢 Company: Kin Insurance

  • 10+ years of experience in designing & architecting data systems, warehousing and/or ML Ops platforms
  • Proven experience in the design and architecture of large-scale data systems, including lakehouse and machine learning platforms, using modern cloud-based tools
  • Ability to communicate effectively with executives and team members
  • Architect and implement solutions using Databricks for advanced analytics, data processing, and machine learning workloads
  • Expertise in data architecture and design for both structured and unstructured data, and in data modeling across transactional, BI, and data science usage models
  • Fluency with modern cloud data stacks, SaaS solutions, and evolutionary architecture, enabling flexible, scalable, and cost-effective solutions
  • Expertise with all aspects of data management: data governance, data mastering, metadata management, data taxonomies and ontologies
  • Expertise in architecting and delivering highly-scalable and flexible, cost effective, cloud-based enterprise data solutions
  • Proven ability to develop and implement data architecture roadmaps and comprehensive implementation plans that align with business strategies
  • Experience working with data integration tools and Kafka for real-time data streaming to ensure seamless data flow across the organization (see the Kafka sketch after this list)
  • Expertise in dbt for transformations, pipelines, and data modeling
  • Experience with data analytics, BI, data reporting and data visualization
  • Experience with the insurance domain is a plus
  • Lead the overall data architecture design, including ingestion, storage, management, and machine learning engineering platforms.
  • Implement the architecture and create a vision for how data will flow through the organization using a federated approach.
  • Manage data governance across multiple systems: data mastering, metadata management, data definitions, semantic-layer design, data taxonomies, and ontologies.
  • Architect and deliver highly scalable, flexible, and cost-effective enterprise data solutions that support the development of architecture patterns and standards.
  • Help define the technology strategy and roadmaps for the portfolio of data platforms and services across the organization.
  • Ensure data security and compliance, working within industry regulations.
  • Design and document data architecture at multiple levels across conceptual, logical, and physical views.
  • Provide “hands-on” architectural guidance and leadership throughout the entire lifecycle of development projects.
  • Translate business requirements into conceptual and detailed technology solutions that meet organizational goals.
  • Collaborate with other architects, engineering directors, and product managers to align data solutions with business strategy.
  • Lead proof-of-concept projects to test and validate new tools or architectural approaches.
  • Stay current with industry trends, vendor product offerings, and evolving data technologies to ensure the organization leverages the best available tools.
  • Cross-train peers and mentor team members to share knowledge and build internal capabilities.
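As a small illustration of the Kafka-based real-time integration mentioned above, the sketch below assumes the kafka-python package, a broker at localhost:9092, and an invented topic and message shape; none of these come from the posting.

```python
import json

from kafka import KafkaConsumer  # kafka-python; requires a reachable Kafka broker

# Hypothetical topic, broker address, and event fields.
consumer = KafkaConsumer(
    "policy_events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Downstream integration (warehouse load, BI refresh, etc.) would go here.
    print(event.get("event_type"), event.get("policy_id"))
```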

AWS, Leadership, Python, SQL, Apache Airflow, Cloud Computing, ETL, Kafka, Kubernetes, Machine Learning, Snowflake, Algorithms, Data engineering, Data science, Data Structures, REST API, Pandas, Spark, Tensorflow, Communication Skills, Analytical Skills, Collaboration, CI/CD, Problem Solving, Mentoring, Terraform, Microservices, JSON, Data visualization, Data modeling, Data analytics, Data management


📍 United States

🧭 Full-Time

💸 $142,771 - $225,000 per year

🔍 Media and Analytics

  • Master's degree in Computer Science, Data Science, engineering, mathematics, or a related quantitative field plus 3 years of experience in analytics software solutions.
  • Bachelor's degree in similar fields plus 5 years of experience is also acceptable.
  • 3 years of experience with Python and associated technologies, including Spark, AWS, S3, Java, JavaScript, and Adobe Analytics.
  • Proficiency in SQL for querying and managing data.
  • Experience in analytics programming languages such as Python (with Pandas).
  • Experience in handling large volumes of data and code management tools like Git.
  • 2 years of experience managing program orchestration and using open-source workflow management platforms like Airflow.
  • Develop, test, and orchestrate econometric, statistical, and machine learning modules.
  • Conduct unit, integration, and regression testing.
  • Create data processing systems for analytic research and development.
  • Design, document, and present process flows for analytical systems.
  • Partner with Software Engineering for cloud-based solutions.
  • Orchestrate modules via directed acyclic graphs using workflow management systems (see the DAG sketch after this list).
  • Work in an agile development environment.
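To make the DAG-orchestration item concrete, here is a toy sketch using Python's standard-library graphlib. The module names are invented, and a production setup would delegate scheduling to a workflow manager such as Airflow rather than run modules inline like this.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical analytic modules and their upstream dependencies.
dependencies = {
    "load_raw_data": set(),
    "clean_data": {"load_raw_data"},
    "fit_econometric_model": {"clean_data"},
    "fit_ml_model": {"clean_data"},
    "publish_report": {"fit_econometric_model", "fit_ml_model"},
}

# static_order() yields each module only after everything it depends on.
for module in TopologicalSorter(dependencies).static_order():
    print(f"running {module}")
```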

AWS, Python, SQL, Apache Airflow, Git, Machine Learning, Data engineering, Regression testing, Pandas, Spark

Posted 28 days ago
🔥 Principal Data Engineer, MongoDB
Posted about 2 months ago

📍 Texas, Maryland, Pennsylvania, Minnesota, Florida, Georgia, Illinois

🔍 Ecommerce, collectible card games

🏢 Company: TCGPlayer_External_Career

  • Bachelor’s degree in computer science, information technology, or related field, or equivalent experience.
  • 12+ years of experience designing scalable and reliable datastores.
  • Mastery of MongoDB data modeling and query design, with significant experience in RDBMS technologies, preferably PostgreSQL (see the MongoDB sketch after this list).
  • Experience designing datastores for microservices and event-driven applications.
  • Experience with data governance support in medium-to-large organizations.
  • Strong written and verbal communication skills for collaboration across roles.
  • Act as a subject matter expert for MongoDB, providing guidance and materials to improve proficiency.
  • Guide selection of datastore technologies for applications to meet data needs.
  • Consult on database design for performance and scalability.
  • Write effective code for data management.
  • Support engineers with database interface advice.
  • Develop data flow strategies and define storage requirements for microservices.
  • Troubleshoot and enhance existing database designs.
  • Collaborate to ensure data architectures are efficient and scalable.
  • Lead cross-application datastore projects related to security and data governance.
  • Research emerging datastore capabilities for strategic planning.
  • Define and implement data storage strategies for microservices.
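A minimal pymongo sketch of the MongoDB modeling and query-design work described above. It assumes a local MongoDB instance, and the database, collection, fields, and index are invented for illustration.

```python
from pymongo import ASCENDING, MongoClient  # requires pymongo and a running MongoDB instance

# Hypothetical card-listing collection.
client = MongoClient("mongodb://localhost:27017")
listings = client["marketplace"]["listings"]

listings.insert_one({
    "card_name": "Example Card",
    "seller_id": 42,
    "price": 3.99,
    "condition": "near_mint",
})

# A compound index supports the common "cheapest copies of a card" query pattern.
listings.create_index([("card_name", ASCENDING), ("price", ASCENDING)])

for doc in listings.find({"card_name": "Example Card"}).sort("price", ASCENDING).limit(5):
    print(doc["seller_id"], doc["price"])
```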

PostgreSQL, MongoDB, Data engineering, Microservices, Data modeling

🔥 Principal Data Engineer
Posted 2 months ago

📍 United States

🧭 Full-Time

💸 $185,000 - $215,000 per year

🔍 Media & Entertainment

  • 7+ years of experience in data engineering or software development
  • Expertise in AWS and distributed data technologies (e.g., Spark, Hadoop)
  • Proficiency with programming languages such as Python, PySpark, and JavaScript
  • Experience with large-scale data pipeline design and streaming data
  • Hands-on experience with AWS services like Lambda and S3 (see the Lambda sketch after this list)
  • Strong SQL and data querying skills
  • Architect, design, and oversee development of data pipelines
  • Design and build analytics pipelines and business intelligence solutions
  • Work closely with cross-functional teams on data integration
  • Provide leadership and mentorship to data engineering team
  • Ensure compliance with best practices in data governance and quality
  • Drive data scalability and performance across cloud platforms
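A minimal sketch of the Lambda-plus-S3 pattern named in the requirements. The bucket name and key layout are invented, and the handler assumes the standard AWS Lambda Python runtime, where boto3 is available by default.

```python
import json

import boto3  # bundled with the AWS Lambda Python runtime

s3 = boto3.client("s3")

def handler(event, context):
    """Toy handler: persist the incoming event record to S3 as JSON."""
    key = f"raw-events/{context.aws_request_id}.json"  # hypothetical key layout
    s3.put_object(
        Bucket="example-pipeline-bucket",  # hypothetical bucket
        Key=key,
        Body=json.dumps(event).encode("utf-8"),
    )
    return {"statusCode": 200, "body": key}
```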

AWS, Python, SQL, Apache Airflow, Cloud Computing, Apache Kafka, Data engineering

🔥 Principal Data Engineer
Posted 2 months ago

📍 Sioux Falls, SD, Scottsdale, AZ, Troy, MI, Franklin, TN, Dallas, TX

🧭 Full-Time

💸 $99,501.91 - $183,764.31 per year

🔍 Financial Services

🏢 Company: Pathward, N.A.

  • Bachelor’s degree or equivalent experience.
  • 10+ years delivering scalable, secure, and highly available technical data solutions.
  • 5+ years of experience designing and building Data Engineering pipelines with industry-leading technologies such as Talend and Informatica.
  • Extensive SQL experience.
  • Experience with ELT processes for transformation push-down and data replication using tools such as Matillion (see the ELT sketch after this list).
  • Experience with Data Visualization tools such as PowerBI, ThoughtSpot, or others.
  • Experience with Python.
  • Experience with SQL Server, SSIS, and NoSQL databases preferred.
  • 3+ years of complex enterprise experience leading distributed Data Engineering teams.
  • 3+ years of experience in cloud data environments leveraging technologies such as Snowflake and AWS.
  • 3+ years of banking or financial services and products domain experience.
  • Strong data architecture, critical thinking, and problem-solving abilities.
  • Ability to communicate effectively with all levels in the company.
  • Leads a Data Engineering team with responsibility for planning, prioritization, architectural & business alignment, quality, and value delivery.
  • Develops flexible, maintainable, and reusable Data Engineering solutions using standards, best practices, and frameworks with a focus on efficiency and innovation.
  • Solves complex development problems leveraging good design and practical experience.
  • Continually looks at ways to improve existing systems, processes, and performance.
  • Leads and participates in planning and feature/user story analysis by providing feedback and demonstrating an understanding of business needs.
  • Solves business problems by implementing technical solutions based on solid design principles and best practices.
  • Identifies and contributes to opportunities for departmental & team improvements.
  • Documents software, best practices, standards, and frameworks.
  • Mentors staff by providing technical advice, helping with issue resolution, and providing feedback.
  • Contributes as a subject matter expert in technical areas.
  • Keeps up to date with current and future changes in tools, technology, best practices, and industry standards through training and development opportunities.
  • Identifies and leads learning and training opportunities for other team members and staff.
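To illustrate the transformation push-down idea in the ELT requirement above, here is a toy sketch with DuckDB standing in for a cloud warehouse such as Snowflake. The tables and values are invented, and a tool like Matillion would generate comparable SQL inside the warehouse rather than this hand-written version.

```python
import duckdb  # DuckDB stands in for a cloud data warehouse

con = duckdb.connect()

# Extract/Load: land the raw data untouched (values invented for illustration).
con.execute("CREATE TABLE raw_transactions (account_id INTEGER, amount DOUBLE, status VARCHAR)")
con.execute("""
    INSERT INTO raw_transactions VALUES
        (1, 120.00, 'posted'),
        (1, -35.50, 'posted'),
        (2, 980.25, 'pending')
""")

# Transform, pushed down to the engine: the aggregation runs where the data lives.
con.execute("""
    CREATE TABLE account_balances AS
    SELECT account_id, SUM(amount) AS balance
    FROM raw_transactions
    WHERE status = 'posted'
    GROUP BY account_id
""")

print(con.execute("SELECT * FROM account_balances ORDER BY account_id").fetchall())
```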

AWS, Python, SQL, Snowflake, Data engineering

🔥 Principal Data Engineer
Posted 3 months ago

📍 United States

💸 $210,000 - $220,000 per year

🔍 Healthcare

🏢 Company: Transcarent (👥 251-500 employees; 💰 $126,000,000 Series D, raised 10 months ago; Personal Health, Health Care, Software)

  • Experienced: 10+ years in data engineering with a strong background in building and scaling data architectures.
  • Technical Expertise: Advanced working knowledge of SQL, relational databases, big data tools (e.g., Spark, Kafka), and cloud-based data warehousing (e.g., Snowflake); see the Spark sketch after this list.
  • Architectural Visionary: Experience in service-oriented and event-based architecture with strong API development skills.
  • Problem Solver: Manage and optimize processes for data transformation, metadata management, and workload management.
  • Collaborative Leader: Strong communication skills to present ideas clearly and lead cross-functional teams.
  • Project Management: Strong organizational skills, capable of leading multiple projects simultaneously.
  • Lead the Design and Implementation: Architect and implement cutting-edge data processing platforms and enterprise-wide data solutions using modern data architecture principles.
  • Scale Data Platform: Develop a scalable Platform for data extraction, transformation, and loading from various sources, ensuring data integrity and accessibility.
  • AI / ML platform: Design and build scalable AI and ML platforms for Transcarent use cases.
  • Collaborate Across Teams: Work with Executive, Product, Clinical, Data, and Design teams to meet their data infrastructure needs.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for high performance and reliability.
  • Innovate and Automate: Create and maintain data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide leadership and mentorship to the data engineering team.
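A minimal PySpark sketch of the kind of batch pipeline implied by the big-data tooling above. The data, column names, and aggregation are invented, and a real pipeline would read from and write to managed storage or a warehouse such as Snowflake rather than printing to stdout.

```python
from pyspark.sql import SparkSession, functions as F  # requires pyspark

spark = SparkSession.builder.appName("claims_rollup_example").getOrCreate()

# Hypothetical claims data; a real job would load this from cloud storage.
claims = spark.createDataFrame(
    [("member_1", 120.0), ("member_1", 80.0), ("member_2", 45.0)],
    ["member_id", "claim_amount"],
)

# Aggregate total claim spend per member.
rollup = claims.groupBy("member_id").agg(F.sum("claim_amount").alias("total_claims"))
rollup.show()

spark.stop()
```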

AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data engineering, Spark, Communication Skills, Problem Solving, Organizational skills
