
Principal Data Engineer

Posted about 1 month ago

💎 Seniority level: Principal, 7+ years

💸 Salary: 185,000 - 215,000 USD per year

🔍 Industry: Entertainment and News

🗣️ Languages: English

⏳ Experience: 7+ years

Requirements:
  • 7+ years of experience in data engineering, software development, or related technical roles.
  • Expertise in cloud computing platforms, preferably AWS, and distributed data technologies such as Apache Spark and Hadoop.
  • Proficiency with programming languages such as Python, PySpark, and JavaScript.
  • In-depth knowledge of designing data structures and governance frameworks.
  • Experience with designing and maintaining large-scale data pipelines.
  • Hands-on experience with serverless computing platforms like AWS Lambda.
  • Strong SQL, Presto, and data querying skills.
  • Proven ability to lead and mentor engineers.
  • Strong interest in technology, data, and media.
Responsibilities:
  • Architect, design, and oversee the development of scalable and reliable data pipelines for NBC News’ digital platforms.
  • Design and build advanced analytics pipelines and business intelligence solutions.
  • Work closely with cross-functional teams to align data systems with organizational goals.
  • Mentor and guide junior engineers, fostering a culture of technical excellence.
  • Ensure compliance with data governance and quality best practices.
  • Demonstrate problem-solving abilities and improve system performance.
  • Drive scalability and performance optimization in cloud platforms.

Related Jobs


🧭 Full-Time

💸 163,282 - 192,262 USD per year

🔍 Software / Data Visualization

  • 15+ years of professional software development or data engineering experience (12+ with a STEM B.S. or 10+ with a relevant Master's degree)
  • Strong proficiency in Python and familiarity with Java and Bash scripting
  • Hands-on experience implementing database technologies, messaging systems, and stream computing software (e.g., PostgreSQL, PostGIS, MongoDB, DuckDB, KsqlDB, RabbitMQ)
  • Experience with data fabric development using publish-subscribe models (e.g., Apache NiFi, Apache Pulsar, Apache Kafka and Kafka-based data service architecture)
  • Proficiency with containerization technologies (e.g., Docker, Docker-Compose, RKE2, Kubernetes, and Microk8s)
  • Experience with version control systems (e.g., Git), CI/CD tools (e.g., Jenkins), and collaborative development workflows
  • Strong knowledge of data modeling and database optimization techniques
  • Familiarity with data serialization languages (e.g., JSON, GeoJSON, YAML, XML)
  • Excellent problem-solving and analytical skills that have been applied to high visibility, important data engineering projects
  • Strong communication skills and ability to lead the work of other engineers in a collaborative environment
  • Demonstrated experience in coordinating team activities, setting priorities, and managing tasks to ensure balanced workloads and effective team performance
  • Experience managing and mentoring development teams in an Agile environment
  • Ability to make effective architecture decisions and document them clearly
  • Must be a US Citizen and eligible to obtain and maintain a US Security Clearance
  • Develop and continuously improve a data service that underpins cloud-based applications
  • Support data and database modeling efforts
  • Contribute to the development and maintenance of reusable component libraries and shared codebase
  • Participate in the entire software development lifecycle, including requirement gathering, design, development, testing, and deployment, using an agile, iterative process
  • Collaborate with developers, designers, testers, project managers, product owners, and project sponsors to integrate the data service to end user applications
  • Communicate tasking estimation and progress regularly to a development lead and product owner through appropriate tools
  • Ensure seamless integration between database and messaging systems and the frontend / UI they support
  • Ensure data quality, reliability, and performance through code reviews and effective testing strategies
  • Write high-quality code, applying best practices, coding standards, and design patterns
  • Team with other developers, fostering a culture of continuous learning and professional growth
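The publish-subscribe data fabric named above (Apache NiFi, Pulsar, Kafka) decouples producers from consumers via named topics. A minimal in-memory sketch of the idea, with hypothetical topic and handler names; a real deployment would use a Kafka or Pulsar client rather than this toy broker:

```python
from collections import defaultdict

class Broker:
    """Toy in-memory broker illustrating the publish-subscribe model."""
    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback to receive messages on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for handler in self._subscribers[topic]:
            handler(message)

# Hypothetical usage: a consumer subscribes before producers publish.
broker = Broker()
received = []
broker.subscribe("positions", received.append)
broker.publish("positions", {"id": 1, "lat": 38.9, "lon": -77.0})
```

The producer never references the consumer directly, which is what lets a data fabric add or remove downstream services without touching upstream code.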
Posted 3 days ago

📍 United States

🧭 Full-Time

💸 142,771 - 225,000 USD per year

🔍 Media and Analytics

  • Master's degree in Computer Science, Data Science, engineering, mathematics, or a related quantitative field plus 3 years of experience in analytics software solutions.
  • Bachelor's degree in similar fields plus 5 years of experience is also acceptable.
  • 3 years of experience with Python and associated technologies, including Spark, AWS S3, Java, JavaScript, and Adobe Analytics.
  • Proficiency in SQL for querying and managing data.
  • Experience in analytics programming languages such as Python (with Pandas).
  • Experience in handling large volumes of data and code management tools like Git.
  • 2 years of experience managing computer program orchestrations and using open-source management platforms like Airflow.
  • Develop, test, and orchestrate econometric, statistical, and machine learning modules.
  • Conduct unit, integration, and regression testing.
  • Create data processing systems for analytic research and development.
  • Design, document, and present process flows for analytical systems.
  • Partner with Software Engineering for cloud-based solutions.
  • Orchestrate modules via directed acyclic graphs using workflow management systems.
  • Work in an agile development environment.
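The orchestration bullet above describes running modules as a directed acyclic graph, which is what workflow managers like Airflow do. A minimal sketch of the scheduling idea using only the standard library, with a hypothetical four-module pipeline (each module maps to the set of modules it depends on):

```python
from graphlib import TopologicalSorter

# Hypothetical module dependency map: each module runs only after
# the modules it depends on have completed.
pipeline = {
    "ingest": set(),
    "clean": {"ingest"},
    "model": {"clean"},
    "report": {"model", "clean"},
}

def run_order(dag):
    """Return one valid execution order for the DAG."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(pipeline))
```

A real workflow manager adds retries, scheduling, and parallel execution of independent nodes on top of exactly this topological ordering.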

AWS · Python · SQL · Apache Airflow · Git · Machine Learning · Data engineering · Regression testing · Pandas · Spark

Posted 6 days ago

📍 U.S.

🧭 Full-Time

💸 142,771 - 225,000 USD per year

🔍 Media and analytics

  • Master’s degree in Computer Science, Data Science, engineering, mathematics or a related quantitative field plus 3 years of experience in delivering analytics software solutions or a Bachelor’s degree plus 5 years.
  • Must have 3 years of experience with Python and associated packages, including Spark, plus AWS and SQL for data management.
  • Experience with analytics programming languages, parallel processing, and code management tools like Git.
  • Two years of experience managing program orchestrations and working with open-source management platforms such as Airflow.
  • Modern analytics programming: developing, testing and orchestrating econometric, statistical and machine learning modules.
  • Unit, integration and regression testing.
  • Understanding the deployment of econometric models and learning methods.
  • Create data processing systems for analytics research and development.
  • Design, write, and test modules for Nielsen analytics cloud-based platforms.
  • Extract data using SQL and orchestrate modules via workflow management platforms.
  • Design, document, and present process flows for analytical systems.
  • Partner with software engineering to build analytical solutions in an agile environment.

AWS · Python · SQL · Apache Airflow · Git · Machine Learning · Spark

Posted 6 days ago

📍 Copenhagen, London, Stockholm, Berlin, Madrid, Montreal, Lisbon, and 35 other countries

🧭 Full-Time

🔍 Financial Technology

  • Strong background in building and managing data infrastructure at scale.
  • Expertise in Python, AWS, dbt, Airflow, and Kubernetes.
  • Ability to translate business and product requirements into technical data solutions.
  • Experience in mentoring and fostering collaboration within teams.
  • Curiosity and enthusiasm for experimenting with new technologies to solve complex problems.
  • Hands-on experience with modern data tools and contributing to strategic decision-making.
  • Partnering with product and business teams to develop data strategies that enable new features and improve user experience.
  • Driving key strategic projects across the organisation, dipping in and out as needed to provide leadership and hands-on support.
  • Supporting multiple teams across Pleo in delivering impactful data and analytics solutions.
  • Building data products that directly support Pleo's product roadmap and business goals.
  • Collaborating with the VP of Data and other data leaders to set the vision for Pleo’s data strategy and ensure alignment with company objectives.
  • Enhancing our data infrastructure and pipelines to improve scalability, performance, and data quality.
  • Experimenting with and implementing innovative technologies to keep Pleo’s data stack at the forefront of the industry.
  • Mentoring engineers, analysts, and data scientists to foster growth and build a world-class data team.

AWS · Python · Apache Airflow · Kubernetes · Data engineering

Posted 17 days ago

📍 Texas, Maryland, Pennsylvania, Minnesota, Florida, Georgia, Illinois

🔍 Ecommerce, collectible card games

🏢 Company: TCGPlayer_External_Career

  • Bachelor’s degree in computer science, information technology, or related field, or equivalent experience.
  • 12 or more years of experience in designing scalable and reliable datastores.
  • Mastery of MongoDB data modeling and query design, with significant experience in RDBMS technologies, preferably PostgreSQL.
  • Experience designing datastores for microservices and event-driven applications.
  • Experience with data governance support in medium-to-large organizations.
  • Strong written and verbal communication skills for collaboration across roles.
  • Act as a subject matter expert for MongoDB, providing guidance and materials to improve proficiency.
  • Guide selection of datastore technologies for applications to meet data needs.
  • Consult on database design for performance and scalability.
  • Write effective code for data management.
  • Support engineers with database interface advice.
  • Develop data flow strategies and define storage requirements for microservices.
  • Troubleshoot and enhance existing database designs.
  • Collaborate to ensure data architectures are efficient and scalable.
  • Lead cross-application datastore projects related to security and data governance.
  • Research emerging datastore capabilities for strategic planning.
  • Define and implement data storage strategies for microservices.

PostgreSQL · MongoDB · Data engineering · Microservices · Data modeling

Posted 23 days ago

📍 United States

🧭 Full-Time

💸 210,000 - 220,000 USD per year

🔍 Healthcare

  • 10+ years of experience in data engineering with a strong background in data architectures.
  • Advanced working knowledge of SQL, relational databases, and big data tools (e.g., Spark, Kafka).
  • Proficient in cloud-based data warehousing (e.g., Snowflake) and cloud services (e.g., AWS).
  • Strong understanding of AI / ML workflows.
  • Demonstrated experience in service-oriented and event-based architecture with strong API development skills.
  • Ability to manage and optimize processes related to data transformation and workload management.
  • Strong communication skills for leading cross-functional teams.
  • Strong project management and organizational skills.
  • Lead the Design and Implementation: Architect and implement data processing platforms and enterprise-wide data solutions.
  • Scale Data Platform: Develop a scalable platform for data extraction, transformation, and loading from various sources.
  • AI / ML platform: Design and build scalable AI and ML platforms for business use cases.
  • Collaborate Across Teams: Partner with various teams to meet their data infrastructure needs.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for performance and reliability.
  • Innovate and Automate: Create data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide technical leadership and foster a culture of continuous learning.

AWS · Python · SQL · Apache Airflow · Cloud Computing · ETL · Kafka · Machine Learning · Snowflake · Data engineering · Spark

Posted 27 days ago
🔥 Principal Data Engineer
Posted about 2 months ago

📍 Sioux Falls, SD, Scottsdale, AZ, Troy, MI, Franklin, TN, Dallas, TX

🧭 Full-Time

💸 99,501.91 - 183,764.31 USD per year

🔍 Financial services

🏢 Company: Pathward, N.A.

  • Bachelor’s degree or equivalent experience.
  • 10+ years delivering scalable, secure, and highly available technical data solutions.
  • 5+ years of experience designing and building Data Engineering pipelines with tools like Talend and Informatica.
  • Extensive SQL experience.
  • Experience with ELT processes and tools like Matillion.
  • Leads a Data Engineering team responsible for planning, prioritization, architectural & business alignment, quality, and value delivery.
  • Develops flexible, maintainable, and reusable Data Engineering solutions using standards, best practices, and frameworks.
  • Solves complex development problems leveraging good design and practical experience.
  • Continually seeks ways to improve existing systems, processes, and performance.
  • Participates in planning and feature/user story analysis, offering feedback.

AWS · Python · SQL · Snowflake · Data engineering

🔥 Principal Data Engineer
Posted about 2 months ago

📍 United States

💸 210,000 - 220,000 USD per year

🔍 Healthcare

🏢 Company: Transcarent · 👥 251-500 · 💰 $126,000,000 Series D (9 months ago) · Personal Health · Health Care · Software

  • Experienced: 10+ years in data engineering with a strong background in building and scaling data architectures.
  • Technical Expertise: Advanced working knowledge of SQL, relational databases, big data tools (e.g., Spark, Kafka), and cloud-based data warehousing (e.g., Snowflake).
  • Architectural Visionary: Experience in service-oriented and event-based architecture with strong API development skills.
  • Problem Solver: Manage and optimize processes for data transformation, metadata management, and workload management.
  • Collaborative Leader: Strong communication skills to present ideas clearly and lead cross-functional teams.
  • Project Management: Strong organizational skills, capable of leading multiple projects simultaneously.
  • Lead the Design and Implementation: Architect and implement cutting-edge data processing platforms and enterprise-wide data solutions using modern data architecture principles.
  • Scale Data Platform: Develop a scalable platform for data extraction, transformation, and loading from various sources, ensuring data integrity and accessibility.
  • AI / ML platform: Design and build scalable AI and ML platforms for Transcarent use cases.
  • Collaborate Across Teams: Work with Executive, Product, Clinical, Data, and Design teams to meet their data infrastructure needs.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for high performance and reliability.
  • Innovate and Automate: Create and maintain data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide leadership and mentorship to the data engineering team.

AWS · Leadership · Project Management · Python · SQL · Java · Kafka · Snowflake · C++ · Airflow · Data engineering · Spark · Communication Skills · Problem Solving · Organizational skills

🔥 Principal Data Engineer
Posted 2 months ago

📍 UK

🧭 Full-Time

🔍 Data infrastructure and enterprise technology

🏢 Company: Aker Systems · 👥 101-250 · 💰 over 4 years ago · Cloud Data Services · Business Intelligence · Analytics · Software

  • Bachelor's degree.
  • Data pipeline development using data processing technologies and frameworks.
  • Agile or other rapid application development methods.
  • Data modeling and understanding of different data structures and their benefits and limitations under particular use cases.
  • Experience in Public Cloud services, such as AWS, with knowledge of core services like EC2, RDS, Lambda, Athena & Glue preferred.
  • Configuring and tuning Relational and NoSQL databases.
  • Programming or scripting languages, such as Python.
  • Test Driven Development with appropriate tools and frameworks.
  • Code, test, and document new or modified data pipelines that meet functional/non-functional business requirements.
  • Conduct logical and physical database design.
  • Expand and grow data platform capabilities to solve new data and analytics problems.
  • Conduct data analysis, identifying feasible solutions and enhancements to data processing challenges.
  • Ensure that data models are consistent with the data architecture.
  • Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

AWS · Python · Agile · Data Analysis · Data Structures · NoSQL · Linux · Data modeling


🔍 Digital Product Engineering

  • Mandatory skills include expertise in Snowflake.
  • English language skills at native or advanced level.
  • Implement data models, including Data Vault 2.0 methodologies, to ensure scalable and efficient data architectures.
  • Develop and optimize data pipelines using Snowflake on Azure, ensuring data consistency and performance.
  • Design and build data products such as SQL data pipelines, Tableau/Power BI dashboards, and analytical reports to meet business requirements.
  • Use Terraform and scripting languages like Python to automate data workflows and infrastructure management.
  • Apply knowledge of Data Mesh architecture to help design and implement decentralized data products.
  • Ensure the efficiency and performance of data products, optimizing data pipelines and models for scalability.
  • Work independently with business teams to understand their data needs, translating requirements into actionable data products.
Posted 2 months ago

Related Articles

Posted 5 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 6 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 6 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 6 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Posted 6 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.