Data Engineer
Posted 2023-08-15
🔍 Industry: Industrial, transportation, consumer products, and construction markets and aftermarkets
🗣️ Languages: English
Requirements:
- Bachelor's degree in information management, data science, computer science, or a related field, or equivalent experience.
- 6 years of experience in data-related roles.
- Knowledge of SQL, the Microsoft BI stack, Azure, R, and Python.
- Experience in database and data warehouse development.
- Technical writing skills.
- Adaptability to new technologies and a continuous improvement attitude.
Responsibilities:
- Address needs with IT solutions and provide technical expertise.
- Promote analytics and data sharing.
- Define requirements and develop solution options.
- Maintain knowledge of data processes.
Related Jobs
🧭 Full-Time
💸 160000 - 200000 USD per year
🔍 Financial media and digital assets
- Significant knowledge of the crypto industry; must be crypto native.
- 4+ years of experience in data modeling, schema design, data operations, and warehousing.
- Proficient in Python, Go, Rust, and/or TypeScript.
- Strong SQL expertise, with experience in Parquet, Postgres, and Clickhouse.
- Deep experience creating data warehouses at scale (100M+ rows/day).
- Experience with DevOps tools and cloud solutions like Docker, Kubernetes, AWS, or GCP.
- Own Data Sourcing Pipelines by architecting data warehousing and aggregation strategies.
- Design and Implement ETL Solutions, including setting up blockchain nodes and sourcing data from third parties (see the sketch after this list).
- Grow Shared Knowledge by providing technical leadership and guidance to the team.
- Drive Operational Efficiency by improving data workflows and automating routine tasks.
- Cross-Functional Collaboration with other teams to lead initiatives.
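A minimal Python sketch of the kind of third-party sourcing-to-Parquet step this role describes; the endpoint, field names, and output path are hypothetical:

```python
# Illustrative ETL step: pull trade records from a hypothetical third-party
# API and land them as Parquet for downstream warehousing.
import requests
import pyarrow as pa
import pyarrow.parquet as pq

API_URL = "https://api.example.com/v1/trades"  # hypothetical endpoint

def extract(limit: int = 1000) -> list[dict]:
    """Fetch one page of raw trade records from the vendor API."""
    resp = requests.get(API_URL, params={"limit": limit}, timeout=30)
    resp.raise_for_status()
    return resp.json()["trades"]

def load(records: list[dict], path: str = "trades.parquet") -> None:
    """Write the records to a compressed Parquet file."""
    pq.write_table(pa.Table.from_pylist(records), path, compression="zstd")

if __name__ == "__main__":
    load(extract())
```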
Posted 2024-11-02
📍 United States
🔍 Advertising software for Connected TV
🏢 Company: MNTN
- 5+ years of experience related to data engineering, analysis, and modeling complex data.
- Experience with distributed processing engines such as Spark.
- Strong experience with programming languages like Python and familiarity with algorithms.
- Experience in SQL, data modeling, and manipulating large data sets.
- Hands-on experience with data warehousing and building data pipelines.
- Familiarity with software processes and tools such as Git, CI/CD, Linux, and Airflow.
- Experience with cloud computing environments like AWS, Azure, or GCP.
- Strong written and verbal communication skills for conveying technical topics.
- Become the expert on MNTN Data Pipelines, Infrastructure and Processes.
- Design architecture with observability to maintain high quality data pipelines.
- Create and manage ETL/ELT workflows for transforming large data sets (see the sketch after this list).
- Organize data and metrics for ad buying features and client performance.
- Organize visualizations, reporting, and alerting for performance and trends.
- Investigate critical incidents and ensure issues are resolved.
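As a sketch of the ETL/ELT workflow management mentioned above, here is a minimal Airflow 2.x DAG (the `schedule` argument needs Airflow 2.4+); task logic and names are placeholders:

```python
# Illustrative daily ETL DAG: extract raw rows, then aggregate them.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull raw impression events from an upstream source.
    return [{"impressions": 100}, {"impressions": 250}]

def transform(**context):
    # Placeholder: aggregate the rows the extract task returned via XCom.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return sum(r["impressions"] for r in rows)

with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```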
AWS, Python, SQL, Cloud Computing, ETL, GCP, Git, Airflow, Algorithms, Azure, Data engineering, Go, Spark, Communication Skills, CI/CD
Posted 2024-11-01
📍 United States
🧭 Full-Time
💸 110000 - 130000 USD per year
🔍 Energy solutions
- Bachelor's degree in Computer Science, Physics/Engineering, Business, or Mathematics.
- Experience building ETL pipelines; real-time pipelines are a plus.
- Proficiency in Python and SQL.
- At least 4 years of experience in data warehouse development with a strong foundation in dimensional data modeling.
- 4+ years of experience in SQL query creation and ETL design, implementation, and maintenance.
- 4+ years of experience developing data pipelines in Python and with AWS services like S3, Redshift, and RDS.
- Strong analytical skills with experience handling diverse datasets.
- Excellent oral, written, and interpersonal communication skills.
- Detail-oriented with the ability to prioritize and work independently.
- Collaborate with stakeholders to understand data requirements and deliver data solutions aligned with business objectives.
- Analyze data sources and design scalable data pipelines and ETL processes using Python, SQL, and AWS technologies (see the sketch after this list).
- Develop and maintain data warehouses, optimizing data storage and retrieval.
- Build and populate schemas, automate reporting processes, and document technical specifications, ETL processes, data mappings, and data dictionaries.
- Support the Data Science Center of Excellence (DSCOE) in data engineering initiatives.
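A minimal sketch of one such pipeline step, assuming a hypothetical bucket, table, and IAM role: upload a file to S3, then issue a Redshift COPY.

```python
# Illustrative load step: stage a file in S3 and COPY it into Redshift.
import boto3
import psycopg2

BUCKET = "example-energy-data"                              # hypothetical
IAM_ROLE = "arn:aws:iam::123456789012:role/redshift-copy"   # hypothetical

def upload(local_path: str, key: str) -> None:
    """Stage the local file in S3."""
    boto3.client("s3").upload_file(local_path, BUCKET, key)

def copy_into_redshift(key: str, conn_params: dict) -> None:
    """Bulk-load the staged file into a hypothetical warehouse table."""
    sql = f"""
        COPY analytics.meter_readings
        FROM 's3://{BUCKET}/{key}'
        IAM_ROLE '{IAM_ROLE}'
        FORMAT AS CSV IGNOREHEADER 1;
    """
    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            cur.execute(sql)
```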
AWSPythonSQLETLData engineeringCommunication SkillsAnalytical Skills
Posted 2024-11-01
📍 United States
🧭 Full-Time
🔍 Automotive
🏢 Company: Careers_GM
- 5+ years of hands-on experience delivering enterprise-scale data and analytics solutions using modern hybrid cloud technologies.
- Strong problem-solving and analytical skills with experience analyzing large-scale customer event data products.
- In-depth knowledge of industry-standard Data Engineering practices, including data privacy & security, ETL/ELT, and data quality assurance.
- Expert programming skills in platforms such as Azure, Databricks, Spark, Python, SQL, and GitHub.
- Effective communication and collaboration skills with cross-functional teams.
- Experience working in an Agile development environment.
- Collaborate with the MAS team functions to develop data products for analytics solutions.
- Design and develop data products compliant with data privacy and security policies (see the sketch after this list).
- Drive adoption of cloud-first technologies and industry-standard Data Engineering practices.
- Automate new and existing data processes, eliminating manual effort.
- Stay updated with emerging trends and technologies to find improvement opportunities.
- Promote a culture of continuous learning within the team.
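A minimal PySpark sketch of a privacy-aware data product build of the kind described above; table and column names are illustrative, and the Delta write assumes a Databricks-style environment:

```python
# Illustrative data product: drop direct identifiers, aggregate, publish.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event_data_product").getOrCreate()

raw = spark.read.table("raw.customer_events")        # hypothetical source table
curated = (
    raw.drop("email", "vin")                         # privacy: remove identifiers
       .groupBy("event_type", F.to_date("event_ts").alias("event_date"))
       .agg(F.count("*").alias("event_count"))
)
curated.write.format("delta").mode("overwrite").saveAsTable("curated.daily_events")
```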
Python, SQL, Agile, ETL, Strategy, Azure, Data engineering, Release Management, Spark, Analytical Skills
Posted 2024-10-29
📍 Argentina
🔍 Experience innovation
🏢 Company: Valtech
- Proven industry experience executing data engineering, analytics, or data science projects, or a Bachelor's/Master's degree in quantitative studies.
- More than 2 years of strong-to-expert experience with Spark (PySpark), data pipeline development, orchestration tools, SQL, and denormalized data modeling.
- Collaborative, able to work remotely and engage with the team.
- Strong analytical and design skills.
- Demonstrate deep knowledge of data engineering to build and support non-interactive and real-time data capabilities.
- Build fault-tolerant, self-healing, adaptive, and highly accurate data computational pipelines.
- Provide consultation and lead the implementation of complex programs.
- Develop and maintain documentation for all assigned systems and projects.
- Tune queries running over billions of rows of data in a distributed query engine (see the sketch after this list).
- Perform root cause analysis for permanent resolutions to software or business process issues.
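One common tuning move for such queries, sketched in PySpark with hypothetical paths: broadcast the small dimension table so the join avoids shuffling the large fact table.

```python
# Illustrative join tuning: broadcast the small side of a large join.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join_tuning").getOrCreate()

facts = spark.read.parquet("s3://example-bucket/facts/")  # billions of rows
dims = spark.read.parquet("s3://example-bucket/dims/")    # small lookup table

# Ships `dims` to every executor instead of shuffling `facts` across the cluster.
joined = facts.join(broadcast(dims), on="dim_id", how="left")
joined.explain()  # confirm a BroadcastHashJoin appears in the physical plan
```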
AWS, GraphQL, Python, SQL, HTML, CSS, JavaScript, Machine Learning, Tableau, Airflow, Azure, Data engineering, Data science, React, Spark
Posted 2024-10-28
📍 India
🧭 Contract
🏢 Company: Two95 International Inc.
- Bachelor’s degree in computer science or related field.
- 5-7 years of experience managing Snowflake and Databricks.
- Strong experience in Python and AWS Lambda.
- Knowledge of Scala and/or Java.
- Experience with data integration services, SQL, and ELT.
- Familiarity with Azure or AWS for development and deployment.
- Experience with Jira or similar tools during SDLC.
- Experience managing codebase using Git/GitHub or Bitbucket.
- Experience working with a data warehouse.
- Familiarity with structured and semi-structured data formats like JSON, Avro, ORC, Parquet, or XML.
- Exposure to agile work environments.
- Design and implement core data analytic platform components for various analytics groups.
- Review approaches and data pipelines for best practices.
- Maintain a common data flow pipeline including ETL activities.
- Support and troubleshoot data flow in cloud environments.
- Develop data pipeline code using Python, Java, AWS Lambda, or Azure Data Factory (see the sketch after this list).
- Perform requirements planning and management throughout the data asset development life-cycle.
- Direct and help developers to adhere to data platform patterns.
- Design, build, and document RESTful APIs using OpenAPI specification tools.
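A minimal AWS Lambda handler sketch for the Python pipeline work named above; the event shape is the standard S3 notification, while the transform itself is a placeholder:

```python
# Illustrative Lambda: react to S3 object-created events in a data pipeline.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Entry point Lambda invokes with a batch of S3 event records."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        payload = json.loads(obj["Body"].read())
        # Placeholder transform: count records before handing off downstream.
        print(f"{key}: {len(payload)} records")
    return {"statusCode": 200}
```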
AWS, Python, SQL, Agile, ETL, Git, Hadoop, Java, Snowflake, Jira, Azure, Spark
Posted 2024-10-27
🔍 Natural Gas
🏢 Company: EQT Corporation
- 4+ years of experience in reservoir engineering within unconventional plays.
- Bachelor's Degree in a relevant engineering discipline.
- Strong coding and tooling skills across R/RStudio, Python, SQL, Azure Data Studio, Spotfire, Power BI, and Alteryx.
- Passion for technology, science, and innovation.
- Solid asset reservoir engineering experience.
- Understanding of basic engineering concepts.
- Support reservoir planning and analysis to enhance well designs.
- Analyze science tests and communicate results promptly.
- Track technology implementation and support innovation testing.
- Provide engineering support for asset A&D opportunities.
- Deliver insights that drive decision-making and performance metrics.
- Evaluate peers and non-operating partners to inform science and technology adoption.
- Collaborate with finance and operations for economic analysis.
Posted 2024-10-26
🧭 Full-Time
💸 100000 - 140000 USD per year
🔍 Technology / Data Engineering
🏢 Company: Wynd Labs - X Hiring
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related technical field.
- Extensive experience with database systems such as Redshift, Snowflake, or similar cloud-based solutions.
- Advanced proficiency in SQL with experience optimizing queries.
- Hands-on experience with building and managing data pipelines using tools like Apache Airflow, AWS Glue, or similar technologies.
- Solid understanding of ETL (Extract, Transform, Load) processes.
- Experience with infrastructure automation tools like Terraform or CloudFormation.
- Knowledge of programming languages such as Python, Scala, or Java.
- Strong analytical and problem-solving skills.
- Familiarity with containerization and orchestration technologies.
- Collaborative team player with strong communication skills.
- Designing, building, and optimizing scalable data pipelines to process and integrate data from various sources.
- Developing and managing ETL/ELT workflows for transforming raw data into structured formats (see the sketch after this list).
- Integrating and configuring database infrastructure for performance and security.
- Automating data workflows and infrastructure setup using relevant tools.
- Collaborating with data scientists and other stakeholders for efficient data accessibility.
- Monitoring, troubleshooting, and improving performance of data pipelines.
- Working with cloud infrastructure to manage resources efficiently.
- Implementing best practices for data governance and security.
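A sketch of one such workflow as an AWS Glue (PySpark) job, assuming hypothetical S3 paths and column names: read raw JSON, project it into a structured shape, and write partitioned Parquet.

```python
# Illustrative Glue job: raw JSON -> structured, partitioned Parquet.
import sys
from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue = GlueContext(SparkContext.getOrCreate())
spark = glue.spark_session

raw = spark.read.json("s3://example-raw/events/")        # hypothetical path
structured = (
    raw.select("event_id", "user_id", "event_ts")
       .withColumn("event_date", F.to_date("event_ts"))  # partition key
)
(structured.write
    .mode("append")
    .partitionBy("event_date")
    .parquet("s3://example-curated/events/"))
```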
Posted 2024-10-26
📍 United States
🧭 Full-Time
🔍 Government technology
🏢 Company: 540
- Bachelor’s Degree in Computer Science or a related engineering field preferred.
- 6+ years of related experience.
- Proficient in Python.
- Experience building and managing data pipelines.
- Hands-on experience with AWS cloud services.
- Experience consuming data via APIs (see the sketch below).
- Familiarity with GitLab, terminal/command line operations, Jira, and Confluence.
- Experience with data visualization tools like PowerBI.
- Develop data transformation and workflow solutions for the U.S. Army.
- Deploy and operate a cloud replacement for a legacy financial system.
- Innovate cloud-first solutions for data management.
- Responsible for the data pipeline cloud components, security, governance, and software delivery.
- Work with multiple teams on architectural problems.
- Communicate data design decisions to both technical and non-technical audiences.
- Implement practices to ensure high-quality, tested, and secure products.
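A minimal sketch of the API consumption called out in the requirements, against a hypothetical paginated REST endpoint:

```python
# Illustrative API consumer: page through a REST endpoint until exhausted.
import requests

BASE_URL = "https://api.example.gov/v1/records"  # hypothetical endpoint

def fetch_all(page_size: int = 500) -> list[dict]:
    """Collect every page of results into one list."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:  # an empty page signals the end of the data
            break
        records.extend(batch)
        page += 1
    return records
```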
AWS, Python, Software Development, Git, Microsoft Power BI, Jira, Data engineering
Posted 2024-10-26
📍 Spain
🧭 Full-Time
🔍 Technology
🏢 Company: Plain Concepts
- At least 4 years of experience in software or data engineering.
- Experience in designing architectures.
- Solid experience in Python or Scala and Spark for handling large data volumes.
- Strong experience in Cloud services (Azure or AWS).
- Experience in creating data pipelines (CI/CD).
- Experience in testing (unit tests, integration tests, etc.).
- BI experience (Power BI) is a plus.
- Familiarity with Databricks, Snowflake, or Fabric is a plus.
- Experience with Infrastructure as Code (IaC) is a plus.
- Knowledge of SQL and NoSQL databases.
- Good level of English is essential.
- Ability to work as a team player.
- Involved in projects from initial client interactions to understand business needs and propose suitable technical solutions.
- Develop projects from scratch with minimal supervision and team collaboration.
- Participate in architectural designs and decision-making in a constructive co-creation environment.
- Key role in developing best practices, clean and reusable code.
- Create ETLs using Spark (Python/Scala); see the sketch after this list.
- Work on cloud projects (Azure/AWS).
- Build scalable pipelines with various technologies.
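A sketch pairing the Spark ETL and testing requirements above: keep transforms pure so a local unit test can cover them. Names are illustrative.

```python
# Illustrative unit-testable Spark transform.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def clean_orders(orders: DataFrame) -> DataFrame:
    """Pure transform: drop non-positive amounts, derive the order date."""
    return (
        orders.filter(F.col("amount") > 0)
              .withColumn("order_date", F.to_date("created_at"))
    )

def test_clean_orders():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [(10.0, "2024-01-05"), (-1.0, "2024-01-06")],
        ["amount", "created_at"],
    )
    assert clean_orders(df).count() == 1
```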
AWS, Python, SQL, Agile, Azure, NoSQL, Spark, CI/CD
Posted 2024-10-26
Related Articles
Remote Job Certifications and Courses to Boost Your Career
August 22, 2024
Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?
How to Balance Work and Life While Working Remotely
August 19, 2024
Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.
How to Onboard Remote Employees Successfully
August 16, 2024
Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.
Remote Work Statistics and Insights for 2024
August 13, 2024
The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.