Data Engineer

Posted 2024-11-30

πŸ“ Location: CA, US / TX, US / FL, US

πŸ” Industry: Healthcare

🏒 Company: Tekton Labs

πŸͺ„ Skills: Python, SQL, ETL, Tableau, Documentation

Requirements:
  • Expertise in ETL processes and data integration.
  • Strong understanding of healthcare workflows and reporting requirements.
  • Proficiency in SQL and Python.
  • Hands-on experience with BI tools such as Freshpaint, Tableau, or Power BI.
Responsibilities:
  • Assess existing reporting infrastructure and refactor legacy systems to optimize performance.
  • Design schemas for healthcare workflows and centralize session form data.
  • Develop ETL processes for integrating patient data into reporting pipelines.
  • Deliver the first version (V1) of NIH reports and business performance reports.
  • Create an interactive BI dashboard using tools like Freshpaint, Tableau, or Power BI.
  • Define funnel metrics, including conversion and drop-off rates, and establish baseline metrics.

Related Jobs

πŸ“ United States

πŸ’Έ 210000 - 220000 USD per year

πŸ” Healthcare

🏒 Company: Transcarent

Requirements:
  • Experienced: 10+ years in data engineering with a strong background in building and scaling data architectures.
  • Technical Expertise: Advanced working knowledge of SQL, relational databases, big data tools (e.g., Spark, Kafka), and cloud-based data warehousing (e.g., Snowflake).
  • Architectural Visionary: Experience in service-oriented and event-based architecture with strong API development skills.
  • Problem Solver: Manage and optimize processes for data transformation, metadata management, and workload management.
  • Collaborative Leader: Strong communication skills to present ideas clearly and lead cross-functional teams.
  • Project Management: Strong organizational skills, capable of leading multiple projects simultaneously.

Responsibilities:
  • Lead the Design and Implementation: Architect and implement cutting-edge data processing platforms and enterprise-wide data solutions using modern data architecture principles.
  • Scale Data Platform: Develop a scalable platform for data extraction, transformation, and loading from various sources, ensuring data integrity and accessibility.
  • AI/ML Platform: Design and build scalable AI and ML platforms for Transcarent use cases.
  • Collaborate Across Teams: Work with Executive, Product, Clinical, Data, and Design teams to meet their data infrastructure needs.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for high performance and reliability.
  • Innovate and Automate: Create and maintain data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide leadership and mentorship to the data engineering team.

πŸͺ„ Skills: AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data engineering, Spark, Communication Skills, Problem Solving, Organizational skills

Posted 2024-12-03

πŸ“ United States, Latin America, India

πŸ” Cloud Data Technologies

🏒 Company: phData

Requirements:
  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Programming expertise in Java, Python, and/or Scala.
  • Experience with core cloud data platforms: Snowflake, AWS, Azure, Databricks, and GCP.
  • SQL proficiency and the ability to write, debug, and optimize SQL queries.
  • Experience creating and delivering detailed presentations.

Responsibilities:
  • Develop end-to-end technical solutions and take them into production.
  • Ensure performance, security, scalability, and robust data integration.
  • Client-facing communication and presentation delivery.
  • Create detailed solution documentation.

πŸͺ„ Skills: AWS, Python, Software Development, SQL, ElasticSearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, Elasticsearch, NoSQL, Spark, Communication Skills, Documentation

Posted 2024-12-03

πŸ“ NY, PA, CA, CO, DC, FL, IL, MD, NJ, MA, SC, TX, VA

🧭 Full-Time

πŸ’Έ 100000 - 130000 USD per year

πŸ” Credit rating agency

🏒 Company: KBRA

Requirements:
  • 3+ years of professional experience in Python development.
  • Strong understanding of the Python programming language.
  • Experience with web frameworks such as Django, Flask, and/or Dash.
  • Knowledge of relational databases (Snowflake preferred).
  • Experience with version control systems (e.g., Git).
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork skills.

Responsibilities:
  • Develop, test, and maintain scalable Python applications.
  • Collaborate with product managers, designers, and other engineers to deliver high-quality software.
  • Write clean, efficient, and reusable code following best practices.
  • Participate in code reviews to ensure code quality and share knowledge with the team.
  • Troubleshoot and debug issues in a timely manner.
  • Contribute to the design and architecture of new features and systems.
  • Stay up-to-date with the latest industry trends and technologies.

πŸͺ„ Skills: Python, Django, Flask, Git, Snowflake, Attention to detail

Posted 2024-12-01

πŸ“ United States

🧭 Full-Time

πŸ’Έ 125000 - 165000 USD per year

🏒 Company: FSAStore.com

Requirements:
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering roles with a focus on building and maintaining data pipelines.
  • Strong proficiency in Azure Synapse Analytics and Azure DevOps.
  • Expertise in Apache Spark pools.
  • Proficient in Python scripting and programming languages such as SQL and Java.
  • Senior level knowledge of data modeling and data warehousing concepts.
  • Strong communication skills to convey technical concepts to non-technical stakeholders.
  • Experience with data modeling, warehousing, and building ETL pipelines.
  • Familiarity with Azure Data Factory and Azure Data Lake Storage.

Responsibilities:
  • Design, develop, and maintain robust data pipelines and infrastructure to support data integration, processing, and storage.
  • Collaborate with analysts and stakeholders to understand data requirements and translate them into technical solutions.
  • Implement best practices for data governance, security, and quality assurance.
  • Explore and evaluate new technologies and tools to enhance data infrastructure.
  • Design and implement data models on Azure Synapse Analytics.
  • Utilize Azure DevOps for CI/CD and agile project management.
  • Manage and optimize Apache Spark pools for high-performance processing.
  • Develop and maintain ETL processes using Python.
  • Ensure data quality and integrity across platforms.
  • Collaborate with cross-functional teams.
  • Stay current with industry trends in Azure data services.

πŸͺ„ Skills: Project Management, Python, SQL, Agile, ETL, Java, Azure, Data engineering, Spark, Communication Skills, DevOps

Posted 2024-11-27

πŸ“ US

πŸ’Έ 84000 - 120000 USD per year

πŸ” Consumer insights

Requirements:
  • Strong PL/SQL and SQL development skills.
  • Proficient in multiple programming languages used in data engineering such as Python and Java.
  • Minimum 3-5 years of experience in Data engineering with Oracle and MS SQL.
  • Experience with data warehousing concepts and technologies including cloud-based services (e.g. Snowflake).
  • Experience with cloud platforms like Azure and knowledge of infrastructure.
  • Experience with data orchestration tools (e.g. Azure Data Factory, DataBricks workflows).
  • Understanding of data privacy regulations and best practices.
  • Experience working with remote teams.
  • Experience working on a team with a CI/CD process.
  • Familiarity using tools like Git and Jira.
  • Great problem-solving abilities and a strong work ethic.
  • Bachelor's degree in Computer Science or Computer Engineering.

Responsibilities:
  • Design, implement, and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA.
  • Identify and resolve performance bottlenecks in pipelines to ensure efficient and timely data delivery.
  • Implement data quality checks and validations to ensure accuracy, completeness, and consistency of data delivery.
  • Work with Data Architect on data governance, quality, and security best practices.
  • Collaborate with cross-functional teams to identify and address data needs.
  • Define and document technical requirements.

πŸͺ„ Skills: Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data engineering, CI/CD

Posted 2024-11-26

πŸ“ United States

πŸ” Data and technology

Requirements:
  • 5+ years of experience making contributions in the form of code.
  • Experience with algorithms and data structures, and knowing when to apply them.
  • Deep familiarity with Scala or Java.
  • Experience working with high-scale systems: realtime and batch.
  • Interest in data engineering: developing ingestion engines, building ETL pipelines, and organizing data.
  • Experience in Machine Learning techniques and tools is a plus.

Responsibilities:
  • Be a senior member of the team by contributing to the architecture, design, and implementation of EMS systems.
  • Mentor junior engineers and promote their growth.
  • Lead technical projects, managing the planning, execution, and success of complex technical projects.
  • Collaborate with other engineering, product, and data science teams to ensure we're building the best products.
  • Be on call if required and accommodate Eastern Time Zone working hours.

πŸͺ„ Skills: SQL, ETL, GCP, Algorithms, Data engineering, Data Structures, Spark, Collaboration

Posted 2024-11-26

πŸ“ United States, Latin America, India

🧭 Full-Time

πŸ” Modern data stack and cloud data services

Requirements:
  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Ability to develop end-to-end technical solutions in production environments.
  • Proficient in Java, Python, or Scala.
  • Experience with core cloud data platforms like Snowflake, AWS, Azure, Databricks, and GCP.
  • Strong SQL skills, capable of writing, debugging, and optimizing queries.
  • Client-facing communication skills for presentations and documentation.
  • Bachelor's degree in Computer Science or a related field.

Responsibilities:
  • Develop end-to-end technical solutions and ensure their performance, security, and scalability.
  • Help with robust data integration and contribute to the production deployment of solutions.
  • Create and deliver detailed presentations to clients.
  • Document solutions including POCs, roadmaps, diagrams, and logical system views.

πŸͺ„ Skills: AWS, Python, Software Development, SQL, ElasticSearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, Elasticsearch, NoSQL, Spark, Communication Skills, Documentation

Posted 2024-11-24

πŸ“ US

πŸ’Έ 84000 - 120000 USD per year

πŸ” Consumer insights

Requirements:
  • Strong PL/SQL and SQL development skills.
  • Proficient in multiple data engineering languages such as Python and Java.
  • 3-5 years of experience in data engineering specific to Oracle and MS SQL.
  • Experience with data warehousing technologies and cloud-based services like Snowflake.
  • Experience with cloud platforms such as Azure and infrastructure knowledge.
  • Familiarity with data orchestration tools like Azure Data Factory and DataBricks workflows.
  • Understanding of data privacy regulations and best practices.
  • Experience working on a remote team.
  • Familiarity with CI/CD processes and tools like Git, Jira.

Responsibilities:
  • Design, implement, and maintain scalable data pipelines and architecture.
  • Unit test and document solutions to meet product quality standards.
  • Identify and resolve performance bottlenecks in data pipelines.
  • Implement data quality checks and validation processes.
  • Collaborate with cross-functional teams to address data needs.
  • Ensure technology solutions support customer and organizational needs.
  • Define and document technical requirements.

πŸͺ„ Skills: Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data engineering, CI/CD

Posted 2024-11-23

πŸ“ United States

πŸ” Consumer insights

Requirements:
  • Strong PL/SQL and SQL development skills.
  • Proficient in Python and Java.
  • 3-5 years of experience in Data Engineering with Oracle and MS SQL.
  • Experience with cloud services like Snowflake and Azure.
  • Familiar with data orchestration tools such as Azure Data Factory and DataBricks.
  • Understanding of data privacy regulations.

Responsibilities:
  • Design, implement, and maintain scalable data pipelines and architecture.
  • Unit test and document solutions that meet product quality standards.
  • Identify and resolve performance bottlenecks in data processing workflows.
  • Implement data quality checks to ensure accuracy and consistency.
  • Collaborate with cross-functional teams to address data needs.

πŸͺ„ Skills: Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data engineering, CI/CD

Posted 2024-11-21

πŸ“ US

🧭 Full-Time

πŸ’Έ 206700 - 289400 USD per year

πŸ” Social media / Online community

Requirements:
  • MS or PhD in a quantitative discipline: engineering, statistics, operations research, computer science, informatics, applied mathematics, economics, etc.
  • 7+ years of experience with large-scale ETL systems, building clean, maintainable, object-oriented code (Python preferred).
  • Strong programming proficiency in Python, SQL, Spark, Scala.
  • Experience with data modeling, ETL concepts, and manipulating large structured and unstructured data.
  • Experience with data workflows (e.g., Airflow) and data visualization tools (e.g., Looker, Tableau).
  • Deep understanding of technical and functional designs for relational and MPP databases.
  • Proven track record of collaboration and excellent communication skills.
  • Experience in mentoring junior data scientists and analytics engineers.

Responsibilities:
  • Act as the analytics engineering lead within the Ads DS team and contribute to data science data quality and automation initiatives.
  • Ensure high-quality data through ETLs, reporting dashboards, and data aggregations for business tracking and ML model development.
  • Develop and maintain robust data pipelines and workflows for data ingestion, processing, and transformation.
  • Create user-friendly tools for internal use across Data Science and cross-functional teams.
  • Lead efforts to build a data-driven culture by enabling data self-service.
  • Provide mentorship and coaching to data analysts and act as a thought partner for data teams.

πŸͺ„ Skills: Leadership, Python, SQL, Data Analysis, ETL, Tableau, Strategy, Airflow, Data analysis, Data engineering, Data science, Spark, Communication Skills, Collaboration, Mentoring, Coaching

Posted 2024-11-21