Data Engineer

Posted 2024-10-26

πŸ’Ž Seniority level: Mid-level, minimum of three (3) years of experience

πŸ“ Location: United States

πŸ’Έ Salary: 100,000 - 130,000 USD per year

πŸ” Industry: Nonprofit, Civic Engagement, Data Analytics

🏒 Company: Murmuration (πŸ‘₯ 1-10 employees)

πŸ—£οΈ Languages: English

⏳ Experience: Minimum of three (3) years

πŸͺ„ Skills: AWS, Docker, Python, Software Development, Snowflake, Airflow, Data Engineering, Communication Skills, CI/CD, Problem Solving

Requirements:
  • Problem-solver with a passion for using data and technology to drive social impact.
  • Education and/or experience in Computer Science, Computer Engineering, or a relevant field.
  • Minimum of three (3) years of relevant experience in data engineering or a related field.
  • Curiosity and a drive to continuously learn and adapt to new technologies and challenges.
  • Familiarity with data orchestration tools (e.g., Dagster, Airflow) and ELT processes (e.g., dbt).
  • Familiarity with analytic databases (e.g., Snowflake) and cloud infrastructure (e.g., AWS).
  • Experience working flexibly within smaller teams.
  • Practical knowledge of software development lifecycle (SDLC).
  • Proficiency in Python, Docker, and container orchestration tools.
  • Understanding of CI/CD pipelines and automation tools.
  • Strong written and verbal communication skills.
Responsibilities:
  • Collaborate closely with cross-functional teams to understand challenges, design solutions, and implement data pipelines that meet both immediate and long-term needs.
  • Build and maintain scalable, reliable data pipelines using tools such as Dagster, Airflow, Snowflake, AWS, MongoDB, and dbt.
  • Manage data from various sources, ensuring timely ingestion, quality, and integrity.
  • Transform raw data into structured, usable formats that empower our analytical and product teams.
  • Implement and maintain robust monitoring, alerting, and documentation processes.
  • Continuously optimize our data infrastructure for performance and efficiency.
  • Provide support and troubleshooting for data-related issues across the organization.
  • Contribute to a culture of knowledge sharing and continuous improvement within the team.
Related Jobs

πŸ“ Greater San Francisco area

🧭 Full-Time

πŸ” Business Transformation Management (BTM), AI, SaaS

🏒 Company: Workhelix

  • 3+ years of software development experience, ideally in full-stack or data-centric roles.
  • Strong programming skills in Python and SQL, with familiarity in building statistical and machine learning models.
  • Proven ability to engage with enterprise customers and deliver solutions.
  • Exceptional problem-solving skills with a solution-oriented mindset.
  • Excellent communication skills to distill complex ideas into actionable insights.

  • Engage directly with clients to understand challenges and design custom technical solutions.
  • Build and deploy solutions integrating into client infrastructure.
  • Apply a problem-first mindset to break down complex problems into actionable tasks.
  • Map existing processes for analytical intervention.
  • Write clean, scalable code and support implementation of analytical solutions.
  • Collaborate with clients to iterate on solutions and deliver impactful results.

Python, Software Development, SQL, Machine Learning, Strategy, Data Science, Communication Skills, Collaboration, Problem Solving

Posted 2024-12-04

πŸ“ United States

πŸ’Έ 210,000 - 220,000 USD per year

πŸ” Healthcare

🏒 Company: Transcarent

  • Experienced: 10+ years in data engineering with a strong background in building and scaling data architectures.
  • Technical Expertise: Advanced working knowledge of SQL, relational databases, big data tools (e.g., Spark, Kafka), and cloud-based data warehousing (e.g., Snowflake).
  • Architectural Visionary: Experience in service-oriented and event-based architecture with strong API development skills.
  • Problem Solver: Manage and optimize processes for data transformation, metadata management, and workload management.
  • Collaborative Leader: Strong communication skills to present ideas clearly and lead cross-functional teams.
  • Project Management: Strong organizational skills, capable of leading multiple projects simultaneously.

  • Lead the Design and Implementation: Architect and implement cutting-edge data processing platforms and enterprise-wide data solutions using modern data architecture principles.
  • Scale Data Platform: Develop a scalable Platform for data extraction, transformation, and loading from various sources, ensuring data integrity and accessibility.
  • AI / ML platform: Design and build scalable AI and ML platforms for Transcarent use cases.
  • Collaborate Across Teams: Work with Executive, Product, Clinical, Data, and Design teams to meet their data infrastructure needs.
  • Optimize Data Pipelines: Build and optimize complex data pipelines for high performance and reliability.
  • Innovate and Automate: Create and maintain data tools and pipelines for analytics and data science.
  • Mentor and Lead: Provide leadership and mentorship to the data engineering team.

AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data Engineering, Spark, Communication Skills, Problem Solving, Organizational Skills

Posted 2024-12-03

πŸ“ United States, Latin America, India

πŸ” Cloud Data Technologies

🏒 Company: phData

  • At least four years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Programming expertise in Java, Python and/or Scala.
  • Core cloud data platforms: Snowflake, AWS, Azure, Databricks, and GCP.
  • SQL proficiency and the ability to write, debug, and optimize SQL queries.
  • Experience creating and delivering detailed presentations.

  • Develop end-to-end technical solutions into production.
  • Ensure performance, security, scalability, and robust data integration.
  • Client-facing communication and presentation delivery.
  • Create detailed solution documentation.

AWS, Python, Software Development, SQL, Elasticsearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, NoSQL, Spark, Communication Skills, Documentation

Posted 2024-12-03

πŸ“ NY, PA, CA, CO, DC, FL, IL, MD, NJ, MA, SC, TX, VA

🧭 Full-Time

πŸ’Έ 100,000 - 130,000 USD per year

πŸ” Credit rating agency

🏒 Company: KBRA

  • 3+ years of professional experience in Python development.
  • Strong understanding of the Python programming language.
  • Experience with web frameworks such as Django, Flask, and/or Dash.
  • Knowledge of relational databases (Snowflake preferred).
  • Experience with version control systems (e.g., Git).
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork skills.

  • Develop, test, and maintain scalable Python applications.
  • Collaborate with product managers, designers, and other engineers to deliver high-quality software.
  • Write clean, efficient, and reusable code following best practices.
  • Participate in code reviews to ensure code quality and share knowledge with the team.
  • Troubleshoot and debug issues in a timely manner.
  • Contribute to the design and architecture of new features and systems.
  • Stay up-to-date with the latest industry trends and technologies.

Python, Django, Flask, Git, Snowflake, Attention to Detail

Posted 2024-12-01
πŸ”₯ Data Engineer
πŸ“ CA, US / TX, US / FL, US

🧭 Temporary

πŸ” Healthcare

🏒 Company: Tekton Labs

  • Expertise in ETL processes and data integration.
  • Strong understanding of healthcare workflows and reporting requirements.
  • Proficiency in SQL and Python.
  • Hands-on experience with BI tools such as Freshpaint, Tableau, or Power BI.

  • Assess existing reporting infrastructure and refactor legacy systems to optimize performance.
  • Design schemas for healthcare workflows and centralize session form data.
  • Develop ETL processes for integrating patient data into reporting pipelines.
  • Deliver the first version (V1) of NIH reports and business performance reports.
  • Create an interactive BI dashboard using tools like Freshpaint, Tableau, or Power BI.
  • Define funnel metrics including conversion and drop-off rates and establish baseline metrics.

Python, SQL, ETL, Tableau, Documentation

Posted 2024-11-30

πŸ“ United States

🧭 Full-Time

πŸ’Έ 125,000 - 165,000 USD per year

🏒 Company: FSAStore.com

  • Bachelor’s degree in Computer Science, Engineering, or related field.
  • Minimum of five years of experience in data engineering roles with a focus on building and maintaining data pipelines.
  • Strong proficiency in Azure Synapse Analytics and Azure DevOps.
  • Expertise in Apache Spark pools.
  • Proficient in Python scripting and programming languages such as SQL and Java.
  • Senior level knowledge of data modeling and data warehousing concepts.
  • Strong communication skills to convey technical concepts to non-technical stakeholders.
  • Experience with data modeling, warehousing, and building ETL pipelines.
  • Familiarity with Azure Data Factory and Azure Data Lake Storage.

  • Design, develop, and maintain robust data pipelines and infrastructure to support data integration, processing, and storage.
  • Collaborate with analysts and stakeholders to understand data requirements and translate them into technical solutions.
  • Implement best practices for data governance, security, and quality assurance.
  • Explore and evaluate new technologies and tools to enhance data infrastructure.
  • Design and implement data models on Azure Synapse Analytics.
  • Utilize Azure DevOps for CI/CD and agile project management.
  • Manage and optimize Apache Spark pools for high-performance processing.
  • Develop and maintain ETL processes using Python.
  • Ensure data quality and integrity across platforms.
  • Collaborate with cross-functional teams.
  • Stay current with industry trends in Azure data services.

Project Management, Python, SQL, Agile, ETL, Java, Azure, Data Engineering, Spark, Communication Skills, DevOps

Posted 2024-11-27

πŸ“ US

πŸ’Έ 84,000 - 120,000 USD per year

πŸ” Consumer insights

  • Strong PL/SQL, SQL development skills.
  • Proficient in multiple programming languages used in data engineering such as Python and Java.
  • Minimum of 3-5 years of experience in data engineering with Oracle and MS SQL.
  • Experience with data warehousing concepts and technologies including cloud-based services (e.g. Snowflake).
  • Experience with cloud platforms like Azure and knowledge of infrastructure.
  • Experience with data orchestration tools (e.g. Azure Data Factory, DataBricks workflows).
  • Understanding of data privacy regulations and best practices.
  • Experience working with remote teams.
  • Experience working on a team with a CI/CD process.
  • Familiarity using tools like Git and Jira.
  • Great problem-solving abilities and work ethics.
  • Bachelor's degree in Computer Science or Computer Engineering.

  • Design, implement and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA.
  • Identify and resolve performance bottlenecks in pipelines to ensure efficient and timely data delivery.
  • Implement data quality checks and validations to ensure accuracy, completeness, and consistency of data delivery.
  • Work with Data Architect on data governance, quality, and security best practices.
  • Collaborate with cross-functional teams to identify and address data needs.
  • Define and document technical requirements.

Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data Engineering, CI/CD

Posted 2024-11-26

πŸ“ United States

πŸ” Data and technology

  • 5+ years of professional experience contributing code.
  • Experience with algorithms and data structures and knowing when to apply them.
  • Deep familiarity with Scala or Java.
  • Experience working with high-scale systems: realtime and batch.
  • Interest in data engineering to develop ingestion engines, ETL pipelines, and organizing data.
  • Experience in Machine Learning techniques and tools is a plus.

  • Be a senior member of the team by contributing to the architecture, design, and implementation of EMS systems.
  • Mentor junior engineers and promote their growth.
  • Lead technical projects, managing the planning, execution, and success of complex technical projects.
  • Collaborate with other engineering, product, and data science teams to ensure we're building the best products.
  • Be on call as required and accommodate Eastern Time Zone working hours.

SQL, ETL, GCP, Algorithms, Data Engineering, Data Structures, Spark, Collaboration

Posted 2024-11-26

πŸ“ United States, Latin America, India

🧭 Full-Time

πŸ” Modern data stack and cloud data services

  • At least four years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Ability to develop end-to-end technical solutions in production environments.
  • Proficient in Java, Python, or Scala.
  • Experience with core cloud data platforms like Snowflake, AWS, Azure, Databricks, and GCP.
  • Strong SQL skills, capable of writing, debugging, and optimizing queries.
  • Client-facing communication skills for presentations and documentation.
  • Bachelor's degree in Computer Science or a related field.

  • Develop end-to-end technical solutions and ensure their performance, security, and scalability.
  • Help with robust data integration and contribute to the production deployment of solutions.
  • Create and deliver detailed presentations to clients.
  • Document solutions including POCs, roadmaps, diagrams, and logical system views.

AWS, Python, Software Development, SQL, Elasticsearch, GCP, Hadoop, Java, Kafka, Snowflake, Airflow, Azure, Cassandra, NoSQL, Spark, Communication Skills, Documentation

Posted 2024-11-24

πŸ“ US

πŸ’Έ 84,000 - 120,000 USD per year

πŸ” Consumer insights

  • Strong PL/SQL, SQL development skills.
  • Proficient in multiple data engineering languages such as Python and Java.
  • 3-5 years of experience in data engineering specific to Oracle and MS SQL.
  • Experience with data warehousing technologies and cloud-based services like Snowflake.
  • Experience with cloud platforms such as Azure and infrastructure knowledge.
  • Familiarity with data orchestration tools like Azure Data Factory and DataBricks workflows.
  • Understanding of data privacy regulations and best practices.
  • Experience working on a remote team.
  • Familiarity with CI/CD processes and tools like Git, Jira.

  • Design, implement, and maintain scalable data pipelines and architecture.
  • Unit test and document solutions to meet product quality standards.
  • Identify and resolve performance bottlenecks in data pipelines.
  • Implement data quality checks and validation processes.
  • Collaborate with cross-functional teams to address data needs.
  • Ensure technology solutions support customer and organizational needs.
  • Define and document technical requirements.

Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data Engineering, CI/CD

Posted 2024-11-23