Data Engineer

Posted 15 days ago

πŸ“ Location: Americas

πŸ” Industry: Code analysis solutions

πŸ—£οΈ Languages: English

πŸͺ„ Skills: AWSDockerETLData engineeringHaskellScala

Requirements:
  • Strong functional programming background (e.g., Haskell, Clojure, Scala, F# or similar).
  • Experience in data engineering and ETL development.
  • Familiarity with AWS, Docker, and workflow systems.
  • Strong problem-solving skills with a proactive approach to development.
  • Excellent communication skills and ability to work in a high-expectation environment.
Responsibilities:
  • Design and implement modular ETL components for the code analysis platform.
  • Leverage functional programming principles to build scalable and maintainable solutions.
  • Optimize performance and handle edge cases in data processing workflows.
  • Work with AWS, Docker, and workflow orchestration tools to enhance system efficiency.
  • Collaborate with cross-functional teams to align development with business goals.
  • Maintain high coding standards and contribute to a culture of engineering excellence.
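
The role centers on modular, functionally composed ETL stages. As a rough illustration only (assuming Scala 2.13+, and not the company's actual code or API), such a stage might look like the sketch below; RawEvent, ParsedEvent, and EtlSketch are hypothetical names invented for this example.

    // Hypothetical, illustrative sketch only (Scala 2.13+); not the platform's real code.
    final case class RawEvent(payload: String)
    final case class ParsedEvent(repo: String, linesChanged: Int)

    object EtlSketch {
      // Extract/parse: failure is modelled explicitly with Either instead of exceptions.
      def parse(raw: RawEvent): Either[String, ParsedEvent] =
        raw.payload.split(",") match {
          case Array(repo, lines) =>
            lines.trim.toIntOption
              .toRight(s"non-numeric line count: $lines")
              .map(ParsedEvent(repo.trim, _))
          case _ => Left(s"malformed payload: ${raw.payload}")
        }

      // Transform: a pure function, easy to test and to reuse in other pipelines.
      def normalise(e: ParsedEvent): ParsedEvent =
        e.copy(repo = e.repo.toLowerCase)

      // Load is passed in by the caller, so the stage stays modular and side effects
      // are confined to the edge of the pipeline.
      def run(events: List[RawEvent])(load: ParsedEvent => Unit): List[String] = {
        val (errors, parsed) = events.map(parse).partitionMap(identity)
        parsed.map(normalise).foreach(load)
        errors // malformed records are reported back rather than silently dropped
      }
    }

Modelling parse failures with Either keeps edge-case handling explicit: malformed records are surfaced to the caller instead of being silently dropped, and each stage remains a pure, independently testable function.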

Related Jobs

🔥 Principal Data Engineer
Posted about 1 hour ago

📍 AL, AR, AZ, CA (exempt only), CO, CT, FL, GA, ID, IL, IN, IA, KS, KY, MA, ME, MD, MI, MN, MO, MT, NC, NE, NJ, NM, NV, NY, OH, OK, OR, PA, SC, SD, TN, TX, UT, VT, VA, WA, and WI

🧭 Full-Time

🔍 Insurance

🏢 Company: Kin Insurance

  • 10+ years of experience in designing & architecting data systems, warehousing and/or ML Ops platforms
  • Proven experience in the design and architecture of large-scale data systems, including lakehouse and machine learning platforms, using modern cloud-based tools
  • Ability to communicate effectively with executives and team members
  • Architect and implement solutions using Databricks for advanced analytics, data processing, and machine learning workloads
  • Expertise in data architecture and design for both structured and unstructured data, and in data modeling across transactional, BI, and DS usage models
  • Fluency with modern cloud data stacks, SaaS solutions, and evolutionary architecture, enabling flexible, scalable, and cost-effective solutions
  • Expertise with all aspects of data management: data governance, data mastering, metadata management, data taxonomies and ontologies
  • Expertise in architecting and delivering highly scalable, flexible, and cost-effective cloud-based enterprise data solutions
  • Proven ability to develop and implement data architecture roadmaps and comprehensive implementation plans that align with business strategies
  • Experience working with data integration tools and Kafka for real-time data streaming to ensure seamless data flow across the organization
  • Expertise in dbt for transformations, pipelines, and data modeling
  • Experience with data analytics, BI, data reporting and data visualization
  • Experience with the insurance domain is a plus
  • Lead the overall data architecture design, including ingestion, storage, management, and machine learning engineering platforms.
  • Implement the architecture and create a vision for how data will flow through the organization using a federated approach.
  • Manage data governance across multiple systems: data mastering, metadata management, data definitions, semantic-layer design, data taxonomies, and ontologies.
  • Architect and deliver highly scalable, flexible, and cost-effective enterprise data solutions that support the development of architecture patterns and standards.
  • Help define the technology strategy and roadmaps for the portfolio of data platforms and services across the organization.
  • Ensure data security and compliance, working within industry regulations.
  • Design and document data architecture at multiple levels across conceptual, logical, and physical views.
  • Provide "hands-on" architectural guidance and leadership throughout the entire lifecycle of development projects.
  • Translate business requirements into conceptual and detailed technology solutions that meet organizational goals.
  • Collaborate with other architects, engineering directors, and product managers to align data solutions with business strategy.
  • Lead proof-of-concept projects to test and validate new tools or architectural approaches.
  • Stay current with industry trends, vendor product offerings, and evolving data technologies to ensure the organization leverages the best available tools.
  • Cross-train peers and mentor team members to share knowledge and build internal capabilities.

AWS, Leadership, Python, SQL, Apache Airflow, Cloud Computing, ETL, Kafka, Kubernetes, Machine Learning, Snowflake, Algorithms, Data engineering, Data science, Data Structures, REST API, Pandas, Spark, Tensorflow, Communication Skills, Analytical Skills, Collaboration, CI/CD, Problem Solving, Mentoring, Terraform, Microservices, JSON, Data visualization, Data modeling, Data analytics, Data management

🔥 Junior Data Engineer
Posted about 5 hours ago

📍 United States, Canada

🧭 Full-Time

💸 90,000 - 105,000 CAD per year

🔍 Blockchain Infrastructure

🏢 Company: Figment 👥 11-50 · Hospitality, Travel Accommodations, Art

  • At least 1 year of IT experience
  • Proficiency in SQL
  • Experience with a programming language (ideally Python)
  • Strong verbal and written communication skills
  • Experience in troubleshooting data quality issues
  • Skills in data analysis and visualization
  • Develop and maintain dashboards and reports
  • Investigate new chains for data collection
  • Review ingested data for quality issues
  • Automate manual processes
  • Collaborate with internal teams for data solutions

Python, SQL, Cloud Computing, Data Analysis, Git, Snowflake, Data engineering, Troubleshooting, Data visualization

🔥 Senior Data Engineer
Posted about 6 hours ago

📍 United States, Canada

🧭 Full-Time

🔍 Software Development

🏢 Company: Float.com

  • Expertise in machine learning and advanced algorithms
  • Proficient in Python or Java; comfortable with SQL and JavaScript/TypeScript
  • Experience with large-scale data pipelines and stream processing
  • Skilled in data integration, cleaning, and validation
  • Familiar with vector and graph databases
  • Lead technical viability discussions
  • Develop and test proofs of concept for the Resource Recommendation Engine
  • Conduct comprehensive analysis of existing data
  • Implement and maintain data streaming pipeline
  • Develop and implement advanced algorithms
  • Design, implement, and maintain streaming data architecture
  • Establish best practices for optimization and AI
  • Mentor and train team members

Python, SQL, Kafka, Machine Learning, Algorithms, Data engineering


πŸ“ Mexico

🧭 Full-Time

πŸ” Insurance

🏒 Company: CapgeminiπŸ‘₯ 10001-350000IT Services and IT Consulting

  • Advanced SQL/Snowflake experience (5+ years)
  • Advanced knowledge of Python (2+ years)
  • Intermediate Power BI experience (3+ years)
  • Experience in Data Analysis and Profiling
  • Analyze and translate business data stories
  • Design, build, and implement data products

Python, SQL, ETL, Snowflake

Posted 1 day ago

πŸ“ LATAM

🧭 Full-Time

πŸ” Healthtech

🏒 Company: UrrlyπŸ‘₯ 1-10Artificial Intelligence (AI)Business DevelopmentSalesInformation Technology

  • Proficiency in SQL and Python
  • Strong experience with PostgreSQL
  • Experience working with APIs and cloud-based ETL processes using Airflow
  • Power BI experience is a huge advantage
  • Design, develop, and maintain ETL pipelines
  • Build and manage cloud-based workflows using Airflow
  • Integrate data from multiple sources using APIs
  • Collaborate with cross-functional teams to support data-driven decision-making
  • Create insightful reports and dashboards using Power BI

PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, ETL

Posted 1 day ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ” B2B SaaS

🏒 Company: Sanity

  • 4+ years of experience building data pipelines at scale
  • Deep expertise in SQL, Python, and Node.js/TypeScript
  • Production experience with Airflow and RudderStack
  • Track record of building reliable data infrastructure
  • Design, develop, and maintain scalable ETL/ELT pipelines
  • Collaborate to implement and scale product telemetry
  • Establish best practices for data ingestion and transformation
  • Monitor and optimize data pipeline performance

Node.js, Python, SQL, Apache Airflow, ETL, TypeScript

Posted 2 days ago

πŸ“ United States, Canada

🧭 Full-Time

πŸ” E-commerce

  • Bachelor's or Master's degree in Computer Science or related field
  • 5+ years of experience in data engineering
  • Strong proficiency in SQL and database technologies
  • Experience with data pipeline orchestration tools
  • Proficiency in programming languages like Python and Scala
  • Hands-on experience with AWS cloud data services
  • Familiarity with big data frameworks like Apache Spark
  • Knowledge of data modeling and warehousing
  • Experience implementing CI/CD for data pipelines
  • Real-time data processing architectures experience
  • Design, develop, and maintain ETL/ELT pipelines
  • Optimize data architecture and storage solutions
  • Work with AWS for scalable data solutions
  • Ensure data quality, integrity, and security
  • Collaborate with cross-functional teams
  • Monitor and troubleshoot data workflows
  • Create APIs for analytical information

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Kafka, MySQL, Snowflake, CI/CD, Scala

Posted 2 days ago

πŸ“ AZ, CA, CO, FL, GA, KS, KY, IA, ID, IL, IN, LA, MA, ME, MI, MN, MO, NC, NH, NJ, NV, NY, OH, OK, OR, PA, RI, SC, SD, TN, TX, UT, VA, VT, WA, WI

πŸ’Έ 121595.0 - 202659.0 USD per year

πŸ” E-commerce

  • SQL
  • Performance Tuning and Optimization
  • Data Modeling
  • Azure SQL Administration, including Security
  • PowerShell
  • Azure Data Factory
  • Focus on growing as a data engineer in an Azure SQL environment.
  • Perform SQL programming and performance tuning.
  • Take on well-defined sub-tasks and complete them.
  • Create Power BI visualizations and paginated reports.
  • Model and implement databases and warehouses.
  • Demonstrate high energy.
  • Communicate status to the team effectively.
  • Exhibit ShipBob's core values, focusing on understanding and living them.
  • Accept feedback graciously and learn from everything you do.
  • Other duties/responsibilities as necessary.

SQL, Agile, Git, Azure, Data engineering, Data visualization, Data modeling, Data management

Posted 3 days ago

πŸ“ Europe, APAC, Americas

🧭 Full-Time

πŸ” Software Development

🏒 Company: DockerπŸ‘₯ 251-500πŸ’° $105,000,000 Series C almost 3 years agoDeveloper ToolsDeveloper PlatformInformation TechnologySoftware

  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable pipelines
  • Proficiency with Snowflake or BigQuery
  • Experience with data governance and security controls
  • Experience creating ETL scripts using Python and SQL
  • Familiarity with a cloud ecosystem: AWS/Azure/Google Cloud
  • Experience with Tableau or Looker
  • Manage and develop ETL jobs, warehouse, and event collection tools
  • Build and manage the Central Data Model for reporting
  • Integrate emerging methodologies and technologies
  • Build data pipelines for ML and AI projects
  • Contribute to SOC2 compliance across the data platform
  • Document technical architecture

Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 3 days ago

🔥 Data Engineer
Posted 3 days ago

📍 United States

🧭 Full-Time

🔍 Electric Vehicle Charging

  • 5+ years of experience in data analysis and architecture
  • ETL/ELT experience
  • Strong SQL knowledge
  • Experience with AWS and Snowflake
  • Ability to apply data quality principles
  • Assist with process improvements in data retrieval and transformation
  • Develop automated reporting solutions
  • Analyze and optimize existing implementations
  • Collaborate with engineers and analysts to deliver robust solutions
  • Maintain compliance with cybersecurity protocols

AWS, SQL, ETL, Snowflake, Data engineering, Data modeling
