
Data Engineer

Posted 3 days ago

🔍 Industry: Software Development

🏢 Company: G2i Inc.

Requirements:
  • Strong functional programming background (e.g., Haskell, Clojure, Scala, F# or similar).
  • Experience in data engineering and ETL development.
  • Familiarity with AWS, Docker, and workflow systems.
  • Strong problem-solving skills with a proactive approach to development.
  • Excellent communication skills and ability to work in a high-expectation environment.
Responsibilities:
  • Design and implement modular ETL components for our code analysis platform.
  • Leverage functional programming principles to build scalable and maintainable solutions (see the sketch after this list).
  • Optimize performance and handle edge cases in data processing workflows.
  • Work with AWS, Docker, and workflow orchestration tools to enhance system efficiency.
  • Collaborate with cross-functional teams to align development with business goals.
  • Maintain high coding standards and contribute to a culture of engineering excellence.
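
The two responsibilities above around modular ETL components and functional programming lend themselves to a small illustration. The following is a minimal sketch only, not G2i's actual platform code: it assumes components can be modeled as pure record-to-record transforms, and every function and field name is hypothetical.

```python
# A minimal, functional-style sketch of a modular ETL component: each step is a
# pure record-to-record transform, and a pipeline is just their composition.
# All function and field names here are hypothetical.
from functools import reduce
from typing import Callable, Iterable

Record = dict
Transform = Callable[[Record], Record]

def compose(*steps: Transform) -> Transform:
    """Compose transforms left to right into a single pipeline stage."""
    return lambda record: reduce(lambda acc, step: step(acc), steps, record)

def normalize_paths(record: Record) -> Record:
    return {**record, "path": record["path"].lower()}

def add_line_count(record: Record) -> Record:
    return {**record, "line_count": len(record.get("source", "").splitlines())}

pipeline = compose(normalize_paths, add_line_count)

def run(records: Iterable[Record]) -> list[Record]:
    # Every record flows through the same composed, side-effect-free pipeline.
    return [pipeline(r) for r in records]

if __name__ == "__main__":
    print(run([{"path": "SRC/Main.hs", "source": 'main = putStrLn "hi"\n'}]))
```

Keeping each step a pure function makes components easy to test in isolation and to recombine into new pipelines.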

Related Jobs

📍 Mexico

🧭 Full-Time

🔍 IT

🏢 Company: Rehire

Requirements:
  • 8+ years of experience in data engineering (mandatory).
  • Advanced conversational English skills (C1/C2 level required).
  • Proficiency in Python, Scala, or Java within the data ecosystem.
  • Strong knowledge of SQL and NoSQL databases.
  • Experience with Snowflake, AWS, and distributed data processing systems.
  • Familiarity with data pipeline orchestration tools like Airflow.
  • Experience designing and implementing cross-cloud data pipelines (AWS, Snowflake, Databricks).
Responsibilities:
  • Design, develop, and optimize scalable data pipelines (see the orchestration sketch after this list).
  • Build and manage large datasets for analysis and processing.
  • Implement distributed data processing systems to handle large-scale data.
  • Work with SQL and NoSQL databases for efficient data storage and retrieval.
  • Collaborate with technical teams to design cloud-based data architectures.
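
Since the listing calls out Airflow orchestration and cross-cloud (AWS to Snowflake) pipelines, here is a rough sketch of what a daily load might look like. It assumes a recent Airflow 2.x with the TaskFlow API; the DAG id, task bodies, and record shapes are hypothetical placeholders rather than a working connector setup.

```python
# A rough Airflow 2.x (TaskFlow API) sketch of a daily extract-transform-load
# flow; the DAG id, task bodies, and record shapes are hypothetical. Real S3
# and Snowflake access would go through the corresponding provider hooks.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def s3_to_snowflake_daily():
    @task
    def extract() -> list[dict]:
        # Placeholder for reading raw rows from S3.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [{**row, "amount_cents": int(row["amount"] * 100)} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for writing the transformed rows to Snowflake.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))

s3_to_snowflake_daily()
```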

AWS, Python, SQL, ETL, Snowflake, Data engineering, Data modeling, Data management

Posted about 11 hours ago

🔥 Senior Data Engineer
Posted about 13 hours ago

📍 Madrid, Barcelona

🏢 Company: Clarity AI

Requirements:
  • 5+ years in data architecture, data engineering, or a related role, with hands-on experience in data modeling, schema design, and cloud-based data systems
  • Strong Python and SQL skills and proficiency with distributed data stores
  • Proficiency in schema design, data modeling, and building data products
  • Proficient in at least one major programming language, preferably Python, with a software engineering mindset for writing clean, maintainable code
  • Deep understanding of architectural principles in microservices, distributed systems, and data pipeline design
  • Familiarity with containerized environments, public/private API integrations, and security best practices
  • Strong communication and interpersonal skills, with experience working cross-functionally
  • Proven ability to guide teams and drive complex projects to successful completion
  • Self-starter, able to take ownership and initiative, with high energy and stamina
  • Decisive and action-oriented, able to make rapid decisions even with incomplete information
  • Highly motivated, independent and deeply passionate about sustainability and impact
  • Excellent oral and written English communication skills (minimum C1 level, proficient user)
Responsibilities:
  • Transforming raw data into intuitive, high-quality data models
  • Implementing and monitoring data quality checks (see the sketch after this list)
  • Collaborating across functions
  • Working closely with engineering and product teams
  • Acting as a bridge between technical and non-technical stakeholders
  • Leading initiatives to improve data practices
  • Guiding the team in experimenting with new tools and technologies
  • Continuously evolving the data architecture
  • Applying a pragmatic approach to performance metrics and scaling decisions
  • Implementing performance metrics to monitor system health
  • Maintaining comprehensive documentation of data systems, processes, and best practices
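
For the data quality checks mentioned above, a hand-rolled version can be as simple as the sketch below. This is a generic illustration using pandas with hypothetical column names; teams in this space often reach for a dedicated validation framework instead.

```python
# A generic illustration of lightweight data quality checks with pandas; the
# dataset and column names (company_id, esg_score) are hypothetical.
import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return human-readable descriptions of any quality violations."""
    issues = []
    if df["company_id"].isna().any():
        issues.append("company_id contains nulls")
    if df["company_id"].duplicated().any():
        issues.append("company_id contains duplicates")
    if ((df["esg_score"] < 0) | (df["esg_score"] > 100)).any():
        issues.append("esg_score outside the expected 0-100 range")
    return issues

if __name__ == "__main__":
    sample = pd.DataFrame({"company_id": [1, 2, 2], "esg_score": [55.0, 101.0, 72.0]})
    for issue in check_quality(sample):
        print("QUALITY ISSUE:", issue)
```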

Docker, Python, SQL, Apache Airflow, Cloud Computing, Kubernetes, Algorithms, Data engineering, Data Structures, Postgres, REST API, Communication Skills, Analytical Skills, CI/CD, Problem Solving, Agile methodologies, Microservices, Data visualization, Data modeling, Data analytics, Data management

Posted about 13 hours ago

🔥 Principal Data Engineer
Posted about 16 hours ago

📍 AL, AR, AZ, CA (exempt only), CO, CT, FL, GA, ID, IL, IN, IA, KS, KY, MA, ME, MD, MI, MN, MO, MT, NC, NE, NJ, NM, NV, NY, OH, OK, OR, PA, SC, SD, TN, TX, UT, VT, VA, WA, and WI

🧭 Full-Time

🔍 Insurance

🏢 Company: Kin Insurance

Requirements:
  • 10+ years of experience in designing and architecting data systems, warehousing, and/or MLOps platforms
  • Proven experience in the design and architecture of large-scale data systems, including lakehouse and machine learning platforms, using modern cloud-based tools
  • Can communicate effectively with executives and team members
  • Architect and implement solutions using Databricks for advanced analytics, data processing, and machine learning workloads
  • Expertise in data architecture and design, for both structured and unstructured data. Expertise in data modeling across transactional, BI and DS usage models
  • Fluency with modern cloud data stacks, SaaS solutions, and evolutionary architecture, enabling flexible, scalable, and cost-effective solutions
  • Expertise with all aspects of data management: data governance, data mastering, metadata management, data taxonomies and ontologies
  • Expertise in architecting and delivering highly-scalable and flexible, cost effective, cloud-based enterprise data solutions
  • Proven ability to develop and implement data architecture roadmaps and comprehensive implementation plans that align with business strategies
  • Experience working with data integration tools and Kafka for real-time data streaming to ensure seamless data flow across the organization (see the streaming sketch below)
  • Expertise in dbt for transformations, pipelines, and data modeling
  • Experience with data analytics, BI, data reporting and data visualization
  • Experience with the insurance domain is a plus
Responsibilities:
  • Lead the overall data architecture design, including ingestion, storage, management, and machine learning engineering platforms.
  • Implement the architecture and create a vision for how data will flow through the organization using a federated approach.
  • Manage data governance across multiple systems: data mastering, metadata management, data definitions, semantic-layer design, data taxonomies, and ontologies.
  • Architect and deliver highly scalable, flexible, and cost-effective enterprise data solutions that support the development of architecture patterns and standards.
  • Help define the technology strategy and roadmaps for the portfolio of data platforms and services across the organization.
  • Ensure data security and compliance, working within industry regulations.
  • Design and document data architecture at multiple levels across conceptual, logical, and physical views.
  • Provide "hands-on" architectural guidance and leadership throughout the entire lifecycle of development projects.
  • Translate business requirements into conceptual and detailed technology solutions that meet organizational goals.
  • Collaborate with other architects, engineering directors, and product managers to align data solutions with business strategy.
  • Lead proof-of-concept projects to test and validate new tools or architectural approaches.
  • Stay current with industry trends, vendor product offerings, and evolving data technologies to ensure the organization leverages the best available tools.
  • Cross-train peers and mentor team members to share knowledge and build internal capabilities.
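
The Kafka-based real-time streaming requirement referenced above could look roughly like the consumer sketch below. It uses the kafka-python client purely for illustration; the topic name, message shape, and downstream landing step are hypothetical assumptions, not Kin's actual setup.

```python
# A minimal consumer sketch for real-time ingestion from Kafka using the
# kafka-python client; the topic name and message shape are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "policy-events",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="data-platform-ingest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real pipeline this record would be validated and landed in the
    # lakehouse (for example, written to object storage or a warehouse table).
    print(message.topic, message.offset, event.get("policy_id"))
```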

AWS, Leadership, Python, SQL, Apache Airflow, Cloud Computing, ETL, Kafka, Kubernetes, Machine Learning, Snowflake, Algorithms, Data engineering, Data science, Data Structures, REST API, Pandas, Spark, Tensorflow, Communication Skills, Analytical Skills, Collaboration, CI/CD, Problem Solving, Mentoring, Terraform, Microservices, JSON, Data visualization, Data modeling, Data analytics, Data management

Posted about 16 hours ago

🔥 Junior Data Engineer
Posted about 20 hours ago

📍 United States, Canada

🧭 Full-Time

💸 90,000 - 105,000 CAD per year

🔍 Blockchain Infrastructure

🏢 Company: Figment · 👥 11-50 · Hospitality, Travel Accommodations, Art

Requirements:
  • At least 1 year of IT experience
  • Proficiency in SQL
  • Experience with a programming language (ideally Python)
  • Strong verbal and written communication skills
  • Experience in troubleshooting data quality issues
  • Skills in data analysis and visualization
Responsibilities:
  • Develop and maintain dashboards and reports
  • Investigate new chains for data collection (see the JSON-RPC sketch after this list)
  • Review ingested data for quality issues
  • Automate manual processes
  • Collaborate with internal teams for data solutions
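
For the "investigate new chains" responsibility above, a first probe of an EVM-compatible chain often starts with a plain JSON-RPC call, as in the rough sketch below; the endpoint URL is a placeholder, and non-EVM chains expose different interfaces.

```python
# A rough sketch of probing an EVM-compatible chain over JSON-RPC before wiring
# it into ingestion; the endpoint URL is a placeholder, and non-EVM chains
# expose different RPC methods.
import requests

def latest_block_number(rpc_url: str) -> int:
    payload = {"jsonrpc": "2.0", "method": "eth_blockNumber", "params": [], "id": 1}
    response = requests.post(rpc_url, json=payload, timeout=10)
    response.raise_for_status()
    # The result is a hex-encoded block height, e.g. "0x12a05f2".
    return int(response.json()["result"], 16)

if __name__ == "__main__":
    print(latest_block_number("https://example-rpc.invalid"))  # placeholder endpoint
```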

Python, SQL, Cloud Computing, Data Analysis, Git, Snowflake, Data engineering, Troubleshooting, Data visualization

Posted about 20 hours ago

📍 LATAM

🧭 Full-Time

🔍 Healthtech

🏢 Company: Urrly · 👥 1-10 · Artificial Intelligence (AI), Business Development, Sales, Information Technology

Requirements:
  • Proficiency in SQL and Python
  • Strong experience with PostgreSQL
  • Experience working with APIs and cloud-based ETL processes using Airflow
  • Power BI experience is a huge advantage
Responsibilities:
  • Design, develop, and maintain ETL pipelines
  • Build and manage cloud-based workflows using Airflow
  • Integrate data from multiple sources using APIs
  • Collaborate with cross-functional teams to support data-driven decision-making
  • Create insightful reports and dashboards using Power BI

PostgreSQL, Python, SQL, Apache Airflow, Cloud Computing, ETL

Posted 2 days ago

📍 United States, Canada

🧭 Full-Time

🔍 B2B SaaS

🏢 Company: Sanity

Requirements:
  • 4+ years of experience building data pipelines at scale
  • Deep expertise in SQL, Python, and Node.js/TypeScript
  • Production experience with Airflow and RudderStack
  • Track record of building reliable data infrastructure
Responsibilities:
  • Design, develop, and maintain scalable ETL/ELT pipelines
  • Collaborate on implementing and scaling product telemetry
  • Establish best practices for data ingestion and transformation
  • Monitor and optimize data pipeline performance

Node.js, Python, SQL, Apache Airflow, ETL, TypeScript

Posted 3 days ago

📍 United States, Canada

🧭 Full-Time

🔍 E-commerce

Requirements:
  • Bachelor's or Master's degree in Computer Science or a related field
  • 5+ years of experience in data engineering
  • Strong proficiency in SQL and database technologies
  • Experience with data pipeline orchestration tools
  • Proficiency in programming languages like Python and Scala
  • Hands-on experience with AWS cloud data services
  • Familiarity with big data frameworks like Apache Spark
  • Knowledge of data modeling and warehousing
  • Experience implementing CI/CD for data pipelines
  • Experience with real-time data processing architectures
Responsibilities:
  • Design, develop, and maintain ETL/ELT pipelines
  • Optimize data architecture and storage solutions
  • Work with AWS for scalable data solutions
  • Ensure data quality, integrity, and security
  • Collaborate with cross-functional teams
  • Monitor and troubleshoot data workflows
  • Create APIs that expose analytical information (see the sketch after this list)
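
For the API responsibility flagged above, a minimal way to expose analytical results over REST might look like the sketch below. Flask is used purely for illustration; the route, metric, and hard-coded rows are hypothetical stand-ins for a real warehouse query.

```python
# A minimal illustration of serving analytical results over REST with Flask;
# the route, metric, and hard-coded rows are hypothetical stand-ins for a
# query against the warehouse.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/metrics/daily-orders")
def daily_orders():
    rows = [
        {"date": "2024-01-01", "orders": 1280},
        {"date": "2024-01-02", "orders": 1342},
    ]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=8000)
```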

AWS, PostgreSQL, Python, SQL, Apache Airflow, ETL, Kafka, MySQL, Snowflake, CI/CD, Scala

Posted 3 days ago

📍 AZ, CA, CO, FL, GA, KS, KY, IA, ID, IL, IN, LA, MA, ME, MI, MN, MO, NC, NH, NJ, NV, NY, OH, OK, OR, PA, RI, SC, SD, TN, TX, UT, VA, VT, WA, WI

💸 121,595 - 202,659 USD per year

🔍 E-commerce

Requirements:
  • SQL
  • Performance Tuning and Optimization
  • Data Modeling
  • Azure SQL Administration, including Security
  • PowerShell
  • Azure Data Factory
Responsibilities:
  • Focus on growing as a data engineer in an Azure SQL environment.
  • Performing SQL programming and performance tuning.
  • Capable of taking well-defined sub-tasks and completing them.
  • Creating Power BI visualizations and paginated reports.
  • Modeling and implementing databases and warehouses.
  • Demonstrates high energy.
  • Effective in communicating status to the team.
  • Exhibits ShipBob's core values, focusing on understanding and living them.
  • Accepts feedback graciously and learns from everything they do.
  • Other duties/responsibilities as necessary.

SQL, Agile, Git, Azure, Data engineering, Data visualization, Data modeling, Data management

Posted 3 days ago

📍 Europe, APAC, Americas

🧭 Full-Time

🔍 Software Development

🏢 Company: Docker · 👥 251-500 · 💰 $105,000,000 Series C almost 3 years ago · Developer Tools, Developer Platform, Information Technology, Software

Requirements:
  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable pipelines
  • Proficiency with Snowflake or BigQuery
  • Experience with data governance and security controls
  • Experience creating ETL scripts using Python and SQL
  • Familiarity with a cloud ecosystem: AWS/Azure/Google Cloud
  • Experience with Tableau or Looker
Responsibilities:
  • Manage and develop ETL jobs, warehouse, and event collection tools
  • Build and manage the Central Data Model for reporting
  • Integrate emerging methodologies and technologies
  • Build data pipelines for ML and AI projects
  • Contribute to SOC2 compliance across the data platform
  • Document technical architecture

Python, SQL, ETL, Snowflake, Airflow, Data engineering, Data visualization, Data modeling

Posted 3 days ago

📍 United States

🧭 Full-Time

💸 121,595 - 202,659 USD per year

🔍 Supply Chain and Fulfillment Technology

🏢 Company: ShipBob, Inc.

Requirements:
  • 6-9 years of experience
  • Excellent SQL skills
  • Experience with Azure Data Factory
  • Experience in Performance Tuning and Optimization
  • Experience with Data Modeling
Responsibilities:
  • Perform SQL programming and performance tuning
  • Create Power BI visualizations and paginated reports
  • Model and implement databases and warehouses

SQL, Data modeling

Posted 4 days ago

Related Articles

Posted 6 months ago

Insights into the evolving landscape of remote work in 2024 reveal the importance of certifications and continuous learning. This article breaks down emerging trends, sought-after certifications, and provides practical solutions for enhancing your employability and expertise. What skills will be essential for remote job seekers, and how can you navigate this dynamic market to secure your dream role?

Posted 6 months ago

Explore the challenges and strategies of maintaining work-life balance while working remotely. Learn about unique aspects of remote work, associated challenges, historical context, and effective strategies to separate work and personal life.

Posted 6 months ago

Google is gearing up to expand its remote job listings, promising more opportunities across various departments and regions. Find out how this move can benefit job seekers and impact the market.

Posted 6 months ago

Learn about the importance of pre-onboarding preparation for remote employees, including checklist creation, documentation, tools and equipment setup, communication plans, and feedback strategies. Discover how proactive pre-onboarding can enhance job performance, increase retention rates, and foster a sense of belonging from day one.

Posted 6 months ago

The article explores the current statistics for remote work in 2024, covering the percentage of the global workforce working remotely, growth trends, popular industries and job roles, geographic distribution of remote workers, demographic trends, work models comparison, job satisfaction, and productivity insights.