Data Engineer

Posted 2024-11-20

💎 Seniority level: Mid-level, 5+ years in the analytics stack, 3+ years with SQL templating engines

📍 Location: Argentina; Spain; England, United Kingdom; Lisbon, Portugal

🔍 Industry: Web3

🏢 Company: Reown

🗣️ Languages: English

⏳ Experience: 5+ years in the analytics stack, 3+ years with SQL templating engines

🪄 Skills: AWS, Python, SQL, Data Analysis, Design Patterns, ETL, GCP, Git, Tableau, Azure, ClickHouse, Pandas, Spark, Communication Skills, CI/CD, Terraform, Written Communication

Requirements:
  • 5+ years working in the analytics stack within a fast-paced environment.
  • 3+ years of production experience with SQL templating engines such as dbt.
  • Experience with distributed query engines (BigQuery, Athena, Spark), data warehouses, and BI tools.
  • Strong understanding of software engineering principles, coding standards, design patterns, version control (e.g., Git), testing methodologies, and CI/CD processes.
  • Experience with AWS/GCP/Azure services for deployment and management.
  • Familiarity with GitHub, CI/CD pipelines, GitHub Actions, and Terraform.
  • Ability to write Python scripts for ETL processes and data manipulation.
  • Proficient with libraries like pandas for analysis and transformation.
  • Experience handling various data formats (e.g., CSV, JSON, Parquet); see the sketch after this list.
  • Strong problem-solving skills and communication abilities to discuss technical concepts.
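
For illustration only: a minimal sketch of the kind of pandas ETL script these requirements describe, assuming hypothetical input files (`events.csv`, `events.json`) and an `event_id`/`user_id`/`amount` schema; it is not Reown's actual pipeline.

```python
import pandas as pd

def extract(csv_path: str, json_path: str) -> pd.DataFrame:
    """Load raw events from CSV and newline-delimited JSON, then combine."""
    events_csv = pd.read_csv(csv_path)
    events_json = pd.read_json(json_path, lines=True)
    return pd.concat([events_csv, events_json], ignore_index=True)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate, coerce types, and aggregate amounts per user."""
    df = df.drop_duplicates(subset="event_id")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    return df.groupby("user_id", as_index=False)["amount"].sum()

def load(df: pd.DataFrame, out_path: str) -> None:
    """Write the result as Parquet for downstream consumers."""
    df.to_parquet(out_path, index=False)

if __name__ == "__main__":
    frame = extract("events.csv", "events.json")  # hypothetical inputs
    load(transform(frame), "user_totals.parquet")
```
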
Responsibilities:
  • Write complex SQL queries that extract and combine data from on-chain and off-chain logs for analytics.
  • Create dashboards and tools for team data discoverability and KR tracking.
  • Perform deep-dive analyses into specific topics for internal stakeholders.
  • Help design, implement, and evolve Reown's on-chain data infrastructure.
  • Build, maintain, and monitor end-to-end data pipelines for new datasets and features.
  • Write health checks and alerts to ensure data correctness, consistency, and freshness; see the sketch after this list.
  • Meet with product managers and stakeholders to understand data needs and detect new product opportunities.
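
The health-check responsibility above lends itself to a short sketch. This assumes a generic DB-API connection, a hypothetical `events` table with an `ingested_at` column, and a caller-supplied `notify` function; it is an illustration, not Reown's monitoring code.

```python
import datetime as dt

FRESHNESS_SLA = dt.timedelta(hours=2)  # hypothetical freshness target

def is_fresh(conn) -> bool:
    """True if the newest row in `events` is within the SLA window.
    `conn` is any DB-API 2.0 connection; the posting does not name a
    warehouse driver, so this stays generic."""
    cur = conn.cursor()
    cur.execute("SELECT MAX(ingested_at) FROM events")  # hypothetical column
    latest = cur.fetchone()[0]
    return latest is not None and dt.datetime.utcnow() - latest <= FRESHNESS_SLA

def alert_if_stale(conn, notify) -> None:
    """Call `notify` (e.g., a Slack-webhook wrapper) when data goes stale."""
    if not is_fresh(conn):
        notify("events table breached its freshness SLA")
```
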
Apply

Related Jobs

Apply

📍 Latin America

🧭 Contract

💸 1300 - 2300 USD per month

🔍 Talent-as-a-Service

🏢 Company: GoFasti

  • Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent work experience.
  • 4+ years working as a Data Engineer or similar role focusing on data infrastructure.
  • Experience with AWS (Redshift, RDS) and Google Cloud (BigQuery, Firebase).
  • Proficiency in at least one programming language (Python, Java, Scala).
  • Hands-on experience with AWS Kinesis for real-time data ingestion; a minimal sketch follows this list.
  • Familiarity with data pipeline orchestration tools such as dbt and Airflow.
  • Strong knowledge of data warehousing concepts and ETL best practices.
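
As a rough illustration of the Kinesis requirement, the sketch below publishes one event with boto3; the stream name and the `user_id` partition-key field are hypothetical.

```python
import json

import boto3

kinesis = boto3.client("kinesis")  # region/credentials from the AWS config chain

def publish_event(event: dict, stream: str = "ingest-events") -> None:
    """Send one JSON event to a Kinesis stream, keyed by user for ordering."""
    kinesis.put_record(
        StreamName=stream,                       # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["user_id"]),      # assumed event field
    )

publish_event({"user_id": 42, "action": "signup"})
```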

  • Enhance and maintain the infrastructure powering analytics and data products.
  • Ingest new data from files, APIs, and external databases into a centralized data store.
  • Continuously monitor, optimize, and troubleshoot data pipelines and infrastructure.
  • Work closely with business stakeholders to understand data needs and translate requirements into solutions.
  • Implement and enforce data governance policies for security, integrity, and compliance.
  • Drive efforts to ensure high data quality by identifying and resolving pipeline issues.
  • Participate in architecture discussions and engage in peer code reviews.

AWS, Python, ETL, Airflow, Firebase, Attention to detail, Compliance

Posted 2024-11-19
Apply

📍 Latin America

🧭 Contract

💸 800 - 1200 USD per month

🔍 Talent-as-a-Service

🏢 Company: GoFasti

  • Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent work experience.
  • 1+ years working as a Data Engineer or in a similar technical role.
  • Experience with AWS (Redshift, RDS) and Google Cloud (BigQuery, Firebase).
  • Proficiency in at least one programming language (Python, Java, Scala).
  • Hands-on experience with real-time data streaming solutions (AWS Kinesis).
  • Familiarity with data pipeline orchestration tools such as dbt and Airflow; a minimal DAG sketch follows this list.
  • Strong knowledge of data warehousing concepts and ETL best practices.
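
To make the orchestration requirement concrete, here is a minimal Airflow 2.x DAG sketch (the `schedule` argument assumes Airflow 2.4+); the DAG id and placeholder callables are hypothetical, not part of the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Placeholder: pull raw data from a source system."""
    ...

def load():
    """Placeholder: write transformed data to the warehouse."""
    ...

with DAG(
    dag_id="daily_ingest",             # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ argument
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # extract must finish before load starts
```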

  • Enhance and maintain the infrastructure powering analytics and data products.
  • Ingest new data from files, APIs, and external databases into a centralized data store.
  • Continuously monitor, optimize, and troubleshoot data pipelines.
  • Work closely with business stakeholders to understand data needs.
  • Implement and enforce data governance policies.
  • Contribute to ensuring high data quality across the organization.
  • Participate in technical architecture discussions.

AWS, Python, ETL, Airflow, Firebase, Attention to detail, Compliance

Posted 2024-11-19
Apply

📍 Latin America and other parts of the world

🔍 Insurance

  • Experience in Data Engineering.
  • Proficient in Python or Scala.
  • Excellent communication skills.
  • Attention to detail and strong problem-solving abilities.
  • Ability to work in an Agile environment.

  • Responsible for development and maintenance of systems in enterprise data and analytics environments.
  • Optimize performance by building and supporting decision-making tools.
  • Collaborate closely with software engineering, AI/ML, Cybersecurity, and DevOps/SysOps teams.
  • Design data pipelines and databases, build infrastructure and alerting frameworks.
  • Process data from a range of sources, from SFTP file drops to APIs; see the sketch after this list.
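
A minimal sketch of an SFTP-to-API hop, assuming a CSV file on the remote host and a generic JSON endpoint; the host, credentials, paths, and URL are all hypothetical.

```python
import csv
import io

import paramiko
import requests

def sftp_to_api(host: str, user: str, key_path: str,
                remote_path: str, api_url: str) -> None:
    """Pull a CSV over SFTP and post each row to an HTTP API as JSON."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=user, key_filename=key_path)
    try:
        sftp = ssh.open_sftp()
        with sftp.open(remote_path) as fh:
            text = fh.read().decode("utf-8")
    finally:
        ssh.close()
    for row in csv.DictReader(io.StringIO(text)):
        requests.post(api_url, json=row, timeout=10).raise_for_status()
```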

Python, Software Development, Agile, Data engineering, DevOps, Attention to detail

Posted 2024-11-15
Apply

📍 Latin America, United States, Canada

🔍 Life insurance

  • The ideal candidate will be independent and a great communicator.
  • Attention to detail is critical.
  • Must possess problem-solving skills.

  • Develop and maintain enterprise data and analytics systems for a US client.
  • Optimize performance by building and supporting decision-making tools.
  • Collaborate closely with software engineering, AI/ML, Cybersecurity, and DevOps/SysOps teams.
  • Support end-to-end data pipelines using Python or Scala.
  • Participate in Agile framework-related tasks.

Python, Software Development, Agile, Data engineering, DevOps, Attention to detail

Posted 2024-11-15
Apply
🔥 Big Data Engineer
Posted 2024-11-15

📍 Spain

🧭 Full-Time

🔍 Technology

🏢 Company: Plain Concepts

  • At least 4 years of experience in software/data engineering.
  • Experience in designing architectures.
  • Strong proficiency in Python or Scala and Spark for processing large data volumes.
  • Solid experience with cloud platforms (Azure or AWS).
  • Experience in creating CI/CD data pipelines.
  • Familiarity with testing methodologies (unit and integration).
  • Knowledge of SQL and NoSQL databases.
  • Good command of English (mandatory).
  • Desirable experience with BI tools like Power BI, and technologies like Databricks, Snowflake, Fabric, and IaC.

  • Engage in projects from initial client interactions to understand business needs and propose suitable technical solutions.
  • Develop projects from scratch with minimal supervision, collaborating with the team.
  • Participate in architecture design and decision-making in a constructive environment.
  • Contribute to good practices for clean and reusable code.
  • Develop ETLs using Spark in Python or Scala; a minimal PySpark sketch follows this list.
  • Execute projects in cloud environments like Azure or AWS.
  • Build scalable pipelines using various technologies.
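
For illustration, a minimal PySpark ETL sketch; the S3 paths, the `status`/`created_at`/`amount` columns, and the daily-revenue aggregation are all hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical paths and columns: raw JSON orders in, daily revenue out.
orders = spark.read.json("s3://raw-bucket/orders/")
daily = (
    orders
    .filter(F.col("status") == "completed")
    .groupBy(F.to_date("created_at").alias("day"))
    .agg(F.sum("amount").alias("revenue"))
)
daily.write.mode("overwrite").parquet("s3://curated-bucket/daily_revenue/")
```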

AWS, Python, SQL, Agile, Azure, NoSQL, Spark, CI/CD

Apply

📍 North America, South America, Europe

💸 100000 - 500000 USD per year

🔍 Web3, blockchain

🏢 Company: Edge & Node

  • A self-motivated team member with keen attention to detail.
  • Proactive collaboration with team members and a willingness to adapt to a growing environment.
  • Familiarity and experience with Rust, particularly focusing on data transformation and ingestion.
  • A strong understanding of blockchain data structures and ingestion interfaces.
  • Experience in real-time data handling, including knowledge of reorg handling.
  • Familiarity with blockchain clients like Geth and Reth is a plus.
  • Adaptability to a dynamic and fully-remote work environment.
  • Rigorous approach to software development that reflects a commitment to excellence.

  • Develop and maintain data ingestion adapters for various blockchain networks and web3 protocols.
  • Implement data ingestion strategies for both historical and recent data.
  • Apply strategies for handling block reorgs; see the sketch after this list.
  • Optimize the latency of block ingestion at the chain head.
  • Write interfaces with file storage protocols such as IPFS and Arweave.
  • Collaborate with upstream data sources, such as chain clients and tracing frameworks, and monitor the latest upstream developments.
  • Perform data quality checks, cross-checking data across multiple sources and investigating any discrepancies that arise.
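
A sketch of one common reorg-handling strategy (rewind to the fork point, then re-ingest the canonical branch). `chain` (an RPC client) and `store` (ingested-data storage) are hypothetical interfaces, not Edge & Node's actual code.

```python
def handle_new_block(chain, store, block: dict) -> None:
    """Append a block; on a parent-hash mismatch, rewind to the fork point
    and re-ingest the canonical branch."""
    if block["parent_hash"] == store.get_hash(block["height"] - 1):
        store.insert(block)                    # happy path: chain extends
        return
    fork = find_common_ancestor(chain, store, block["height"] - 1)
    store.delete_above(fork)                   # drop orphaned rows
    for height in range(fork + 1, block["height"] + 1):
        store.insert(chain.get_block(height))  # re-ingest canonical blocks

def find_common_ancestor(chain, store, height: int) -> int:
    """Walk back until the locally stored hash matches the canonical chain;
    everything above that height belongs to the orphaned branch."""
    while height > 0 and store.get_hash(height) != chain.get_hash(height):
        height -= 1
    return height
```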

Software Development, Blockchain, Data Structures, Rust, Collaboration, Attention to detail

Posted 2024-11-15
Apply

📍 United Kingdom

🧭 Contract

🏢 Company: Axiom Software Solutions Limited

  • Expertise in L2/L3 IP protocols such as HSRP, OSPF, BGP, MPLS, and VRF.
  • Deep knowledge of configuration and troubleshooting of IP protocols.
  • Experience in multi-vendor environments specialized in Viptela SD-WAN, Cisco SD-Access.
  • CCNP-ENCOR or CCIE Enterprise Infrastructure certified.
  • Good understanding of Data Center infrastructure and connectivity.
  • Knowledge of Vulnerability Management and Lifecycle Management.

  • Extensive support for data center (DC) operations.
  • Expert troubleshooting of network-related issues; see the sketch after this list.
  • Interaction with customers to understand network and service delivery requirements.
  • Handling deployment and implementation of network solutions.
  • Preparing high-level and low-level design documents.
  • Providing consultation on network design with SD-WAN and SD-Access.
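
As a loose illustration of day-to-day troubleshooting automation, a netmiko sketch that pulls a BGP summary from a Cisco device; the device record is a placeholder and the approach is an assumption, not part of the posting.

```python
from netmiko import ConnectHandler

# Placeholder device record; real credentials would come from a vault.
DEVICE = {
    "device_type": "cisco_ios",
    "host": "10.0.0.1",
    "username": "netops",
    "password": "example-only",
}

def bgp_neighbor_summary() -> str:
    """Pull the BGP summary so neighbor states can be reviewed in one pass."""
    with ConnectHandler(**DEVICE) as conn:
        return conn.send_command("show ip bgp summary")

if __name__ == "__main__":
    print(bgp_neighbor_summary())
```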

Cisco, Project Coordination, Communication Skills, Collaboration, Problem Solving, Attention to detail, Organizational skills, Presentation skills, Time Management, Written communication, Documentation

Posted 2024-11-14
Apply
🔥 Senior Data Engineer
Posted 2024-11-13

📍 United Kingdom

🔍 Payment and Financial Services

🏢 Company: Vitesse PSP

  • Experience with data pipeline orchestration tools such as Airflow, Luigi, or similar.
  • Experience with version control systems and CI/CD best practices using GitHub Actions.
  • Knowledge of data governance, privacy regulations (e.g., GDPR), and security best practices.
  • Proficiency with SQL and experience with distributed data processing tools such as Apache Spark.
  • Strong understanding of relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Impala, Cassandra).
  • Experience with cloud infrastructure (Docker and Kubernetes, Terraform).
  • Experience in AWS platform architecture and cloud services.
  • A collaborative team member with Agile experience.
  • Familiarity with stream processing technologies (Kafka or Kinesis).
  • Nice to have: Experience with machine learning frameworks and pipelines, Delta Live Tables, Great Expectations, search optimizers (ElasticSearch/Lucene), REST alternatives (GraphQL, AsyncAPI), data science kits (Jupyter, Anaconda).

  • Design, build, and maintain scalable data pipelines and architectures to handle large volumes of structured and unstructured data.
  • Develop, enhance, and optimize ELT processes for ingesting, processing, and distributing data across multiple platforms in real time.
  • Build and manage data warehouses to support advanced analytics, reporting, and machine learning.
  • Implement data governance, quality checks, and validation processes to ensure the accuracy, consistency, observability, and security of data.
  • Optimize query performance and data storage costs through techniques like partitioning, indexing, vacuuming, and compression; see the sketch after this list.
  • Build monitoring and alerting systems for data pipelines to proactively detect and resolve issues.
  • Optimize existing data pipelines for better performance, cost-efficiency, and scalability.
  • Work with data scientists, analysts, and business stakeholders to understand data needs.
  • Continuously research and integrate cutting-edge data technologies, tools, and practices to improve data engineering processes.
  • Team up with product engineers to identify, root cause, and resolve bugs.
  • Update documentation to help users navigate data products.
  • Ensure the data platform performs well and is always available for blue-chip clients.
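
The partitioning point above can be made concrete. Below is a sketch using PostgreSQL-style range partitioning over a generic DB-API connection; the `events` table and the November window are hypothetical.

```python
# Assumes a PostgreSQL-style warehouse and a generic DB-API connection.
DDL = """
CREATE TABLE IF NOT EXISTS events (
    event_id   BIGINT,
    created_at DATE,
    payload    TEXT
) PARTITION BY RANGE (created_at);

CREATE TABLE IF NOT EXISTS events_2024_11
    PARTITION OF events
    FOR VALUES FROM ('2024-11-01') TO ('2024-12-01');
"""

# Filtering on the partition key lets the planner prune every partition
# outside November, which is where the scan-time and cost savings come from.
QUERY = """
SELECT COUNT(*)
FROM events
WHERE created_at >= '2024-11-01' AND created_at < '2024-12-01';
"""

def count_november_events(conn) -> int:
    cur = conn.cursor()
    cur.execute(DDL)    # some drivers require one execute() per statement
    cur.execute(QUERY)
    return cur.fetchone()[0]
```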

AWS, Docker, GraphQL, PostgreSQL, SQL, Agile, Elasticsearch, Kafka, Kubernetes, MongoDB, Tableau, Airflow, Cassandra, Data engineering, NoSQL, Spark, CI/CD, Terraform, Documentation

Apply

📍 Spain

🧭 Permanent

🔍 Travel-Tech

  • At least 2 years of experience in a similar role in a fast-paced environment.
  • Advanced knowledge of SQL.
  • Experience in Data Modelling.
  • Experience in ETL design, implementation, and maintenance.
  • Experience with workflow management engines (e.g., Airflow, Google Cloud Composer, Talend).
  • Experience with data quality and validation; a minimal sketch follows this list.
  • Fluent in English, both written and spoken.
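
A minimal sketch of the kind of data-quality checks implied, using pandas; the column names (`booking_id`, `amount`, `sale_date`) and the input file are hypothetical.

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Run basic quality checks on a sales extract and return any failures.
    Column names are hypothetical; real checks would mirror the data model."""
    failures = []
    if df["booking_id"].duplicated().any():
        failures.append("duplicate booking_id values")
    if df["amount"].lt(0).any():
        failures.append("negative amounts")
    if df["sale_date"].isna().any():
        failures.append("missing sale_date values")
    return failures

sales = pd.read_csv("sales.csv")  # hypothetical extract
problems = validate(sales)
if problems:
    raise ValueError("validation failed: " + "; ".join(problems))
```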

  • Focus on reports, tables, analysis, and deliverables related to the company's sales data.
  • Support decision-making in the business by leveraging engineering skills.
  • Collaborate with multiple stakeholders from business, partnership, and product optimization areas.
  • Accountable for project results in terms of efficiency and value for the company.

SQL, ETL, Airflow

Posted 2024-11-11
Apply

📍 Portugal

🧭 Permanent

🔍 Travel-Tech

  • At least 2 years of experience in a similar role in a fast-paced environment.
  • Advanced knowledge of SQL.
  • Experience in Data Modelling.
  • Experience in ETL design, implementation, and maintenance.
  • Familiarity with workflow management engines (e.g., Airflow, Google Cloud Composer, Talend).
  • Experience with data quality and validation.
  • Fluent in English, both written and spoken.

  • Work in the Sales & Partnerships domain team focused on all deliverables related to the company's sales data.
  • Leverage engineering skills to acquire, manipulate, orchestrate, and monitor data.
  • Collaborate with multiple stakeholders from business, partnership, and product optimization areas.
  • Accountable for project results in terms of efficiency and value for the company.

SQL, ETL, Airflow

Posted 2024-11-11
Apply