Data Engineer

Posted 2024-10-23

๐Ÿ“ Location: USA

๐Ÿ’ธ Salary: 147900 - 174000 USD per year

๐Ÿ” Industry: Cryptocurrency and blockchain technology

๐Ÿข Company: Coinbase Careers Page

🗣️ Languages: English

🪄 Skills: Python, SQL, Apache Airflow, Blockchain, ETL, Strategy, Data science, Collaboration

Requirements:
  • Proficient in scripting with Python, especially for data manipulation and integration.
  • Strong understanding of advanced SQL techniques for data querying and performance optimization.
  • Knowledge of data modeling best practices including star schemas and data normalization.
  • Experience in designing and optimizing ETL/ELT pipelines for large datasets.
  • Familiarity with Apache Airflow or similar tools for managing data workflows.
  • Experience with version control in GitHub.
  • Knowledge of data visualization tools such as Superset, Looker, or Python libraries.
  • Ability to collaborate effectively with stakeholders and document technical solutions.
  • Fluency in English is required.
Responsibilities:
  • Capture customer and partner team user stories to develop appropriate solutions.
  • Collaborate with stakeholders to manage a long-term engineering roadmap.
  • Identify areas for improvement through investigation of existing processes.
  • Drive the vision and adoption of self-service analytic capabilities.
  • Understand and utilize data resources effectively for projects.
  • Break down complex business problems into manageable components.
  • Optimize data infrastructure efficiency and costs.
  • Foster a collaborative team environment for innovation and improvement.
  • Work with analysts to solve complex data problems.

Related Jobs

๐Ÿ“ United States of America

๐Ÿ’ธ 90000 - 154000 USD per year

๐Ÿข Company: VSPVisionCareers

Requirements:
  • Bachelor's degree in computer science, data science, statistics, economics or related area.
  • Excellent written and verbal communication skills.
  • 6+ years of experience in development teams focusing on analytics.
  • 6+ years of hands-on experience in data preparation and SQL.
  • Knowledge of data architectures like event-driven architecture and real-time data processing.
  • Familiarity with DataOps practices and multiple data integration platforms.

Responsibilities:
  • Design, build, and optimize data pipelines for analytics.
  • Collaborate with multi-disciplinary teams for data integration.
  • Analyze data requirements to develop scalable pipeline solutions.
  • Profile data for accuracy and completeness in data gathering.
  • Drive automation of data tasks to enhance productivity.
  • Participate in architecture and design reviews.

AWS, SQL, Agile, ETL, Oracle, SCRUM, Snowflake, Apache Kafka, Data science, Data Structures, Communication Skills, Collaboration

Posted 2024-11-22

๐Ÿ“ United States

๐Ÿ” Consumer insights

Requirements:
  • Strong PL/SQL and SQL development skills.
  • Proficient in Python and Java.
  • 3-5 years of experience in Data Engineering with Oracle and MS SQL.
  • Experience with cloud services like Snowflake and Azure.
  • Familiar with data orchestration tools such as Azure Data Factory and Databricks.
  • Understanding of data privacy regulations.

Responsibilities:
  • Design, implement, and maintain scalable data pipelines and architecture.
  • Unit test and document solutions that meet product quality standards.
  • Identify and resolve performance bottlenecks in data processing workflows.
  • Implement data quality checks to ensure accuracy and consistency.
  • Collaborate with cross-functional teams to address data needs.

Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data engineering, CI/CD

Posted 2024-11-21

๐Ÿ“ US

๐Ÿงญ Full-Time

๐Ÿ’ธ 206700 - 289400 USD per year

๐Ÿ” Social media / Online community

Requirements:
  • MS or PhD in a quantitative discipline: engineering, statistics, operations research, computer science, informatics, applied mathematics, economics, etc.
  • 7+ years of experience with large-scale ETL systems, building clean, maintainable, object-oriented code (Python preferred).
  • Strong programming proficiency in Python, SQL, Spark, Scala.
  • Experience with data modeling, ETL concepts, and manipulating large structured and unstructured data.
  • Experience with data workflows (e.g., Airflow) and data visualization tools (e.g., Looker, Tableau).
  • Deep understanding of technical and functional designs for relational and MPP databases.
  • Proven track record of collaboration and excellent communication skills.
  • Experience in mentoring junior data scientists and analytics engineers.

Responsibilities:
  • Act as the analytics engineering lead within the Ads DS team and contribute to data science data quality and automation initiatives.
  • Ensure high-quality data through ETLs, reporting dashboards, and data aggregations for business tracking and ML model development.
  • Develop and maintain robust data pipelines and workflows for data ingestion, processing, and transformation.
  • Create user-friendly tools for internal use across Data Science and cross-functional teams.
  • Lead efforts to build a data-driven culture by enabling data self-service.
  • Provide mentorship and coaching to data analysts and act as a thought partner for data teams.

Leadership, Python, SQL, Data Analysis, ETL, Tableau, Strategy, Airflow, Data engineering, Data science, Spark, Communication Skills, Collaboration, Mentoring, Coaching

Posted 2024-11-21

๐Ÿ“ US

๐Ÿ” Consumer insights

Requirements:
  • Strong PL/SQL and SQL development skills.
  • Proficient in multiple data engineering languages such as Python and Java.
  • Minimum 3-5 years of experience in Data engineering with Oracle and MS SQL.
  • Experience with data warehousing concepts and cloud-based technologies like Snowflake.
  • Experience with cloud platforms such as Azure.
  • Knowledge of data orchestration tools like Azure Data Factory and Databricks workflows.
  • Understanding of data privacy regulations and best practices.
  • Experience working with remote teams.
  • Familiarity with tools like Git, Jira.
  • Bachelor's degree in Computer Science or Computer Engineering.

Responsibilities:
  • Design, implement, and maintain scalable pipelines and architecture to collect, process, and store data from various sources.
  • Unit test and document solutions that meet product quality standards prior to release to QA.
  • Identify and resolve performance bottlenecks in pipelines to ensure efficient data delivery.
  • Implement data quality checks and validation processes.
  • Work with Data Architect to implement best practices for data governance, quality, and security.
  • Collaborate with cross-functional teams to address data needs.

Python, SQL, ETL, Git, Java, Oracle, QA, Snowflake, Jira, Azure, Data engineering, CI/CD

Posted 2024-11-21

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 95000 - 110000 USD per year

๐Ÿ” Healthcare

๐Ÿข Company: Wider Circle

Requirements:
  • Degree in Computer Science, Information Systems, or equivalent education or work experience.
  • 3+ years of experience with AWS or similar technologies (S3, Redshift, RDS, EMR).
  • 3+ years of strong SQL and Python skills.
  • Experience building test automation suites for test and production environments.
  • Experience using APIs for data extraction and updating.
  • Experience with Git and version control.

Responsibilities:
  • Develop and maintain data quality and accuracy dashboards and scorecards to track data quality and model performance.
  • Develop, maintain, and enhance a comprehensive data quality framework that defines data standards, quality and accuracy expectations, and validation processes.
  • Enhance data quality through rapid testing, feedback, and insights.
  • Partner with Engineering & Product to predict data quality issues and production flaws.
  • Conceptualize data architecture visually and implement it in logical structures.
  • Perform testing of data after ingestion and database loading.
  • Manage internal SLAs for data quality and frequency.
  • Provide expert support for solving complex problems of data integration across multiple data sets.
  • Update and evolve data ecosystem to streamline processes for maximum efficiency.

AWS, Python, SQL, Git, Product Development, Data engineering

Posted 2024-11-17

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 124300 - 186500 USD per year

๐Ÿ” Technology / Cloud Services

๐Ÿข Company: SMX

Requirements:
  • 2+ years of experience in relevant fields.
  • Expertise in complex SQL.
  • Knowledge of AWS technologies.
  • Solid understanding of RDBMS concepts, including Postgres, Redshift, and SQL Server.
  • Experience with logical data modeling and database/query optimization.
  • Familiarity with AWS data migration tools (DMS).
  • Scripting knowledge, especially in Python and AWS Lambda.
  • Experience with version control and CI/CD tools such as Git, TFS, and Azure DevOps.
  • Awareness of network authentication and authorization protocols (Kerberos, SAML/OAuth).
  • Some knowledge of networking.
  • Ability to obtain and maintain a Public Trust clearance; US Citizenship is required.
  • Strong collaboration and communication skills.

Responsibilities:
  • Assist the Data Architect and customer in collecting requirements and documenting tasks for the data loading platform, focusing on performance and data quality.
  • Implement data loading and quality control activities based on project requirements and customer tickets.
  • Create CI/CD pipelines for infrastructure and data loading related to data warehouse maintenance.
  • Code and implement unique data migration requirements using AWS technologies like DMS and Lambda/Python.
  • Resolve identity and access management issues for various data sets in Postgres, Redshift, and SQL Server.

AWS, Python, SQL, ETL, Git, OAuth, Azure, Postgres, RDBMS, CI/CD, DevOps

Posted 2024-11-15

๐Ÿ“ Latin America, United States, Canada

๐Ÿ” Life insurance

Requirements:
  • The ideal candidate will be independent and a great communicator.
  • Attention to detail is critical.
  • Must possess problem-solving skills.

Responsibilities:
  • Develop and maintain enterprise data and analytics systems for a US client.
  • Optimize performance by building and supporting decision-making tools.
  • Collaborate closely with software engineering, AI/ML, Cybersecurity, and DevOps/SysOps teams.
  • Support end-to-end data pipelines using Python or Scala.
  • Participate in Agile framework-related tasks.

Python, Software Development, Agile, Data engineering, DevOps, Attention to detail

Posted 2024-11-15

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 210000 - 220000 USD per year

๐Ÿ” Healthcare

Requirements:
  • 10+ years of experience in data engineering with a strong background in building and scaling data architectures.
  • Advanced knowledge of SQL, relational databases, and big data tools like Spark and Kafka.
  • Proficient in cloud-based data warehousing and services, especially Snowflake and AWS.
  • Understanding of AI/ML workflows.
  • Experience in service-oriented and event-based architecture with strong API development skills.
  • Strong communication and project management skills.

Responsibilities:
  • Lead the design and implementation using modern data architecture principles.
  • Scale the data platform for optimal data extraction, transformation, and loading.
  • Design and build scalable AI and ML platforms.
  • Collaborate with Executive, Product, Clinical, Data, and Design teams.
  • Build and optimize complex data pipelines.
  • Create and maintain data tools and pipelines for analytics and data innovation.
  • Provide technical leadership and mentorship to the data engineering team.

AWS, Leadership, Project Management, Python, SQL, Java, Kafka, Snowflake, C++, Airflow, Data engineering, Spark, Communication Skills, Problem Solving, Organizational skills

Posted 2024-11-15

๐Ÿ“ North America, South America, Europe

๐Ÿ’ธ 100000 - 500000 USD per year

๐Ÿ” Web3, blockchain

๐Ÿข Company: Edge & Node

Requirements:
  • A self-motivated team member with keen attention to detail.
  • Proactive collaboration with team members and a willingness to adapt to a growing environment.
  • Familiarity and experience with Rust, particularly focusing on data transformation and ingestion.
  • A strong understanding of blockchain data structures and ingestion interfaces.
  • Experience in real-time data handling, including knowledge of reorg handling.
  • Familiarity with blockchain clients like Geth and Reth is a plus.
  • Adaptability to a dynamic and fully-remote work environment.
  • Rigorous approach to software development that reflects a commitment to excellence.

Responsibilities:
  • Develop and maintain data ingestion adapters for various blockchain networks and web3 protocols.
  • Implement data ingestion strategies for both historical and recent data.
  • Apply strategies for handling block reorgs.
  • Optimize the latency of block ingestion at the chain head.
  • Write interfaces with file storage protocols such as IPFS and Arweave.
  • Collaborate with upstream data sources, such as chain clients and tracing frameworks, and monitor the latest upstream developments.
  • Perform data quality checks, cross-checking data across multiple sources and investigating any discrepancies that arise.

Software Development, Blockchain, Data Structures, Rust, Collaboration, Attention to detail

Posted 2024-11-15

๐Ÿ“ United States (U.S.)

๐Ÿงญ Full-Time

๐Ÿ” Cloud Computing and Data Engineering

๐Ÿข Company: Via Logic

Requirements:
  • Bachelor's degree required.
  • Minimum of 5 years of experience designing and implementing solutions on AWS.
  • Expertise in CloudFormation, Terraform, Ansible, and Chef for Infrastructure as Code strategies.
  • Proficiency in CI/CD automation with tools like Jenkins and GitLab.
  • Strong programming and scripting skills in Java, Python, and Bash.
  • Skilled in data models and management using SQL and Python.
  • Experience with cloud security standards like FISMA, FedRAMP, and NIST SP 800-53.
  • Ability to manage multiple projects and demonstrate problem-solving capabilities.
  • Strong teamwork and communication skills.

Responsibilities:
  • Design, manage, and optimize secure CI/CD pipelines integrating security checks.
  • Utilize containerization technologies to manage microservices.
  • Build, deploy, and optimize cloud infrastructure to support scalable applications.
  • Ensure high availability and reliability of production systems.
  • Implement monitoring solutions for performance tracking.
  • Develop infrastructure as code using Terraform, Ansible, or CloudFormation.
  • Assess and maintain security postures to meet compliance standards.
  • Engage in agile teams for process improvement and DevSecOps practices.
  • Develop automated tests for infrastructure and deployment scripts.
  • Manage information flow, ensuring data integrity and accuracy.

AWS, Docker, PostgreSQL, Project Management, Python, SQL, Agile, Bash, Cybersecurity, DynamoDB, ETL, Java, Jenkins, Kubernetes, Oracle, Product Management, Tableau, Data Structures, Redis, Communication Skills, Collaboration, CI/CD, Problem Solving, DevOps, Terraform, Microservices, Compliance

Posted 2024-11-14