Kapitus

Private Company

Open Positions (5)

Remote US · Full-Time · Posted
  • Design and implement scalable, secure, and high-performance data architectures using Snowflake and Databricks
  • Develop and manage ELT/ETL workflows using dbt, ensuring modular, testable, and well-documented transformations
  • Develop and maintain cloud-native data pipelines on AWS (e.g., S3, Glue, Lambda, Redshift, EMR)
  • Architect and optimize data models for analytics, reporting, and machine learning use cases
  • Build and manage data ingestion, transformation, and orchestration workflows
  • Leverage Python for data engineering, automation, and advanced data processing
  • Implement and manage Feature Store solutions for ML lifecycle management
  • Utilize Snowflake Cortex for AI/ML-powered data applications and advanced analytics
  • Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions
  • Establish data governance, data quality, and security best practices
  • Optimize performance, cost, and scalability of data platforms
  • Mentor and guide data engineers and other team members
AWS · Python · Kafka (+6 more)
