Senior Software Engineer, Data

Posted about 20 hours ago

💎 Seniority level: Senior, 8+ years

📍 Location: United States

💸 Salary: 205,000 - 235,000 USD per year

🔍 Industry: Fintech

🏢 Company: Found

🗣️ Languages: English

⏳ Experience: 8+ years

🪄 Skills: AWS, PostgreSQL, Python, SQL, Cloud Computing, Data Analysis, ETL, GCP, Machine Learning, MySQL, Airflow, Azure, Data Engineering, Communication Skills, Collaboration, Mentoring, Data Visualization, Data Modeling

Requirements:
  • 8+ years of experience in data infrastructure, data engineering, or analytics engineering roles.
  • Experience with relational databases (Postgres/MySQL), dbt, and Airflow.
  • Experience advocating for and rolling out new technologies or open-source frameworks that empower data and product organizations to make data-driven decisions.
  • Experience working with cloud-native big data infrastructure on the public cloud (GCP/AWS/Azure, BigQuery/Redshift/Synapse).
  • Ability to write clean and maintainable code (primarily Python).
  • Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
Responsibilities:
  • Design, build, and operate large-scale data infrastructure systems across multiple environments to store, aggregate, and process large amounts of data.
  • Build a data platform-as-a-service for internal customers, ensuring data integrity, sanity, tagging, and discoverability.
  • Bridge the gap between engineering and analytics, helping inform the company's data infrastructure roadmap.
  • Implement various ETL infrastructures for the company's topline metrics, reporting, and product functionality.
  • Contribute to the development of best practices, standards, and frameworks for data engineering at Found.
  • Provide mentorship and guidance to help grow and develop the skills of the broader data team, fostering a culture of continuous learning and excellence.
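
For context on the stack this role names (Python, dbt, and Airflow), here is a minimal, illustrative sketch of how a topline-metrics dbt job might be orchestrated. The DAG id, schedule, project path, and model selectors are hypothetical, and it assumes Airflow 2.4+ with the BashOperator; it is a sketch of the pattern, not Found's actual pipeline.

```python
# Hypothetical example: orchestrate dbt runs and tests with Airflow.
# DAG id, paths, and model selectors are placeholders, not Found's setup.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="topline_metrics_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # assumes Airflow 2.4+ ("schedule_interval" on older versions)
    catchup=False,
) as dag:
    # Build the models feeding topline metrics, then test them before downstream use.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select marts.metrics --project-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --select marts.metrics --project-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```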

Related Jobs

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ’ธ 177000.0 - 213000.0 USD per year

๐Ÿ” FinTech

๐Ÿข Company: Flex

Requirements:
  • A minimum of 6 years of industry experience in the data infrastructure/data engineering domain.
  • A minimum of 6 years of experience with Python and SQL.
  • A minimum of 3 years of industry experience using DBT.
  • A minimum of 3 years of industry experience using Snowflake and its basic features.
  • Familiarity with AWS services, with industry experience using Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR, etc.
  • Industry experience with different big data platforms and tools such as Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc.
  • Industry experience working with relational and NoSQL databases in a production environment.
  • Strong fundamentals in data structures, algorithms, and design patterns.
Responsibilities:
  • Design, implement, and maintain high-quality data infrastructure services, including but not limited to Data Lake, Kafka, Amazon Kinesis, and data access layers.
  • Develop robust and efficient DBT models and jobs to support analytics reporting and machine learning modeling.
  • Collaborate closely with the Analytics team on data modeling, reporting, and data ingestion.
  • Create scalable real-time streaming pipelines and offline ETL pipelines.
  • Design, implement, and manage a data warehouse that provides secure access to large datasets.
  • Continuously improve data operations by automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
  • Create engineering documentation for design, runbooks, and best practices.
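
As an illustration of the streaming ingestion work this role describes (Kafka feeding Snowflake and DBT models), the sketch below consumes events from Kafka and flushes micro-batches for a downstream warehouse load. It assumes the kafka-python client; the topic name, consumer group, batch size, and flush target are hypothetical.

```python
# Hypothetical example: micro-batch events from Kafka for warehouse loading.
# Topic, broker address, consumer group, and batch size are placeholders.
import json

from kafka import KafkaConsumer  # assumes the kafka-python package

consumer = KafkaConsumer(
    "payment-events",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="warehouse-loader",           # hypothetical consumer group
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=False,              # commit only after a successful flush
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:
        # A real pipeline would write the batch to a warehouse stage here
        # (e.g. Snowflake) before committing offsets.
        print(f"flushing {len(batch)} events")
        batch.clear()
        consumer.commit()
```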

AWS, Python, SQL, Bash, Design Patterns, ETL, Hadoop, Java, Kafka, Snowflake, Airflow, Algorithms, Cassandra, Data Engineering, Data Structures, NoSQL, Spark, Communication Skills, CI/CD, RESTful APIs, Terraform, Written Communication, Documentation, Data Modeling, Debugging

Posted 9 days ago

๐Ÿ“ United States, Canada

๐Ÿงญ Full-Time

๐Ÿ’ธ 140000.0 - 160000.0 USD per year

๐Ÿ” Fraud Prevention and AML Compliance

๐Ÿข Company: Sardine๐Ÿ‘ฅ 101-250๐Ÿ’ฐ $70,000,000 Series C about 1 month agoCryptocurrencyFraud DetectionFinTechSoftware

Requirements:
  • 5+ years of experience in backend or data engineering roles
  • Strong knowledge of database systems (SQL and NoSQL)
  • Expertise in a modern programming language (Go, Python, Java)
  • Familiarity with cloud platforms (AWS, GCP, Azure)
  • Experience with containerization (Docker, Kubernetes)
Responsibilities:
  • Design and implement ETL pipelines for large datasets
  • Develop and optimize APIs for data retrieval
  • Architect and manage scalable storage solutions
  • Collaborate on data product development
  • Perform data analysis for client value
  • Document processes and mentor junior engineers
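
To illustrate the "develop and optimize APIs for data retrieval" responsibility, here is a minimal FastAPI endpoint sketch. The route, payload shape, and in-memory store are hypothetical stand-ins for a real database-backed service.

```python
# Hypothetical example: a small read API over device risk events.
# The in-memory dict stands in for a real SQL/NoSQL store.
from fastapi import FastAPI, HTTPException

app = FastAPI()

DEVICE_EVENTS = {  # placeholder data
    "device-123": [{"ts": "2024-01-01T00:00:00Z", "risk_score": 0.12}],
}

@app.get("/devices/{device_id}/events")
def get_device_events(device_id: str, limit: int = 100):
    # Return the stored events for a device, capped at `limit`.
    events = DEVICE_EVENTS.get(device_id)
    if events is None:
        raise HTTPException(status_code=404, detail="device not found")
    return {"device_id": device_id, "events": events[:limit]}
```

Run locally with, for example, `uvicorn app:app --reload` (assuming the file is named app.py).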

AWS, Docker, Python, SQL, DynamoDB, ElasticSearch, ETL, GCP, Kubernetes, NoSQL, CI/CD

Posted 30 days ago

๐Ÿ“ California, Colorado, Hawaii, Illinois, Maryland, Massachusetts, New York, Oregon, Texas, Washington

๐Ÿงญ Full-Time

๐Ÿ’ธ 150000.0 - 170000.0 USD per year

๐Ÿ” Recruiting and feedback tools

๐Ÿข Company: Textio๐Ÿ‘ฅ 51-100๐Ÿ’ฐ $999,972 about 3 years ago๐Ÿซ‚ Last layoff about 1 year agoArtificial Intelligence (AI)Human ResourcesMachine LearningEnterprise SoftwareNatural Language ProcessingSoftware

Requirements:
  • Hands-on experience shipping customer-facing features including reports and analytics
  • Solid experience with data warehouse software and cloud services (SQL, Redshift, AWS CDK, Meltano)
  • Track record of writing complex SQL queries for performance and efficiency
  • Ability to work with design and product to build user-friendly features
  • Collaboration skills in a diverse and inclusive environment
  • Fast-paced startup experience is a plus
Responsibilities:
  • Maintain a strong, user-centric approach to feature development
  • Work on ambiguous problems and advocate for clear solutions
  • Enhance data warehouse to meet business needs
  • Collaborate with diverse teams using AI/LLM technologies
  • Enable advanced analytics through real-time and batch processing
  • Improve scalability and performance of data pipelines
  • Implement monitoring, alerting, and self-healing mechanisms for data systems

AWS, SQL, ETL

Posted about 2 months ago

๐Ÿ“ United States

๐Ÿงญ Full-Time

๐Ÿ” Blockchain intelligence and financial technology

๐Ÿข Company: TRM Labs๐Ÿ‘ฅ 101-250๐Ÿ’ฐ $70,000,000 Series B over 2 years agoCryptocurrencyComplianceBlockchainBig Data

Requirements:
  • Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 5+ years of hands-on experience in architecting scalable API development and distributed system architecture.
  • Exceptional programming skills in Python and proficiency in SQL or SparkSQL.
  • In-depth experience with data stores such as BigQuery and Postgres.
  • Proficiency in data pipeline tools like Airflow and DBT.
  • Expertise in data processing technologies including Dataflow, Spark, Kafka, and Flink.
  • Competence in deploying and monitoring infrastructure using tools like Docker, Terraform, Kubernetes, and Datadog.
  • Proven ability in loading, querying, and transforming extensive datasets.
Responsibilities:
  • Build highly scalable features integrating with multiple blockchains.
  • Design intricate data models for optimal storage and retrieval supporting sub-second latency for querying blockchain data.
  • Collaborate across departments with data scientists, backend engineers, and product managers to enhance TRMโ€™s products.
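
As a sketch of the BigQuery work this role calls for, the example below runs a parameterized query with the google-cloud-bigquery client. The project, dataset, table, and column names are hypothetical, not TRM's schema.

```python
# Hypothetical example: parameterized BigQuery lookup of transactions by address.
# Project, dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT block_timestamp, from_address, to_address, value
    FROM `my-project.chain_data.transactions`
    WHERE to_address = @address
    ORDER BY block_timestamp DESC
    LIMIT 100
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "address", "STRING", "0x0000000000000000000000000000000000000000"
        ),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.block_timestamp, row.value)
```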

Docker, Python, SQL, Apache Airflow, Kafka, Kubernetes, Postgres, Spark, Terraform

Posted 3 months ago

๐Ÿ“ Alabama, Arizona, California, Colorado, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Massachusetts, Maine, Maryland, Michigan, Missouri, Minnesota, Montana, New Hampshire, New Jersey, New Mexico, New York, Nevada, North Carolina, Ohio, Oklahoma, Oregon, Pennsylvania, Tennessee, Texas, Utah, Vermont, Virginia, Washington, West Virginia and Wisconsin

๐Ÿงญ Full-Time

๐Ÿ’ธ 180000 - 190000 USD per year

๐Ÿ” FinTech

๐Ÿข Company: Esusu๐Ÿ‘ฅ 101-250๐Ÿ’ฐ $130,000,000 Series B about 3 years agoCreditFinancial ServicesFinTech

Requirements:
  • Strong back-end and front-end engineering work experience.
  • Mastery of core development practices: Agile, TDD, CI/CD, and DevOps.
  • Fluency in languages used at Esusu: Go, Typescript, Python, SQL.
  • Mastery of security protocols and practices.
  • Extensive experience building scalable microservice systems.
  • Experience with AWS services: API Gateway, Lambda, Cognito, S3, ECS, RDS/Aurora, SNS, SQS, SES, Cloudformation.
  • Experience writing and maintaining web services on containerized and serverless environments.
  • Experience with SQL and NoSQL databases like PostgreSQL and MongoDB.
Responsibilities:
  • You and your team will drive the evolution of the cloud-based suite of services that support all customer-facing activities.
  • You will collaborate with engineers, product managers, and business stakeholders to design, build and deliver secure, reliable, fast, and scalable solutions.
  • You will mentor other developers on the team.
  • You will maintain existing back-end systems including testing, troubleshooting, refactoring, and adding new features.
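
To illustrate the serverless AWS stack this role lists (API Gateway, Lambda, SNS), here is a minimal Lambda handler sketch using boto3. The topic ARN environment variable and payload fields are hypothetical.

```python
# Hypothetical example: an API Gateway-backed Lambda that validates a request
# and publishes it to SNS. Environment variable and field names are placeholders.
import json
import os

import boto3

sns = boto3.client("sns")
TOPIC_ARN = os.environ.get("EVENTS_TOPIC_ARN", "")  # hypothetical env var

def handler(event, context):
    # API Gateway proxy integration delivers the request body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    if "tenant_id" not in body:
        return {"statusCode": 400, "body": json.dumps({"error": "tenant_id required"})}

    sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps(body))
    return {"statusCode": 202, "body": json.dumps({"status": "queued"})}
```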

AWS, PostgreSQL, Python, SQL, Agile, Flutter, JavaScript, Kubernetes, MongoDB, TypeScript, Amazon Web Services, Go, Serverless, React, Collaboration, CI/CD, DevOps, Documentation, Compliance

Posted 4 months ago

๐Ÿ“ United States

๐Ÿ” Life sciences

Requirements:
  • Applicants must have the unrestricted right to work in the United States.
  • Veeva will not provide sponsorship at this time.
Responsibilities:
  • Spearhead the development of new architecture for the Data platform from the ground up.
  • Design and build a resilient, scalable cloud-based platform along with its accompanying tools.
  • Empower Opendata teams to efficiently create and distribute valuable data assets.
  • Exercise end-to-end ownership for the project.

Backend Development, Leadership, Software Development, Cross-functional Team Leadership, Communication Skills, Analytical Skills, Collaboration

Posted 4 months ago

๐Ÿ“ United States

๐Ÿ’ธ 200000 - 255000 USD per year

๐Ÿ” Blockchain intelligence and financial services

๐Ÿข Company: TRM Labs๐Ÿ‘ฅ 101-250๐Ÿ’ฐ $70,000,000 Series B over 2 years agoCryptocurrencyComplianceBlockchainBig Data

Requirements:
  • A Bachelor's degree (or equivalent) in Computer Science or a related field.
  • 8+ years of hands-on experience in architecting distributed systems.
  • Exceptional programming skills in Python and proficiency in SQL or SparkSQL.
  • In-depth experience with data stores such as Iceberg, Trino, BigQuery, and StarRocks.
  • Proficiency in tools like Airflow and DBT for data pipeline orchestration.
  • Expertise in technologies like Spark, Kafka, and Flink.
  • Competence in deploying and monitoring cloud infrastructure with tools like Docker and Terraform.
  • Proven ability in managing extensive datasets.
Responsibilities:
  • Build highly reliable data services to integrate with blockchains.
  • Develop complex ETL pipelines for real-time data processing.
  • Design intricate data models for optimal storage and retrieval.
  • Oversee deployment and monitoring of large database clusters.
  • Collaborate with data scientists and engineers to enhance products.

Docker, Python, SQL, Blockchain, ETL, Kafka, Kubernetes, Airflow, Data Engineering, Spark, Collaboration, Terraform

Posted 7 months ago