Senior Software Engineer, Data Platform
Lithic (FinTech, Payments)
Location: United States, Canada, Netherlands, Poland, or Czech Republic
Full-Time, Senior
Salary: 170,000 - 280,000 CAD per year
Job Details
Required Skills
- AWS, Python, SQL, Django, Flask, Kafka, Snowflake, Airflow, FastAPI, Terraform, dbt, AWS Lambda
Requirements
- Strong Python proficiency with experience building backend services and REST APIs
- Experience with web frameworks such as FastAPI, Flask, Django, or similar
- Solid SQL skills and hands-on experience with modern cloud data warehouses (Snowflake strongly preferred)
- Experience designing and building production APIs with proper authentication, versioning, and error handling
- Familiarity with CI/CD, automated testing, and operational reliability practices
- A track record of shipping reliable, well-tested services in production environments
- Comfort navigating ambiguity and driving projects forward with minimal oversight
- Experience with data pipeline development using tools like Airflow, Airbyte, Dagster, or similar (preferred)
- Familiarity with dbt or similar transformation frameworks (preferred)
- Experience in fintech, payments, or other financial services environments (preferred)
- Familiarity with AWS services (Lambda, S3, RDS, API Gateway, ECS/Fargate) (preferred)
- Kafka or event streaming experience (preferred)
- Infrastructure-as-code experience (Terraform, Pulumi) (preferred)
- Experience at a company processing high transaction volumes where correctness and reliability are non-negotiable (preferred)
Responsibilities
- Design, build, and maintain backend services and REST APIs that serve data from various SQL subsystems and other data sources
- Develop well-tested, production-grade Python services with clean API contracts, proper authentication, versioning, and error handling
- Work closely with the Analytics Engineering team to expose modeled data (billing, settlement, finance) through APIs that downstream consumers can rely on
- Build internal tooling and services that enable the broader organization to self-serve their data needs without writing SQL
- Participate in code reviews, system design discussions, and engineering best practices across the Infrastructure org
- Contribute to service observability: logging, metrics, alerting, and on-call practices for the services you own
- Maintain and improve existing data pipelines that move data from source systems into Snowflake (Airflow, Airbyte)
- Contribute to the dbt project alongside the Analytics Engineering team: model improvements, test coverage, and data quality
- Support data governance practices including access controls, lineage documentation, and data quality standards