Finance Staff Data Engineer, AI Native

Within the US, Canada
Full-Time, Staff
Salary: 190,000 - 280,500 USD per year

Job Details

Experience
8+ years
Required Skills
AWS, Python, SQL, Airflow, Spark, CI/CD, Terraform, dbt, Databricks, GitHub Actions, LLMs

Requirements

  • 8+ years designing and operating high-volume distributed data systems in production.
  • Deep expertise with a cloud data platform (Databricks strongly preferred) and AWS from an infrastructure / services architecture, deployment, and ownership perspective.
  • Strong proficiency in Python, SQL, and Spark for large-scale processing.
  • Strong proficiency with modern CI/CD practices (creating GitHub Actions workflows, writing Terraform code to manage infrastructure in Databricks, Airflow, AWS, and other platforms).
  • Hands-on experience with dbt from an infrastructure / deployment perspective and understanding of how platform decisions impact downstream modeling.
  • Strong grasp of data modeling, partitioning strategies, storage formats, and analytical workload optimization.
  • Experience with Airflow and data flow orchestration.
  • Experience with networking challenges in data ingestion (e.g., VPC peering, firewall traversal, API rate limiting, cross-account AWS access).
  • Able to effectively leverage / oversee LLM-supported code development while maintaining a high quality bar.
  • Demonstrated experience with AI tools to support / enhance development - Claude Code, Cursor, etc.
  • Demonstrated ability to independently scope ambiguous problems and drive them to decisive outcomes.
  • Track record of proactively escalating risks and closing long-running efforts with clear recommendations.
  • Experience defining ingestion validation standards and implementing data quality controls.
  • Proven ability to reduce operational fragility and eliminate single points of failure.
  • Strong systems design skills across distributed and event-based architectures.
  • Demonstrated technical leadership influencing cross-team architectural decisions.
  • Excellent communication skills across engineering, analytics, product, and executive stakeholders.
  • BS in Computer Science, Engineering, Mathematics, or equivalent experience.

Responsibilities

  • Architect and evolve scalable data ingestion and egress frameworks and pipelines that are well tested and offer strong data quality monitoring.
  • Architect and evolve our CI/CD processes, enhancing the testing environment and observability.
  • Architect the delivery of data assets to external partner teams to reduce the manual operational overhead associated with month-end close.
  • Enhance our Claude Code / LLM development support capabilities, creating tools, skills, and agents that give our LLMs more context.
  • Enhance our security posture in our AWS / Databricks environment.
  • Design and implement distributed data processing systems using Spark and Databricks on AWS.
  • Establish clear ingestion and integration boundaries that eliminate single points of failure.
  • Proactively surface risks, dependencies, and tradeoffs before they impact delivery.
  • Produce clear technical artifacts and recommendations for stakeholders and leadership.
  • Design logical and physical data models balancing flexibility, performance, governance, and scalability.
  • Partner closely with the Analytics Engineers on the Finance Data Team to support high-quality downstream data modeling & reporting.
  • Harden pipelines with monitoring, alerting, SLAs, and recovery mechanisms.
  • Mentor engineers and elevate distributed systems rigor across the team.