Staff Data Engineer
Exactera · FinTech SaaS
Remote · Full-Time · Staff
Salary not disclosed
Job Details
- Experience: 5+ years
- Required Skills: AWS, Python, SQL, ETL, Terraform, Data modeling, Databricks, PySpark
Requirements
- Proficiency in SQL, Python, and PySpark for production pipeline implementation and performance optimization
- Databricks experience (Delta Lake, Workflows, and Databricks SQL); Unity Catalog familiarity preferred
- 5+ years in data engineering with demonstrated ability to own problems end-to-end
- Experience building and maintaining ETL/ELT pipelines at scale, including error handling, monitoring, and data quality validation
- Strong data modeling skills across structured and semi-structured data sources
- AWS experience (S3, IAM, VPC)
- Infrastructure-as-code experience (Terraform preferred)
- Familiarity with data governance patterns (Unity Catalog, data lineage, access controls)
- Demonstrated ability to exercise independent architectural judgment
- Experience mentoring or guiding junior and mid-level data engineers
- Strong written and verbal communication
- Onshore (US-based) role requiring timezone overlap, async-light communication, and direct stakeholder engagement
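To illustrate the kind of data-quality validation work the requirements above describe, here is a minimal sketch in plain Python. All function, field, and rule names here are hypothetical examples, not taken from the posting or any specific Exactera system:

```python
# Hypothetical sketch of a data-quality validation step in an ETL pipeline.
# Field names and validation rules are illustrative only.

def validate_rows(rows, required_fields=("id", "amount")):
    """Split rows into valid and rejected lists, recording a reason per rejection."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            rejected.append((row, f"missing fields: {missing}"))
        elif not isinstance(row["amount"], (int, float)) or row["amount"] < 0:
            rejected.append((row, "amount must be a non-negative number"))
        else:
            valid.append(row)
    return valid, rejected

records = [
    {"id": 1, "amount": 120.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": -5},
]
good, bad = validate_rows(records)
print(len(good), len(bad))  # 1 valid row, 2 rejected
```

In a production pipeline, rejected rows would typically be routed to a quarantine table and surfaced through monitoring rather than dropped silently, in line with the error-handling and monitoring expectations listed above.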
Responsibilities
- Build and maintain production data pipelines within established patterns and governance, ensuring reliability and performance at multi-terabyte scale.
- Exercise architectural judgment on data modeling, pipeline design, and platform usage, translating complex business requirements into scalable data solutions.
- Engage proactively with product and engineering stakeholders to translate requirements into data solutions, serving as the primary onshore technical point of contact.
- Drive platform quality through code reviews, testing practices, and engineering standards.
- Serve as onshore escalation point and institutional knowledge backup for platform decisions, reducing single-point-of-failure risk.
- Implement data pipelines that serve multiple product lines (Transfer Pricing, R&D Services, RoyaltyStat, Provisioning).
- Lead pipeline implementation for migrating multi-terabyte datasets from legacy systems to Databricks.
- Provide senior judgment, own problems end-to-end, make independent architectural decisions, and mentor engineers.
- Bridge the gap between product teams and data infrastructure, translating business requirements into data solutions.