- 4–7 years of combined experience and education in Computer Science, Information Systems, Software Engineering, Data Engineering, or a related field.
- Hands-on experience building and maintaining ETL/ELT data pipelines and working with data warehouses, data lakes, or analytics platforms.
- Experience working with SQL and familiarity with performance tuning and optimization concepts.
- Exposure to data visualization or BI tools such as Tableau, Power BI, Superset, or similar.
- Strong Python development experience.
- Familiarity with modern data tooling and workflows; experience using AI-assisted development tools is a plus.
- Strong written and verbal communication skills.
- Ability to work effectively in a collaborative, remote-first environment.