- Degree (or equivalent work experience) in Computer Science, Engineering, Information Science, Data Science, or a related field (graduate degree preferred).
- 2+ years of professional experience in data engineering or a closely related field.
- Ability to communicate complex technical ideas clearly to non-technical audiences.
- Proficiency in Python and SQL.
- Experience with web scraping/crawling (e.g., Beautiful Soup, Selenium, Scrapy).
- Familiarity with Google Cloud Platform (or similar), including storage and database services (e.g., Cloud Storage, Cloud SQL, Cloud Spanner) and workflow orchestration (e.g., Cloud Composer/Airflow, Cloud Run, Pub/Sub).
- Experience building and managing data pipelines, especially for text data.
- Comfort working in fast-moving, high-impact environments such as startups, AI research labs, or security-focused teams.
- Experience deploying APIs on cloud platforms (GCP, AWS, Azure) with robust testing, CI/CD, and performance monitoring practices.