📍 United States
💸 126,100 - 168,150 USD per year
- 5+ years of development experience in at least one of the following languages: Python or Scala, plus SQL (we use SQL & Python), with cloud experience (Azure preferred, or AWS).
- Hands-on experience with data security and cloud security methodologies.
- Experience configuring and managing data security to meet compliance and CISO security requirements.
- Experience creating and maintaining data-intensive distributed solutions (especially data warehouses, data lakes, and data analytics) in a cloud environment.
- Hands-on experience with modern data analytics architectures encompassing data warehouses, data lakes, etc., designed and engineered in a cloud environment.
- Proven professional experience with event streaming platforms and data pipeline orchestration tools such as Apache Kafka, Fivetran, or Apache Airflow (see the orchestration sketch after this list).
- Proven professional experience with any of the following: Databricks, Snowflake, BigQuery, Spark in any flavor, Hive, Hadoop, Cloudera, or Redshift.
- Experience developing in a containerized local environment such as Docker, Rancher, or Kubernetes preferred.
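
As a rough illustration of the pipeline orchestration experience mentioned above, here is a minimal sketch of a daily two-step DAG. It assumes Apache Airflow 2.4+ and purely hypothetical task names and data; it is not drawn from the team's actual codebase.

```python
# Minimal Airflow DAG sketch: extract then load, scheduled daily.
# All names and data below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # In a real pipeline this would query an API or a source database.
    return [{"order_id": 1, "amount": 42.0}]


def load_orders(**context):
    # Pull the extracted rows from XCom and write them to the warehouse.
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    print(f"loading {len(rows)} rows into the warehouse")


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load
```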
- Build high-performing cloud data solutions to meet our analytical and BI reporting needs.
- Design, implement, test, deploy, and maintain distributed, stable, secure, and scalable data-intensive engineering solutions and pipelines in support of data and analytics projects in the cloud, including integrating new data sources into our central data warehouse and moving data out to applications and other destinations.
- Identify, design, and implement internal process improvements, such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Build and enhance a shared data lake that powers decision-making and model building.
- Partner with teams across the business to understand their needs and develop end-to-end data solutions.
- Collaborate with analysts and data scientists to perform exploratory analysis and troubleshoot issues.
- Manage and model data using visualization tools to provide the company with a collaborative data analytics platform.
- Build tools and processes to help make the correct data accessible to the right people.
- Participate in a rotational production support role during or after business hours to support business continuity.
- Engage in collaboration and decision making with other engineers.
- Design schemas and data pipelines to extract, transform, and load (ETL) data from various sources into the data warehouse or data lake (a minimal ETL sketch follows this list).
- Create, maintain, and optimize database structures to efficiently store and retrieve large volumes of data.
- Evaluate data trends and model simple-to-complex data solutions that meet day-to-day business demands and plan for future business and technological growth.
- Implement data cleansing processes and oversee data quality to maintain accuracy.
- Function as a key member of the team to drive development, delivery, and continuous improvement of the cloud-based enterprise data warehouse architecture.
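
For the ETL responsibilities above, the sketch below shows one common shape of such a pipeline in PySpark: read raw files from the lake, apply basic cleansing, and write curated Parquet back out. The paths, column names, and choice of S3 storage are assumptions for illustration only.

```python
# Minimal PySpark ETL sketch; all paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files landed in the data lake.
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: deduplicate, enforce types, and drop bad rows.
clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
)

# Load: write partitioned Parquet into the curated zone of the lake.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```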
Skills: AWS, Docker, Python, SQL, Agile, Apache Airflow, Cloud Computing, ETL, Hadoop, Kubernetes, Snowflake, Apache Kafka, Azure, Data engineering, Spark, Scala, Data visualization, Data modeling, Data analytics