Verikai

Verikai is a leading provider of predictive analytics and AI-powered solutions for the insurance industry. Through our platform, Verikai gives insurers advanced, data-driven risk assessment insights, streamlining the underwriting process and improving business outcomes. With a commitment to technological advancement, Verikai is transforming the insurance industry by helping underwriters make better-informed risk decisions. We are a growing and well-funded company with a great product and even better ideas. If you want to make an immediate impact on the way the insurance industry works while working in a fun, fast-paced, and supportive environment, you’ve come to the right place. To learn more, visit our Career Page.


📍 United States of America

🧭 Full-Time

💸 110,000 - 160,000 USD per year

🔍 Insurance industry

Requirements:

  • Bachelor's degree or higher in Computer Science, Data Science, or a related field.
  • At least 5 years of relevant experience.
  • Proficient in SQL, Python, and data processing frameworks such as Spark.
  • Hands-on experience with AWS services including Lambda, Athena, DynamoDB, Glue, Kinesis, and Data Wrangler.
  • Expertise in handling large datasets using technologies like Hadoop and Spark.
  • Experience working with PII and PHI under HIPAA constraints.
  • Strong commitment to data security, accuracy, and compliance.
  • Exceptional ability to communicate complex technical concepts to stakeholders.
Responsibilities:

  • Design, build, and maintain robust ETL processes and data pipelines for large-scale data ingestion and transformation.
  • Manage third-party data sources and customer data to ensure clean and deduplicated datasets.
  • Develop scalable data storage systems using cloud platforms like AWS.
  • Collaborate with data scientists and product teams to support data needs.
  • Implement data validation and quality checks, ensuring accuracy and compliance with regulations.
  • Integrate new data sources to enhance the data ecosystem and document data strategies.
  • Continuously optimize data workflows and research new tools for the data infrastructure.
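To give candidates a feel for the work described above, here is a minimal, illustrative sketch (not Verikai's actual code) of a deduplication and validation step like the ones a pipeline might apply before loading records downstream. The field names `member_id` and `dob` are hypothetical.

```python
from typing import Iterable


def clean_records(records: Iterable[dict]) -> list[dict]:
    """Drop duplicate records (keyed on a hypothetical 'member_id')
    and reject rows missing required fields."""
    required = {"member_id", "dob"}
    seen = set()
    cleaned = []
    for rec in records:
        if not required <= rec.keys():
            continue  # reject: missing a required field
        key = rec["member_id"]
        if key in seen:
            continue  # duplicate key; keep the first occurrence
        seen.add(key)
        cleaned.append(rec)
    return cleaned


raw = [
    {"member_id": "A1", "dob": "1980-01-01"},
    {"member_id": "A1", "dob": "1980-01-01"},  # exact duplicate
    {"member_id": "B2"},                       # missing 'dob'
    {"member_id": "C3", "dob": "1975-06-15"},
]
print(clean_records(raw))  # two valid, unique records survive
```

In practice a step like this would run at scale in Spark or AWS Glue rather than in a single Python process, but the core idea (key-based deduplication plus required-field validation) is the same.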

AWS, Python, SQL, DynamoDB, ETL, Spark

Posted 7 days ago