About the role
You will play a key role in operationalizing artificial intelligence solutions that improve underwriting, claims, risk modeling, and customer experience. You will work closely with data scientists, data engineers, and actuarial teams to ensure AI models are production-ready, scalable, and resilient.
Key responsibilities:
* Design and implement automated AI pipelines for training, testing, deployment, and monitoring of models used in insurance applications such as claims prediction, fraud detection, and policy pricing.
* Build scalable data workflows using PySpark and Apache Spark within Databricks.
* Collaborate with data scientists and actuaries to package models and deliver reproducible, governed solutions.
* Implement CI/CD pipelines for AI using tools such as MLflow, Azure DevOps, or GitHub Actions.
* Develop and apply techniques for data drift and model drift detection, including statistical monitoring, performance baselines, and alerting (see the sketch after this list).
* Set up monitoring, logging, and alerting frameworks to maintain AI model reliability in production.
* Ensure compliance with the data privacy, regulatory, and model governance standards required in the insurance sector.
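As a rough illustration of the drift-detection responsibility above, the sketch below compares a training-time feature distribution against recent production data using a two-sample Kolmogorov-Smirnov test. The feature, threshold, and synthetic data are illustrative assumptions, not part of this role's actual stack.

```python
# Minimal drift-check sketch: flag a feature whose production distribution
# has shifted away from the training baseline (names and threshold are illustrative).
import numpy as np
from scipy.stats import ks_2samp

def detect_feature_drift(baseline: np.ndarray, production: np.ndarray,
                         p_value_threshold: float = 0.05) -> bool:
    """Return True if the two samples look significantly different."""
    statistic, p_value = ks_2samp(baseline, production)
    return p_value < p_value_threshold

# Example: compare claim amounts seen at training time vs. a recent scoring window.
rng = np.random.default_rng(seed=42)
baseline_claims = rng.lognormal(mean=8.0, sigma=1.0, size=10_000)
recent_claims = rng.lognormal(mean=8.3, sigma=1.0, size=10_000)  # shifted mean simulates drift

if detect_feature_drift(baseline_claims, recent_claims):
    print("Data drift detected: raise an alert or trigger retraining.")
```

In practice this kind of check would run per feature on a schedule, with results logged and wired into the monitoring and alerting frameworks mentioned above.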
Qualifications:
* 3+ years of experience in MLOps, data science engineering, or AI platform roles.
* Strong programming skills in Python, with solid expertise in PySpark.
* Deep knowledge of Spark and Databricks for big data processing and scalable AI.
* Proficient in SQL, including complex joins and performance tuning.
* Experience operationalizing AI models in production (batch and real-time).
* Working knowledge of MLflow, Docker, Kubernetes, and cloud-native services (preferably Azure); a minimal MLflow sketch follows this list.
* Proven experience implementing and managing data drift and model drift detection using statistical and AI-based methods.
* Familiarity with insurance data domains (e.g., claims, underwriting, loss ratio, customer churn).
* Understanding of data governance, model risk management, and compliance in regulated industries.
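For context on the MLflow expectation above, here is a minimal sketch of training a toy model, logging it, and registering it so a CI/CD pipeline could promote it. The experiment name, registered model name, and scikit-learn estimator are hypothetical placeholders, and it assumes an MLflow tracking server with a model registry is available.

```python
# Minimal MLflow packaging sketch: train a toy classifier, log a metric,
# and register the model for downstream deployment (names are illustrative).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("claims-fraud-demo")  # hypothetical experiment name

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_metric("test_auc", auc)

    # Registering requires a tracking server with a model registry configured;
    # a deployment job could later load the model by name and stage/alias.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="claims-fraud-classifier",  # hypothetical registry name
    )
```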
Nice to have:
* Experience with Delta Lake, Unity Catalog, or feature stores.
* Knowledge of data mesh, event-driven architectures, or real-time streaming.
* Familiarity with actuarial modeling, telematics, or fraud analytics.
* Certifications in Azure, Databricks, or AI tooling.