Eating the elephant, one iteration at a time
Estimating Value at Risk for an IFRS17 Insurance Portfolio via Monte Carlo Simulation using SAS Viya
Supporting IFRS17 portfolio cash flow modeling and simulation
Corios has been busy lately supporting our client’s actuaries, who are implementing the IFRS17 standard on their set of insurance portfolios. The purpose of this engagement is to better estimate the Value at Risk (VaR) on their portfolios’ liability for remaining coverage (LRC) and liability for incurred claims (LIC). LIC in turn includes both the liability for claims reported but not fully reserved, and the liability for claims incurred but not yet reported. The approach we and our client are following uses a cash flow projection analysis that is conceptually similar to the way our banking clients model future cash flows for secured and unsecured lending portfolios.
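The cash flow projection approach can be illustrated with a minimal Monte Carlo sketch. This is Python rather than SAS Viya, and every distributional assumption here (lognormal monthly claim outflows, a flat discount rate, the parameter values) is invented for illustration, not drawn from the client's actual model:

```python
import numpy as np

def simulate_portfolio_var(n_sims=100_000, n_periods=12, alpha=0.99, seed=42):
    """Monte Carlo sketch of Value at Risk on projected liability cash flows.

    Illustrative assumptions (not the client's model):
    - each period's net claim outflow is lognormally distributed,
    - a flat annual discount rate applies across the projection horizon.
    """
    rng = np.random.default_rng(seed)
    discount_rate = 0.03          # flat annual rate (assumed)
    mu, sigma = 13.0, 0.4         # lognormal parameters (assumed)

    # Simulate monthly cash outflows: shape (n_sims, n_periods)
    cash_flows = rng.lognormal(mean=mu, sigma=sigma, size=(n_sims, n_periods))

    # Discount each period's outflow back to present value
    t = np.arange(1, n_periods + 1) / 12.0
    pv = cash_flows * (1.0 + discount_rate) ** (-t)
    pv_liability = pv.sum(axis=1)  # PV of the liability in each scenario

    # VaR at level alpha: the liability amount not exceeded in alpha of
    # scenarios, measured as an excess over the expected liability
    expected = pv_liability.mean()
    var_alpha = np.quantile(pv_liability, alpha) - expected
    return expected, var_alpha
```

In practice the cash flow model per scenario would reflect the portfolio's actual claim development and discounting assumptions; the simulate-discount-take-a-quantile structure is the part that carries over.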
Five D’s of Analytic Model Deployment
Moving your models from the lab to the field for business impact
The Challenges: Business Adoption of Analytic Models
Increasing business adoption of analytic models requires a great deal of work beyond model development itself, and that work extends well beyond the model development team.
- First, businesses need to establish connections between model scores and business decisions. This connection usually takes place outside the analytics team building the model.
- Second, the data structures and systems model developers use for building models often differ from those used in production. Adapting the model asset for production should account for these differences.
- Third, businesses must be able to easily interpret, assess, and catalogue the model scores and the changes in scores over time on an ongoing basis.
- Fourth, deploying and executing these models in a production information technology environment and in the field requires diligence, planning, design, execution, and quality assurance practices that model developers do not commonly adopt.
The Five Ds of Model Deployment
The purpose of this chapter is to provide a set of best practices for analytic model deployment, organized into five phases that we’ve nicknamed the “Five Ds”:
- Develop: Developing and packaging models
- Decisions: Tying operational business decisions to model scores
- Data: Operationalizing analytic model deployment in a specific data architecture
- Delta: Monitoring the workflow and numeric performance of analytic models in the field
- Deploy: Implementing analytic models via a software development life cycle
Model governance checks: Stability, Performance and Calibration
Benchmarks for CCAR and IFRS17 practitioners
Some enterprises build a formal model governance practice in order to comply with industry standards such as CCAR (banking industry) or IFRS17 (insurance industry). Others simply recognize that a sound predictive model governance discipline improves the quality of their business decisions. Here are some well-tested practices for ensuring three pillars of model governance: Stability, Performance and Calibration.
- Stability: Can I rely on the process that generates my enterprise data as stable and believable?
- Performance: Can I predict the difference between good and bad outcomes, or between high and low losses?
- Calibration: Can I make those predictions accurately?
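Each of these three pillars can be made concrete with a simple statistic. The sketch below is illustrative Python using standard rules of thumb, not formulas taken from any specific CCAR or IFRS17 guidance: a Population Stability Index (PSI) for stability, a rank-based AUC for performance, and a predicted-versus-observed gap for calibration.

```python
import numpy as np

def psi(baseline, recent, bins=10):
    """Stability: Population Stability Index between a baseline score
    sample and a recent one. Common rule of thumb (assumed, not from
    any standard): < 0.10 stable, 0.10-0.25 watch, > 0.25 investigate."""
    edges = np.quantile(baseline, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf  # catch out-of-range scores
    b_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    r_pct = np.histogram(recent, bins=edges)[0] / len(recent)
    b_pct = np.clip(b_pct, 1e-6, None)     # avoid log(0)
    r_pct = np.clip(r_pct, 1e-6, None)
    return float(np.sum((r_pct - b_pct) * np.log(r_pct / b_pct)))

def auc(scores, outcomes):
    """Performance: rank-based AUC, the probability that a random 'bad'
    (outcome 1) scores higher than a random 'good' (outcome 0).
    Assumes continuous scores; ties are broken arbitrarily."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = outcomes.sum()
    n_neg = len(outcomes) - n_pos
    pos_rank_sum = ranks[outcomes == 1].sum()
    return float((pos_rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))

def calibration_gap(pred_prob, outcomes):
    """Calibration: predicted event rate minus observed event rate."""
    return float(pred_prob.mean() - outcomes.mean())
```

In a governance practice these would be computed on a schedule for every model in the inventory, with alerting when a check drifts past its threshold.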
Test and learn: a virtuous cycle
Continuous evaluation and improvement of models
Turning your predictive analytics practices into a thriving model factory requires closing the loop from model creation, to performance monitoring, to problem and opportunity identification and subsequent model improvement, in a formal and disciplined way. This includes continuous measurement of each model, and creating a virtuous cycle of model improvement through portfolio management of your models.
Continuous measurement of model performance is about drawing insights from model performance, and identifying opportunities for model enhancement. You can find these opportunities by stratifying the customer population to isolate pools of customers who behave differently with respect to the predicted outcome; tuning the model for specific pools of risks that contribute to the predictive outcome in different ways; and adding new predictive drivers to the model to help address customer behavior not being explained effectively by the current model.
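The stratification step can be sketched simply: given hypothetical segment labels (the segment definitions here are assumed for illustration), compare event rates and discrimination segment by segment to find pools the current model explains poorly.

```python
import numpy as np

def segment_performance(scores, outcomes, segments):
    """For each customer segment, report sample size, observed event rate,
    and a rank-based AUC, to spot pools where the model under-performs.
    Illustrative sketch; assumes continuous scores and binary outcomes."""
    results = {}
    for seg in np.unique(segments):
        mask = segments == seg
        s, y = scores[mask], outcomes[mask]
        n_pos = y.sum()
        n_neg = len(y) - n_pos
        if n_pos == 0 or n_neg == 0:
            continue  # AUC undefined with only one outcome class
        order = np.argsort(s)
        ranks = np.empty(len(s))
        ranks[order] = np.arange(1, len(s) + 1)
        a = (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
        results[seg] = {"n": int(mask.sum()),
                        "event_rate": float(y.mean()),
                        "auc": float(a)}
    return results
```

A segment whose AUC sits near 0.5 while others sit well above it is a candidate for pool-specific tuning or for new predictive drivers.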
Top Ten data science talent challenges
Successful talent recruitment, growth and retention
In the thirty-plus years I’ve worked with analytics organizations, management has faced several recurring challenges. Without an active (even if informal) practice of watching for and reacting to these challenges, management will find their organizations unable to cope with an ever-rising flood of requests; coping with that flood requires a capable, agile, powerful, happy and healthy analytics team. These Top Ten challenges are sorted in increasing order of strategic impact (i.e., from least to most), and in order of difficulty of detection and prevention (i.e., easiest to most difficult).