Corios and Databricks partner on roadmap to cloud of choice

Corios Rosetta eases legacy SAS analytics migration to the Databricks Lakehouse Platform

Ahead of next week’s Data and AI Summit, Corios announces a new partnership with event host Databricks, the Data and AI company. With the Corios Rosetta solution, Corios will now offer enterprises a proven migration path for legacy SAS analytics into the modern cloud of their choice on the Databricks Lakehouse Platform.

Read More

Maximize automation value to restore the human perspective

Apply automation in risk analytics platforms to reduce manual data preparation and technical debt

Retain the human perspective on data for improved analytics insights

Risk analytics will always be an environment of quickly changing demands. The unpredictability and volatility of external factors, and the need for rapid, agile adaptation, work against the need to produce accurate and informed forecasts. Delivering those forecasts often forces enterprises to revert to manual interventions in their input data preparation processes.

But wait a minute: why are we still wasting human effort on hand-tooled workarounds when we would rather apply our passion, talents, and unique institutional knowledge to make the most of our analytics assets? Sadly, the time we think we save with quick adjustments only introduces human error and adds up to technical debt, with processes over-managed at the expense of real insight.

Read More

Compliance Analytics Gaps: How to Fix Bad History with Novel Data Bounding

Close Gaps in Compliance Analytics with Data Bounding

It feels like the stakes for predicting and managing risk rise with each passing year. Does the universe need to keep dishing out market, environmental and health crises while regulated industries are in the midst of massive digital transformation and analytic platform modernization?

The insurance industry is facing down the compliance deadline for the FASB’s Long-Duration Targeted Improvements (LDTI). Public carriers should be on the cusp of the final transition to running the new models they developed over the past two years.

Similarly, most financial institutions should have their Current Expected Credit Loss (CECL) compliance transitions in the rearview mirror. However, many credit unions have yet to reach key milestones toward implementing the new credit standard.

And for private and smaller entities just now converting new roadmaps and processes into trial implementations, what are the lessons learned from their larger counterparts to help smooth the process?

Timing, business considerations and the scope of modernization may vary among companies impacted by new regulatory standards, but there is one unavoidable similarity: all will surface analytics gaps. Deadlines can’t be missed, but issues must be addressed. All that’s left is determining how and when.

Read More

Go Back, Jack, Do It Again

Rosetta Academy Lessons in Practice: Design Patterns and the Anatomy of the Analytics Workload

(Credits to Steely Dan for the title reference)

Go back and try a second model, with a broader view and expanded skills, and you might just find a fresh and forward-looking method for analytics in the modern cloud enterprise. The anatomy of analytics workloads and the application of foundational SAS and SQL skills in the land of open-source languages reveal a wider perspective on what’s possible with legacy SAS assets. Here’s a story about a Corios client that gained both training and a new point of view on modeling with a dual approach and an eye toward innovation.

Building a model pipeline twice in both conventional and open-source contexts

Corios was hired by a rapidly growing bank to build the newest release of their prospect acquisition scorecard model, not once but twice: first in SAS 9.4 (their production environment), and a second time in an open-source approach that leveraged Python, Dask and Spark.

The reason for the second modeling effort was to explore what an innovative, modern cloud-focused analytic environment could and should look like to support predictive model lifecycle management: authoring, champion/challenger experimentation, validation, version management, cloud deployment, drift analysis and ongoing refresh.
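For a sense of what that second pipeline’s shape might look like, here is a minimal sketch of a prospect-scoring workflow in PySpark. Everything specific in it (the file path, column names, and the logistic regression itself) is an illustrative assumption, not the client’s actual scorecard.

```python
# Minimal challenger-pipeline sketch in PySpark; names and path are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import StandardScaler, VectorAssembler
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("scorecard-sketch").getOrCreate()

# Hypothetical prospect file with a binary response label.
prospects = spark.read.parquet("s3://example-bucket/prospects.parquet")
train, test = prospects.randomSplit([0.7, 0.3], seed=42)

# Assemble and scale candidate predictors, then fit the challenger model.
pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["age", "income", "balance"],
                    outputCol="raw_features"),
    StandardScaler(inputCol="raw_features", outputCol="features"),
    LogisticRegression(labelCol="responded", featuresCol="features"),
])
model = pipeline.fit(train)

# Score the holdout and compare against the champion's benchmark AUC.
auc = BinaryClassificationEvaluator(
    labelCol="responded", metricName="areaUnderROC"
).evaluate(model.transform(test))
print(f"challenger AUC: {auc:.3f}")
```

A pipeline expressed this way can be saved, versioned, redeployed and re-scored as new data arrives, which is what makes the champion/challenger and drift-analysis stages of the lifecycle practical to automate.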

Two Pipelines: Breaking in a New Model

Building the first, traditional model pipeline was familiar territory for us: we had built several models for this client that had gone into production over the past few years. The greatest challenge was the high bar set for the model’s mathematical performance, since we had to beat the current version, which was effectively constructed and performed strongly.

The second model broke a lot of new ground for the client. Major elements included:

  • Amazon Web Services clustered compute, storage, code development and management
  • Python, Dask and Spark as open-source frameworks for analytic pipeline development
  • Side-by-side comparisons of analytics assets built in familiar territory (SAS) and unfamiliar territory (open-source frameworks on cloud services)
  • Novel analytics techniques (with new performance contributions) made available by open-source frameworks for the first time in a native, business-critical context

But what was the basis for the new framework to arrive at this breakthrough model? At its core is the Analytics Design Patterns course that is part of the Corios Rosetta Academy curriculum.
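To give a flavor of what such a design pattern looks like in practice, here is a small, hypothetical example of one foundational mapping: SAS BY-group summarization expressed as a pandas groupby in Python. The dataset and column names are invented for illustration and are not taken from the course material.

```python
# Hypothetical pattern mapping: SAS BY-group summarization -> pandas groupby.
import pandas as pd

accounts = pd.DataFrame({
    "region":  ["west", "west", "east", "east"],
    "balance": [120.0, 80.0, 200.0, 150.0],
})

# Rough SAS equivalent:
#   proc means data=accounts sum mean;
#     class region;
#     var balance;
#   run;
summary = (
    accounts.groupby("region")["balance"]
            .agg(total="sum", average="mean")
            .reset_index()
)
print(summary)
```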

Read More

Picking Up the Jellyfish

Modernizing analytics practices for 800+ insurance analysts

We recently participated in the FIMA Boston conference, one of the many springtime financial services industry events showcasing data, analytics and AI innovations on the digital transformation odyssey. The single loudest takeaway from these events, nearly a decade into this journey, is that it’s a journey with no end.

No longer viewed as a destination, enterprise analytics transformation is a virtuous cycle of data decisioning and prediction, governance and security that drives greater transparency and fluidity in the pursuit of analytics excellence in the cloud. Yet despite the jellyfish-like squishiness and the uncertain risk of getting stung, there’s more optimism and clarity to be found in an increasingly well-worn path for modernizing data analytics assets.

A look back at our work so far with customers in insurance and financial services highlights striking insights from their legacy SAS asset transformations. In one assessment alone we discovered that only 30% of analysts were actively developing on the platform, that a portion of those were exporting data rather than leveraging it directly in the warehouse, and that nearly 20% created major security risks by placing plain-text passwords in their code.

That highlights our second big takeaway from discussions with the C-suite on down the command line: it’s beyond time to get hands-on and transform workloads that are increasingly at odds with the prevailing enterprise data strategy in the cloud. Now that the broad infrastructure and processes are in place, all eyes and budgets must focus on decades-old methods and platforms, like your legacy SAS workloads, that today mostly encumber the people and processes tied to their analytics advantage.

Finding the edges of the jellyfish without getting stung

Corios was hired by a prominent insurance carrier to modernize their analytics and data practices for all things analytical: underwriting, pricing, claims, repairs, coverage, compliance, and regulatory support. They wanted to reduce the cost of data storage, to align all their analysts on a consolidated set of tools and environments, and to modernize the enterprise so they could react to climate events and other large-scale adverse events faster and more efficiently.

Read More

What’s in your SAS code? Find out before you modernize

“I have 4,000 SAS users and 7 million lines of SAS code, and nobody knows what it all does.”

These are the words of the Chief Data Officer of one of our insurance industry clients.

And the CDO isn’t the only one in the C-suite interested in discovering what’s in their SAS code. It’s a must-know for any data and analytics transformation that aims to leverage open-source and cloud investments. Our key stakeholders at companies across banking, insurance, manufacturing and energy utilities are all asking the same question:

How do we tackle SAS modernization as part of our digital transformation?

But before you can answer the ‘what they do’ question about your SAS workloads, it’s valuable to know the scope and depth of SAS in your enterprise and the impact all that code is having on your operation. Some wonder whether this discovery is even possible; we know the key is establishing a ground-truth baseline of existing SAS assets – information that is often hiding in plain sight.
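As a hypothetical illustration of what establishing that kind of baseline might involve (this sketch is not the Rosetta Scanner itself), the Python below inventories line counts across a tree of .sas files and flags lines that look like hard-coded credentials; the path and the regex are assumptions made for the example.

```python
# Hypothetical baseline inventory over SAS source files; not the Rosetta Scanner.
import re
from pathlib import Path

# Flags common spots for hard-coded credentials in SAS code, e.g.
# PASSWORD= on a LIBNAME or PROC SQL CONNECT statement. Illustrative only.
PASSWORD_RE = re.compile(r"\b(password|pwd|pw)\s*=", re.IGNORECASE)

def scan_sas_tree(root: str) -> None:
    total_lines, flagged = 0, []
    for path in Path(root).rglob("*.sas"):
        lines = path.read_text(errors="ignore").splitlines()
        total_lines += len(lines)
        for number, line in enumerate(lines, start=1):
            if PASSWORD_RE.search(line):
                flagged.append(f"{path}:{number}")
    print(f"{total_lines:,} lines of SAS code scanned")
    print(f"{len(flagged)} lines with possible plain-text credentials")
    for hit in flagged:
        print(f"  {hit}")

scan_sas_tree("/path/to/sas/code")  # hypothetical repository location
```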

Corios Rosetta Scanner helps clarify SAS usage for the C-suite

In the summary video below (approximately 13 minutes), I’ll walk you through concrete examples of Corios Rosetta Scanner tactics we use to quantify and qualify the value of your SAS analysts, data and workloads. Here’s a brief snapshot of key takeaways for the CIO, CDO, CCO and CAO:

Read More