Property Casualty 360, May 5, 2014 | Michael P. Voelker
It’s safe to say that you won’t find any punch-card readers in use at insurance companies anymore, and you’d probably be hard-pressed to even find a typewriter. Those two machines were instrumental to record keeping and data processing at insurers for decades but, ultimately, they made way for a new generation of business technology.
Every technology in place at insurers today will be obsolete some day; however, many companies aren’t willing to give up on their legacy systems just yet.
“The truth is, there are many companies that have been chugging along for decades with back-room systems written in RPG or COBOL. You don’t talk about it at conferences because it’s not ‘cool,’ but it’s the reality of the industry,” says Gregory M. Lawler, vice president of information technology at Government Personnel Mutual Life Insurance (GPM Life).
“Especially at smaller to medium insurers, sometimes it is just practical to retain legacy technology. Given the market they are in, the business they want, and the fact that a current system represents sunk cost and does what they want and they know how to support it, the decision makes sense,” says Frank Petersmark, CIO advocate at architectural consultancy X by 2.
Surprisingly, cost is not the main inhibitor to technology modernization. “We find the most commonly stated reason why insurers are not migrating to a new platform is that they feel the risk of replacement outweighs the benefits,” says Martina Conlon, principal in Novarica’s insurance practice.
The risk lies in trading the known for the unknown. Systems, particularly core platforms, contain business rules that reflect decades of company history, records of products and customers, data that can be mined and other valuable assets.
“Companies worry that new solutions on the market won’t meet what they need, won’t meet their needs quickly enough, or will come with new problems of their own,” Conlon says. “In some cases, organizations also have a heightened sense of the risk involved with the transformation because they’ve had failed projects, problems with vendor management, or other issues that have derailed previous replacement projects.”
A Solid Core
When Lawler was CIO at his previous employer, SWBC, he annually assessed whether to replace the company’s insurance tracking system, which was the core support for SWBC’s collateral protection insurance business.
“Every year we evaluated the system, and every year we came to the same conclusion—the code was working fine,” says Lawler. “It would have cost between $10 and $15 million to replace that system, and we would not have sold one more policy if we did. Therefore, we were going to ride it as long as we possibly could.”
In business for more than 130 years and writing a full array of personal and commercial lines products, Michigan Millers has accumulated a collection of core processing and ancillary systems. “We have deployed modern technology wherever we can, although we have a few dinosaurs,” says Maria Jasinski, vice president of IT at Michigan Millers.
One of those dinosaurs is its AS/400-based commercial policy administration system. Although the company replaced its personal lines system in 2009, Michigan Millers is retaining the commercial platform for now.
“The two lines are different in that automation is a much more important part of personal lines than commercial, and we needed to modernize personal lines to achieve that automation,” Jasinski says. “Our strategy in all lines of insurance is built on leveraging strength of relationships with agencies, and our current commercial lines system can support that based on the changes we’ve made.”
Those changes include a new web front end for commercial lines, which Michigan Millers deployed in 2010 for use by both agents and internal staff. “Having drop-down lists, automated class code lookups, underwriter-agent communication, and other features increased the usability of the system,” Jasinski says. “From our agents’ perspective, they are getting responses faster and reducing the manual work they or their staff needs to do.”
Michigan Millers is also in the process of adding an underwriting rules engine, scheduled for completion in July of this year, to target straight-through processing of smaller commercial accounts. The initial goal is for 50% of accounts to pass through without underwriting intervention.
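To make the straight-through-processing idea concrete, here is a minimal sketch of the kind of eligibility check such a rules engine applies. The rule names, thresholds, and class codes below are invented for illustration; they are not Michigan Millers’ actual underwriting rules.

```python
# Hypothetical straight-through-processing (STP) check: an account with
# no triggered rules skips human underwriting. All thresholds are
# illustrative assumptions, not any carrier's real rules.

ELIGIBLE_CLASSES = {"0042", "0917", "5183"}  # made-up STP-eligible class codes

def needs_underwriter_review(account: dict) -> list:
    """Return the reasons an account must be routed to an underwriter.

    An empty list means the account can pass straight through.
    """
    reasons = []
    if account.get("total_premium", 0) > 25_000:   # larger accounts get a human look
        reasons.append("premium above STP threshold")
    if account.get("prior_losses", 0) > 1:         # loss history triggers review
        reasons.append("multiple prior losses")
    if account.get("class_code") not in ELIGIBLE_CLASSES:
        reasons.append("class code not STP-eligible")
    return reasons

small_account = {"total_premium": 8_000, "prior_losses": 0, "class_code": "0917"}
print(needs_underwriter_review(small_account))  # → []
```

Because each rule is independent, new rules can be added (or thresholds tuned) without touching the routing logic, which is what lets a carrier ratchet the pass-through percentage up over time.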
A Legacy of Challenges
Keeping legacy technology does have its share of challenges. “On the negative side, they’re not making any more RPG or COBOL programmers. That is the hardest challenge to solve,” Lawler says.
The technology industry is trying. IBM has partnered with dozens of colleges, providing COBOL curricula and donating hardware for training. However, even with those efforts, there is likely to be a growing shortage of COBOL programmers.
“We do have a greying of the workforce,” Conlon says. “Insurers need to address talent management to be sure they have the people on staff they need, whether it’s training people internally or partnering with third-party technology resources. Insurers also need to address the fact that if they are maintaining older code, they may have a challenge attracting workers who want to work with newer technology.”
No matter how well a legacy system is designed, there are technical limits that companies also need to address.
“We’ve definitely dealt with some compatibility issues that have limited our ability to integrate between the front end and back end,” Jasinski says. “As a result, we’ve had to limit the functionality of the front end, such as putting in new business processing functionality but not endorsement processing. There are many more data fields that need to be mapped with endorsements, so we run up against more integration issues that would be much easier to handle with a modern, object-oriented architecture.”
There are other challenges a company faces the longer it keeps a legacy system. GPM Life faced a licensing issue.
“We’ve been running the Genelco back-office system for 25 years, and the license runs out next year,” Lawler says. GPM Life is in the process of upgrading to a Java-based version of the system from Concentrix, a process that is complicated by the fact that the insurer extensively customized or changed the original base code over 3,000 times in the last 25 years.
The company decided to rewrite all 3,000 custom changes itself, hired a Java programmer to train its RPG staff in Java, and spent the past year “going over all the code in excruciating detail,” Lawler says. The company plans to complete the rewrite by December 1 to allow testing before a 2015 deployment.
How long a company can keep a system depends on the system. “Generally, if a solution is maintainable and flexible, and if users can get access to the data they need, then it will have a longer life in an organization compared to one that is not designed well, not well documented, and a bear to maintain,” Conlon says.
“Well-designed systems have databases in accessible formats with a normalized view of data, even if that view is in flat files. Applications that are data-driven are often more flexible, so even if the system is written in COBOL, you can change the behavior of the system by changing the data in the database rather than rewriting code, which helps address speed-to-market and other objectives,” she adds.
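A small sketch of the data-driven pattern Conlon describes: rating behavior lives in a table (here a Python dict standing in for database rows), so a product or rate change becomes a data update rather than a code change. The product names, states, and factors are made up for illustration.

```python
# Data-driven rating sketch: the same code path serves every
# product/state combination, and behavior changes when the table
# changes. Values below are illustrative assumptions only.

RATE_TABLE = {
    ("BOP", "TX"): 1.15,
    ("BOP", "MI"): 1.05,
    ("WC",  "TX"): 1.30,
}

def rate_premium(base_premium: float, product: str, state: str) -> float:
    # Supporting a new state means inserting a row in RATE_TABLE,
    # not rewriting or redeploying this function.
    factor = RATE_TABLE[(product, state)]
    return round(base_premium * factor, 2)

print(rate_premium(1000.0, "BOP", "MI"))  # → 1050.0
```

The same idea applies whether the host language is COBOL or Java: as long as the lookup is external to the code, speed-to-market is limited by how fast the data can be changed, not by a release cycle.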
Unlike the all-or-nothing switch required when leaving the proverbial punch-card machine behind, some insurers have adopted incremental modernization as a strategy.
“Companies start to look at code, eliminate redundancies, move data out of code and into database files, and clean the data. They externalize the rating engine. Over time, they can break the legacy up into little pieces and work through modernization in stages,” observes Jeffrey Scott Haner, principal research analyst in Gartner’s insurance industry advisory services.
“Insurers tend to do at least two common things to keep systems running,” Petersmark says. “One, almost everyone has found a way to unbundle the data from the transactional processing system and put it in a warehouse repository, operational data store, or some way to make the data more usable and clean it up. That is a good life extender because, on the business side, what they want is information out of whatever system they are dealing with. Two, they always do something with the user interface to make it more modern.”
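The “unbundle and clean” life-extender Petersmark describes can be illustrated with a toy extract-and-load step: records are copied out of a legacy fixed-width file into a relational operational data store, with cleanup applied on the way in. The field layout, names, and sample records here are invented for illustration.

```python
# Toy version of unbundling data from a transactional system into an
# operational data store: parse a fixed-width legacy extract, clean the
# fields, and load them into a queryable relational table.
import sqlite3

LEGACY_EXTRACT = [
    "P001JONES      MI2014",   # policy id, insured name, state, year
    "P002smith      mi2013",   # note the inconsistent casing to clean up
]

def parse_record(line: str) -> tuple:
    # Slice the fixed-width fields and normalize casing/whitespace.
    return (line[0:4], line[4:15].strip().title(), line[15:17].upper(), int(line[17:21]))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policy (id TEXT PRIMARY KEY, insured TEXT, state TEXT, year INT)")
conn.executemany("INSERT INTO policy VALUES (?,?,?,?)", map(parse_record, LEGACY_EXTRACT))

# Business users can now query the cleaned data without touching the
# legacy transactional system at all.
rows = conn.execute("SELECT insured, state FROM policy ORDER BY id").fetchall()
print(rows)  # → [('Jones', 'MI'), ('Smith', 'MI')]
```

Once the data lives in an accessible store like this, reporting and analytics can evolve independently of the legacy system, which is exactly the life-extending effect described above.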
Michigan Millers bolted on a rules engine to its commercial lines platform and integrated the new front end with the company’s document management system to achieve greater workflow automation. This type of approach helps reduce the risk of replacement that many companies fear and also puts business objectives at the forefront.
“We see companies taking a measured approach to legacy modernization, starting with asking what their business goals are, whether they are getting the functionality they need out of current systems to achieve those goals, and how to move forward from there,” Haner says. “They may see, for instance, that the functionality of a system is in pretty good shape, but the underlying technology is dated to the point of being unstable, or that they don’t have the skills to maintain it. We’ve seen a definite shift to a business-first approach to code transformation over the past few years.”
Reaching the Limit
However, just like punch cards and manual typewriters, there will come a time when any legacy—or current—technology needs to be replaced.
“Unfortunately, we are getting close to hitting the limit of how far we can take our technological capabilities without making dramatic changes. There will come a point where we will need a new platform that offers greater flexibility and easier functionality enhancement. There are also financial savings on the IT side that we could gain from new technology due to easier system maintenance,” Jasinski says.
“Right now we can meet our speed-to-market strategy, but as we continue to grow, we will eventually need to change,” she adds. “I would say in the next three years, we will need to make some serious decisions.”