Tag Archives: manufacturing

A Manufacturing Risk: the Talent Gap

Twenty-five years ago, labor experts warned employers about an impending shortage in the skilled manufacturing workforce caused by the soon-to-be-departing baby boomers. Almost no one listened.

The few employers who did listen realized that preparation meant investing in training. Investment = money, so many employers put it off, especially during the Great Recession of 2008-2010.

So here we are, America … needing to fill 3.5 million manufacturing jobs in the next 10 years, according to the Deloitte publication “The Skills Gap in U.S. Manufacturing 2015 & Beyond.”

Deloitte opines that we’ll be lucky to fill 1.5 million of those openings, leaving a gap of 2 million jobs. This potential shortfall didn’t go unnoticed by Daimler Trucks North America (DTNA), a manufacturer of class 5-8 commercial vehicles, school buses, and heavy-duty to mid-range diesel engines. The company saw this bullet coming years ago.

See also: Insurance And Manufacturing: Lessons In Software, Systems, And Supply Chains  

To those in the know, the skilled workforce shortage conundrum isn’t new. As far back as 1990, the National Center on Education and the Economy identified this job shortfall in its report, “The American Workforce – America’s Choice: High Skills or Low Wages,” stating that large investments in training were needed to prepare for slow workforce growth.

If you look at the burgeoning skills gap, coupled with vanishing high school vocational programs, how, as an employer, do you recruit potential candidates?

DTNA addresses the issue by reaching out to high schools throughout the U.S. via the Daimler Educational Outreach Program, which focuses on giving to qualified organizations that support public high school educational programs in STEM (science, technology, engineering and math), CTE (career technical education) and skilled trades career development.

Daimler also works in concert with school districts to conduct week-long technology schools in one of its manufacturing facilities, all in an effort to encourage students to consider manufacturing (either skilled or technical) as a vocation.

Like all forward-looking companies, Daimler must address the needs of the millennials who – among a number of their desires – want to make the world a better place. Jamie Gutfreund, chief strategy officer for the Intelligence Group, notes that 86 million millennials will be in the workplace by 2020 — representing 40 percent of the total working population.

To not address the millennials’ employer predilections is to miss an opportunity to tap into a vast resource of potential talent. To that end, Daimler has always emphasized research in renewable resources and community involvement as well as a number of philanthropic endeavors. Not only is it the right thing to do, but it also appeals to the much-needed next generation who will fill the boots of the exiting boomers.

See also: 4 Steps to Integrate Risk Management  

Just because a company manufactures heavy-duty commercial vehicles doesn’t mean it can’t give back to the environment and the community at large. And, in the end, that will help make the world a better place.

The Cyber Threat in Manufacturing

A friend of mine asked me if the cyber-risk threat was a bit of flimflam designed to sell more insurance policies. He compared cyber-risk to the Red Scare of the 1950s, when families scrambled to build bomb shelters to protect themselves from a war that never came. The only ones who got rich back then were the contractors, he concluded.

I found his question incredible. But I realized that he didn’t work in the commerce stream, per se, which quelled my impulse to slap him around.

See also: 3 Things on Cyber All Firms Must Know  

I shared with him some statistics that sobered him up quickly. I explained that cyber-crime costs the global economy more than $400 billion per year, according to estimates by the Center for Strategic and International Studies. Each year, more than 3,000 companies in the U.S. have their systems compromised by criminals. IBM reports more than 91 million security events per year. Worse yet, the Global Risks 2015 report, published in January by the World Economic Forum (WEF), included this rather stark warning: “90% of companies worldwide recognize they are insufficiently prepared to protect themselves against cyber-attacks.”

Cyber protection is not just about deploying advanced cyber-threat technology to manage risk; you also have to educate your employees not to fall victim to deceptively simple scams like “phishing,” the theft of private information via e-mail or text message. Phishing remains the most popular con for stealing company data because it’s so painfully simple: just pretend to be someone else and hope a few people fall for it.

While most people understand the threat to data privacy for retailers, hospitals and banks and other financial institutions, few realize that manufacturers are also vulnerable in terms of property damage and downtime. In 2014, a steel manufacturing facility in Germany lost control of its blast furnace, causing massive damage to the plant. The cause of the loss was not employee error, but rather a cyber-attack. While property damage resulting from a cyber-attack is rare, the event was a wake-up call for manufacturers worldwide.

According to The Manufacturer newsletter, “the rise of digital manufacturing means many control systems use open or standardized technologies to reduce costs and improve performance, employing direct communications between control and business systems.” This exposes vulnerabilities previously thought to affect only office computers. In essence, according to The Manufacturer, cyber attacks can now come from both inside and outside of the industrial control system network.

See also: Now Is the Time for Cyber to Take Off  

Manufacturers also need to be concerned about cyber attacks that would: a) interrupt their physical supply chain or b) allow access to their systems via a third-party vendor. Manufacturers must then take steps to mitigate those risks. When Target and Home Depot were hacked several years ago, it wasn’t a direct attack on them but an attack on one of their third-party vendors. By breaching the vendors’ weak cyber security, the criminals were able to access the larger prize.

To circle back to my friend’s weird fallout-shelter theory, it’s certainly a good idea to have a backup plan in case one is hit by a proverbial “cyber-bomb.” But rather than hunker down and wait for the attack to occur, it’s critical to educate employees, vet vendors’ cyber-security and adopt — and continuously optimize — a formal cybersecurity program.

Q4 Economic and Investment Outlook

Although it may not seem like it, in the second quarter of this year the U.S. economy passed into the beginning of its seventh year of expansion. In the 158 years that the National Bureau of Economic Research (the arbiters of “official” U.S. economic cycles) has been keeping records, ours is now the fifth-longest economic cycle, at 75 months. For fun, when did the longest cycles occur, and what circumstances characterized them? Is there anything we can learn from historical perspective about what may lie ahead for the current cycle?

The first cycle longer than the current, by only five months, is the 1938-1945 U.S. economic expansion cycle. Of course, this was the immediate post-Depression recovery cycle. What preceded this cycle, from 1933-1937, was the bulk of FDR’s New Deal spending program, a program that certainly rebuilt confidence and paved the way for a U.S. manufacturing boom as war on European and Japanese lands destroyed their respective manufacturing capabilities for a time. More than anything, the war-related destruction of the industrial base of Japan and Europe was the growth accelerant of the post-Depression U.S. economy.

In historically sequential order, the U.S. economy grew for 106 months between 1961 and 1970. What two occurrences surrounded this economic expansion that were unique in the clarity of hindsight? A quick diversion. In 1946, the first bank credit card was issued by the Bank of Brooklyn, called the “Charge-It” card. Much like American Express today, the balance needed to be paid in full monthly. We saw the same thing when the Diners Club Card became popular in the 1950s. But in 1958, both American Express and Bank of America issued credit cards to their customers broadly. We witnessed the beginning of the modern day credit culture in the U.S. economic and financial system. A support to the follow-on 1961-1970 economic expansion? Without question.

Once again in the 1960s, the influence of a major war on the U.S. economy was also apparent. Lyndon Johnson’s “guns and butter” program increased federal spending meaningfully, elongating the U.S. expansion of the time.

The remaining two extended historical U.S. economic cycles of magnitude (1982-1990, at 92 months, and 1991-2001, at 120 months) both occurred under the longest bull market cycle for bonds in our lifetime. Of course, a bull market for bonds means interest rates are declining. In November 1982, the 10-year Treasury sported a yield of 10.5%. By November 2001, that number was 4.3%. Declining interest rates from the early 1980s to the present constitute one of the greatest bond bull markets in U.S. history. The “credit cycle” spawned by two decades of continually lower interest rates very much underpinned these elongated growth cycles. The question being, at the generational lows in interest rates that we now see, will this bull run be repeated?

So fast-forward to today. What has been present in the current cycle that is anomalous? Pretty simple. Never in any U.S. economic cycle has federal debt doubled, but it has in the current cycle. Never before the last seven years had the Federal Reserve “printed” more than $3.5 trillion and injected it into U.S. financial markets. Collectively, the U.S. economy and financial markets were treated to more than $11 trillion of additional stimulus, a number that totals more than 70% of current annual U.S. GDP. No wonder the current economic cycle is pushing historical extremes in terms of longevity. But what lies ahead?

As we know, the U.S. Fed has stopped printing money. Maybe not so coincidentally, in recent months macroeconomic indicators have softened noticeably. This is happening across the globe, not just in the U.S. As we look forward, what we believe most important to U.S. economic outcomes is what happens outside of the U.S. proper.

Specifically, China is a key watch point. It is the second-largest economy in the world and is undergoing not only economic slowing, but the very beginning of the free floating of its currency, as we discussed last month. This is causing the relative value of its currency to decline against global currencies. This means China can “buy less” of what the global economy has to sell. For the emerging market countries, China is their largest trading partner. If China slows, they slow. The largest export market for Europe is not the U.S., it’s China. As China slows, the Euro economy will feel it. For the U.S., China is also important in being an end market for many companies, crossing industries from Caterpillar to Apple.

In the 2003-2007 cycle, it was the U.S. economy that transmitted weakness to the greater global economy. In the current cycle, it’s exactly the opposite. It is weakness from outside the U.S. that is our greatest economic watch point as we move toward the end of the year. You may remember that in past editions we have mentioned the Atlanta Fed GDPNow model as being quite a good indicator of U.S. macroeconomic tone. For the third quarter, the model recently dropped from 1.7% estimated growth to 0.9%. Why? Weakness in net exports. Is weakness in the non-U.S. global economy the real reason the Fed did not raise interest rates in September?

Interest Rates

As you are fully aware, the Fed again declined to raise interest rates at its meeting last month, making it now 60 Fed meetings in a row since 2009 that the Fed has passed on raising rates. Over the 2009-to-present cycle, the financial markets have responded very positively in post-Fed meeting environments where the Fed has either voted to print money (aka “Quantitative Easing”) or voted to keep short-term interest rates near zero. Not this time. Markets swooned with the again seemingly positive news of no rate increases. Very much something completely different in terms of market behavior in the current cycle. Why?

We need to think about the possibility that investors are now seeing the Fed, and really global central bankers, as to a large degree trapped. Trapped in the web of intended and unintended consequences of their actions. As we have argued for the past year, the Fed’s greatest single risk is being caught at the zero bound (0% interest rates) when the next U.S./global recession hits. With declining global growth evident as of late, this is a heightened concern, and that specific risk is growing. Is this what the markets are worried about?

It’s a very good bet that the Fed is worried about and reacting to the recent economic slowing in China, along with Chinese currency weakness relative to the U.S. dollar. Not only are many large U.S. multi-national companies meaningful exporters to China, but a rising dollar relative to the Chinese renminbi is about the last thing these global behemoths want to see. As the dollar rises, all else being equal, it makes U.S. goods “more expensive” in the global marketplace. A poster child for this problem is Caterpillar. Just a few weeks ago, it reported its 33rd straight month of declining world sales. After releasing that report, it announced that 10,000 employees would be laid off over the next few years.

As we have explained in past writings, if the Fed raises interest rates, it would be the only central bank on Earth to do so. Academically, rising interest rates support a higher currency relative to those countries not raising rates. So the question becomes, if the Fed raises rates will it actually further hurt U.S. economic growth prospects globally by sparking a higher dollar? The folks at Caterpillar may already have the answer.

Finally, we should all be aware that debt burdens globally remain very high. Governments globally have borrowed, and continue to borrow, profusely in the current cycle. U.S. federal debt has more than doubled since 2009, and, again, we will hit yet another U.S. government debt ceiling in December. Do you really think the politicians will actually cap runaway debt growth? We’ll answer as soon as we stop laughing. As interest rates ultimately trend up, so will the continuing interest costs of debt-burdened governments globally. The Fed is more than fully aware of this fact.

In conjunction with all of this wonderful news, as we have addressed in prior writings, another pressing issue is the level of dollar-denominated debt that exists outside of the U.S. As the Fed lowered rates to near zero in 2008, many emerging market countries took advantage of low borrowing costs by borrowing in U.S. dollars. As the dollar now climbs against the respective currencies of these non-dollar entities, their debt burdens grow in absolute terms in tandem with the rise in the dollar. Message being? As the Fed raises rates, it increases the debt burden of all non-U.S. entities that have borrowed in dollars. It is estimated that an additional $7 trillion in new dollar-denominated debt has been borrowed by non-U.S. entities in the last seven years. Fed decisions now affect global borrowers, not just those in the U.S. So did the Fed pass on raising rates in September out of concern for the U.S. economy, or issues specific to global borrowers and the slowing international economies? For investors, has the Fed introduced a heightened level of uncertainty in their decision-making?

Prior to the recent September Fed meeting, Fed members had been leading investors to believe the process of increasing interest rates in the U.S. was to begin. So in one very real sense, the decision to pass left the investment world confused. Investors covet certainty. Hence a bit of financial market turbulence in the aftermath of the decision. Is the Fed worried about the U.S. economy? The global economy? The impact of a rate decision on relative currency values? Is the Fed worried about the emerging economies and their very high level of dollar-denominated debt? Because Fed members never clearly answer any of these questions, they have now left investors confused and concerned.

What this tells us is that, from a behavioral standpoint, the days of expecting a positive Pavlovian financial market response to the supposedly good news of a U.S. Fed refusing to raise interest rates are over. Keeping rates near zero is no longer good enough to support a positive market sentiment. In contrast, a Fed further refusing to raise interest rates is a concern. Let’s face it, there is no easy way out for global central bankers in the aftermath of their unprecedented money printing and interest rate suppression experiment. This, we believe, is exactly what the markets are now trying to discount.

The U.S. Stock Market

We are all fully aware that increased price volatility has characterized the U.S. stock market for the last few months. It should be no surprise as the U.S. equity market had gone close to 4 years without having experienced even a 10% correction, the third-longest period in market history. In one sense, it’s simply time, but we believe the key question for equity investors right now is whether the recent noticeable slowing in global economic trajectory ultimately results in recession. Why is this important? According to the playbook of historical experience, stock market corrections that occur in non-recessionary environments tend to be shorter and less violent than corrections that take place within the context of actual economic recession. Corrections in non-recessionary environments have been on average contained to the 10-20% range. Corrective stock price periods associated with recession have been worse, many associated with 30-40% price declines known as bear markets.

We can see exactly this in the following graph. We are looking at the Dow Jones Global Index. This is a composite of the top 350 companies on planet Earth. If the fortunes of these companies do not represent and reflect the rhythm of the global economy, we do not know what does. The blue bars marked in the chart are the periods covering the last two U.S. recessions, which were accompanied by downturns in major developed economies globally. As we’ve stated many a time, economies globally are more linked than ever before. We live in an interdependent global world. Let’s have a closer look.

If we turn the clock back to late 1997, an emerging markets currency crisis caused a 10%-plus correction in global stock prices but no recession. The markets continued higher after that correction. In late 1998, the blowup at Long Term Capital Management (a hedge fund management firm implosion that caused a $3.6 billion bailout among 16 financial institutions under the supervision of the Fed) really shook the global markets, causing a 20% price correction, but no recession, as the markets continued higher into the early 2000 peak. From the peak of stock prices in early 2000 to the first quarter of 2001, prices corrected just more than 20% but then declined yet another 20% that year as the U.S. did indeed enter recession. The ultimate peak to trough price decline into the 2003 bottom registered 50%, quite the bear market. Again, this correction was accompanied by recession.

[Graph: Dow Jones Global Index, with the last two U.S. recessions marked in blue]

The experience from 2003 to early 2008 is similar. We saw 10% corrections in 2004 and 2006, neither of which were accompanied by recession. The markets continued higher after these two corrective interludes. Late 2007 into the first quarter of 2008 witnessed just shy of a 20% correction, but being accompanied by recession meant the peak-to-trough price decline of 2007-2009 totaled considerably more than 50%.

We again see similar activity in the current environment. In 2010, we saw a 10% correction and no recession. In 2011, we experienced a 20% correction. Scary, but no recession meant higher stock prices were to come.

So we now find ourselves at yet another of these corrective junctures, and the key question remains unanswered. Will this corrective period for stock prices be accompanied by recession? We believe this question needs to be answered from the standpoint of the global economy, not the U.S. economy singularly. For now, the jury is out, but we know evidence of economic slowing outside of the U.S. is gathering force.

As you may be aware, another U.S. quarterly earnings reporting season is upon us. Although the earnings results themselves will be important, what will be most meaningful is guidance regarding 2016, as markets look ahead, not backward. We’ll especially be interested in what the major multinationals have to say about their respective outlooks, as this will be a key factor in assessing where markets may be moving from here.

New Way to Spot Loss in Workers’ Comp

You’ve heard it before: “It’s not the tip of the iceberg that costs you so much; it’s what you can’t see. It’s what’s below the water level that costs you real money.” We hear that the total loss to a company from a workers’ comp claim is six to 10 times the direct value of that claim. But risk managers have neither the right tools to understand and measure that loss, nor the right tools to improve productivity and capture the cash flow that comes from preventing it.

During my initial journey into lean sigma consulting, a seasoned Japanese colleague shared an important concept. While this principle was developed to improve the quality and efficiency of output in manufacturing, it has many other applications, including in improving safety and reducing workers’ comp costs. Understanding and applying the rule has improved the profitability of many companies.

Dr. Genichi Taguchi, a Japanese engineer, theorized (and ultimately proved mathematically) that loss within any process or system grows quadratically, not linearly, as output moves away from the ideal customer specification or target value.

An example of Taguchi’s Loss Curve is shown below:

[Graph: Taguchi’s loss curve, a U-shaped curve with the target value at its minimum and the tolerance limits (LTL and UTL) at its edges]

Another way to look at it is this: Anything delivered off target, toward the limits labeled LTL and UTL in the diagram above, creates an opportunity for outsized financial improvement as we move back toward the center of the U-shaped curve. And the farther away from the target we are, the greater the opportunity.
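
For reference, the textbook form of Taguchi’s loss function is quadratic in the deviation from the target. The statement below uses the conventional symbols, which are not taken from the article itself:

  L(y) = k (y − T)^2,   with k = A / Δ^2

Here y is the delivered value, T is the target, A is the loss incurred when output sits at a tolerance limit, and Δ is the distance from the target to that limit (UTL − T). Double the distance from the target and the loss quadruples, which is why drifting toward the tolerance limits is far more expensive than it looks.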

I explain Taguchi’s principle using an example from a kaizen event that dramatically improved machine setup times within a CNC shop.

For years, our client assumed it took 46 minutes to set up and change over machinery. After all, for 10 years, it did take 46 minutes. But our kaizen team was hired to challenge this thinking.

If the CEO and his team were right, setups couldn’t be completed any faster. But if setup times could be improved, then loss had been occurring beneath the water line all along; the iceberg had been growing, and no one knew.

Machine setup time is loss because no value is produced during the setup process. And setup times can represent 35% of the total labor burden, so there’s a lot at stake. While employers can compute labor and overhead costs easily, when their assumptions about setup times are incorrect, they’re losing big money. But rarely do they know it, or how much.

Here’s our client’s story:

Our client used people and machinery to produce aircraft parts. Machines were not dedicated to product families or cycle times. In other words, the client could build a Mack Truck or Toyota Corolla on the same machinery. And because setup times were slow, the client built large batches of products. When defects struck, they struck in large quantities, and, financially, it was too late to find causes. The costs were already sunk.

Our client borrowed capital to purchase nine machines, leased the appropriate space to house them and paid for electricity, water and cutting fluids as well. Each machine had affiliated tools and dies, and mechanics to service them. In other words, when you own nine machines, you need the gear, people and money required to operate and maintain nine machines. And all of this cost was based on 46-minute setups.

Think about that for a moment.

If the client didn’t need nine machines, it wouldn’t have had to spend all of that money and for all of those years! And a wrong assumption in setup times could be leading to loss that never appeared on any income statement. What would show would be the known labor, materials, machinery and overhead costs. But what wouldn’t show would be what wasn’t needed if the team could complete a setup in less than 46 minutes.

After videotaping, collaborating and measuring cycle times on the existing operations and processes, it was evident: The team had ideas that would challenge the 46-minute setups.

After some 5S housekeeping, the team produced a 23-minute setup. One more day of tweaking, and the team got it down to 16. By the last day, the team was consistently producing 10-minute results.

Now let’s talk about the impact.

In the improved state, the client could indeed produce parts faster. It also needed far less capital, insurance, labor, gear, electricity, fluids, tooling, floor space, etc. And because our client’s customer would now get parts faster, the company would get paid faster.
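
To make the arithmetic concrete, here is a minimal sketch, in Python, of how setup-time reduction translates into recovered machine capacity. Only the 46-minute and 10-minute setup times come from the story above; the shift length and number of changeovers per shift are illustrative assumptions.

  # Hypothetical sketch: how faster setups recover machine capacity.
  # Only the 46- and 10-minute setup times come from the story above;
  # the shift length and changeovers per shift are assumptions.

  SHIFT_MINUTES = 8 * 60          # one 8-hour shift (assumption)
  CHANGEOVERS_PER_SHIFT = 4       # setups per machine per shift (assumption)

  def productive_minutes(setup_minutes):
      """Minutes left for value-adding work after changeovers."""
      return SHIFT_MINUTES - CHANGEOVERS_PER_SHIFT * setup_minutes

  before = productive_minutes(46)   # 480 - 184 = 296 minutes
  after = productive_minutes(10)    # 480 - 40  = 440 minutes

  print(f"Capacity recovered per machine: {after / before - 1:.0%}")   # roughly +49%

Under these assumptions, each machine gains roughly half again as much value-adding time per shift, which is exactly the kind of hidden capacity that lets a shop ask whether it really needs all nine machines.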

While banks may not like these facts, clients and employees do. Employees can do their jobs more efficiently, and the company makes more money while borrowing less.

Here’s an explanation of the 5S tool the team used to make its setup times faster. This tool, when used properly, not only improves operating efficiency but also removes or reduces safety hazards such as tripping, standing, walking, reaching, handling, lifting and searching for lost items.

In addition, the kaizen event itself creates an opportunity for employees to improve their own job conditions and use their curiosity and creativity to solve production-related problems. The event also creates a more engaged employee, one less likely to file future work comp and employment-related claims.

The 5S Process consists of five steps.

  1. Sort the work area out.
  2. Straighten the work area out, putting everything in the right place.
  3. Clean the entire area: scrub floors, create aisle ways with yellow tape, wash walls, paint, etc.
  4. Create standardized, written work processes.
  5. Sustain the process.

Using tools like 5S, I continue to improve my thinking about identifying and managing work comp risks. But during each kaizen event, I also gain perspective on why stakeholders rarely change their ways. What I’ve learned is this: Clients typically need to have one of two conditions met for good change to occur.

  1. They need something to motivate them, which often means facing a crisis.
  2. They need to physically see and experience things to believe them.

If you’re like me, you probably need proof, too. Here it is: A reduction in setup times from over two and a half hours to just over ten minutes.

What the Lean Assessment Does

The lean assessment helps find improvement opportunities. That’s because assessments study and measure cycle times, customer demand, and value-adding and non-value-adding activities. The assessment helps everyone — including the executive team — see how people are physically required to do their work and understand why they are required to do it the way they are.

In the week-long assessment process, we’re no longer studying just the costs of safety; we’re studying all of the potential causes that drive productivity and loss away from the nominal value. Safety is not necessarily why we are measuring outcomes. Safety is the beneficiary of learning how and why the company adds value, and precisely where it creates loss.

That is the power of good change. And good change comes from the power of lean.

“The best approach is to dig out and eliminate problems where they are assumed not to exist.” – Shigeo Shingo

Insurance And Manufacturing: Lessons In Software, Systems, And Supply Chains

Recently, my boss Steve and I were talking about his early career days with one of those Big 8, then Big 6, then Big 5, then Big 4 intergalactic consulting firms. Steve came out of college with an engineering degree, so it was natural to start in the manufacturing industry. Learning about bills of material, routings, design engineering, CAD/CAM … “Ah yes,” he recalled, “Those were heady days.” And all those vendor-packaged manufacturing ERP systems that were starting to take the market by storm.

Eventually Steve found his way into the insurance industry, and thus began our discussion. One of the first things that struck Steve was the lack of standard software packages in the insurance industry. I don’t mean the lack of software vendors — there are plenty of those. Seemingly, though, each software solution was a one-off. Or custom. Or some hybrid combination. “Why?” we wondered.

The reasons, as we now know, were primarily reflected in an overall industry mindset:

  • A “but we are unique!” attitude was pervasive. Companies were convinced that if they all used the same software, there would be little to differentiate themselves from one another.
  • There was also an accepted industrywide, one-off approach. Conversations went something like this: “XYZ is our vendor. We really don’t like them. Taking new versions just about kills us. We don’t know why we even pay for maintenance, but we do.”

But the chief reason for a lack of standard software was the inability to separate product from process. What does this mean?

Well, you can certainly envision that your auto product in Minnesota is handled differently than your homeowners’ product in California. I’m not referring to just the obvious elements (limits, deductibles, rating attributes), but also the steps required for underwriting, renewal, and cancellation. Separation of product from process must go beyond the obvious rate/rule/form variations to also encompass internal business and external compliance process variations.

But there’s still plenty of processing — the heavy lifting of transaction processing — that’s the same and does not vary. For example, out-of-sequence endorsement processing is not something that makes a company unique and therefore would not require a custom solution.

Where the rubber meets the road, and where vendor packages have really improved their architecture over the last several years, is in providing the capability in their policy admin systems for companies to “drop” very specific product information, along with associated variations, into a very generic transaction system.

Once product “components” (digitized) are separated from the insurance processing engine, and once companies have a formal way to define them (standard language), they can truly start making their products “unique” with reuse and mass customization. Much like those manufacturing bills of material and routings looked to Steve way back when.

This separation of policy from product has been a key breakthrough in insurance software. So what is an insurance product, at least in respect to systems automation?

From Muddled To Modeled
The typical scenario to avoid goes something like this:

  • The business people pore over their filings and manuals and say, “This is the product we sell and issue.”
  • The IT people pore over program code and say, “That’s the product we have automated.”
  • The business people write a lot of text in their word processing documents. They find a business analyst to translate it into something more structured, but still text.
  • The business analyst finds a designer to make the leap from business text to IT data structures and object diagrams.
  • The designer then finds a programmer to turn that into code.

One version of the truth? More like two ships passing, and it’s more common than you may think. How can organizations expect success when the product development process is not aligned? Without alignment, how can organizations expect market and compliance responsiveness?

What’s the alternative? It revolves around an insurance “product model.” Much like general, industry-standard data models and object models, a product model uses a precise set of symbols and language to define insurance product rates, rules, and forms — the static or structural parts of an insurance product. In addition, the product model must also define the actions that are allowed to be taken with the policy during the life of the contract — the dynamic or behavioral aspect of the product model. So for example, on a commercial auto product in California, the model will direct the user to attach a particular form (structure) for new business issuance only (actions).
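
To make the structure-versus-behavior distinction concrete, here is a minimal, hypothetical sketch in Python. The class names, fields and form number are invented for illustration; they are not any vendor’s actual product schema.

  # Hypothetical product-model sketch: structural elements (forms) paired
  # with the behavioral rules (allowed policy actions) that govern them.
  # Names, fields and the form number are illustrative only.

  from dataclasses import dataclass, field

  @dataclass
  class FormRule:
      form_number: str        # structural: the form itself
      states: set             # where it applies
      transactions: set       # behavioral: policy actions it attaches on

  @dataclass
  class Product:
      name: str
      forms: list = field(default_factory=list)

      def forms_to_attach(self, state, transaction):
          """Forms the model directs the user to attach for this action."""
          return [f.form_number for f in self.forms
                  if state in f.states and transaction in f.transactions]

  commercial_auto = Product("Commercial Auto",
                            [FormRule("CA 99 99", {"CA"}, {"new_business"})])

  print(commercial_auto.forms_to_attach("CA", "new_business"))  # ['CA 99 99']
  print(commercial_auto.forms_to_attach("CA", "renewal"))       # []

The point of the sketch is that the same generic processing engine can run any product; only the declared structure and behavior change.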

Anyone familiar with object and data modeling knows there are well-defined standards for these all-purpose models. For insurance product modeling, at least currently, such standards are more proprietary, such as IBM’s and Camilion’s models, and of course there are others. It is interesting to note that ACORD now has under its auspices the Product Schema as the result of IBM’s donation of aspects of IAA. Might this lead to more industry standardization?

With product modeling as an enabler, there’s yet another key element to address. Yes, that would be the product modelers — the people responsible for making it work. Product modeling gives us the lexicon or taxonomy to do product development work, but who should perform that work? IT designers with sound business knowledge? Business people with analytical skills? Yes and yes. We must finally drop the history of disconnects where one side of the house fails to understand the other.

With a foundation of product modeling and product modelers in place, we can move to a more agile or lean product life cycle management approach — cross-functional teams versus narrow, specialized skills; ongoing team continuity versus ad hoc departmental members; frequent, incremental product improvements versus slow, infrequent, big product replacements.

It all sounds good, but what about the product source supplier — the bureaus?

Supply Chain: The Kinks In Your Links
Here is where the comparison between insurance and manufacturing takes a sharp turn. In their pursuit of quality and just-in-time delivery, manufacturers can make demands on their supply chain vendors. Insurance companies, on the other hand, are at the mercy of the bureaus. ISO, NCCI, and AAIS all develop rates, rules, and forms, of course. They then deliver these updates to their member subscribers via paper manuals or electronically via text.

From there the fun really begins. Insurance companies must log the info, determine which of their products and territories are impacted, compare the updates to what they already have implemented and filed, conduct marketing and business reviews, and hopefully and eventually, implement at least some of those updates.

Recent studies by Novarica and SMA indicate there are approximately 3,000 to 4,000 changes per year in commercial lines alone. The labor cost to implement just one ISO circular with a form change and a rate change is estimated to be $135,000, with the majority of costs in the analysis and system update steps.

There has got to be a better way …

ISO at least has taken a step in the right direction with the availability of its Electronic Rating Content. In either Excel or XML format, ISO interprets its own content to specify such constructs as premium calculations (e.g., defined order of calculation, rounding rules), form attachment logic (for conditional forms), and stat code assignment logic (to support the full plan).
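
To illustrate what “defined order of calculation” and “rounding rules” mean in practice, here is a minimal, hypothetical sketch in Python. The base rate, factors and calculation order are invented; a real implementation would take these constructs from the bureau content itself.

  # Hypothetical sketch of an ordered premium calculation with a rounding rule.
  # The base rate, factors and order below are invented for illustration;
  # ISO's Electronic Rating Content would supply the real constructs.

  from decimal import Decimal, ROUND_HALF_UP

  def round_to_dollar(amount):
      return amount.quantize(Decimal("1"), rounding=ROUND_HALF_UP)

  def rate_policy(base_rate, territory_factor, increased_limit_factor):
      premium = base_rate
      premium = premium * territory_factor          # step 1: territory
      premium = premium * increased_limit_factor    # step 2: increased limits
      return round_to_dollar(premium)               # step 3: round once, at the end

  print(rate_policy(Decimal("500"), Decimal("1.15"), Decimal("1.65")))  # 949

Whether rounding happens once at the end or after every step is exactly the kind of rule the rating content has to pin down, because the two choices can produce different premiums.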

A step in the right direction, no doubt. But what if ISO used a standard mechanism and format to do this? ACORD now has under its control the ACORD Product Schema. This is part of IBM’s fairly recent IAA donation. It provides us a standard way to represent the insurance product and a standard way to integrate with policy admin systems. What if ISO and the other key providers in the product supply chain started it all off this way?

Dream on, you say? While you may not have the clout to demand that the bureaus change today, you do pay membership fees, and collectively the members have a voice in encouraging ongoing improvements in the insurance “supply chain.”

In the meantime, the goal to be lean and agile with product life cycle management continues. We must respond quickly and cost-effectively to market opportunities, policyholder feedback, and regulatory requirements. That all starts at the product source … but it doesn’t end there. So while the supply chain improves its quality and delivery, insurance companies will need to gain efficiencies throughout every corner of their organizations in order to achieve those lean goals.

In writing this article, David collaborated with his boss Steve Kronsnoble. Steve is a senior manager at Wipfli and an expert in the development, integration, and management of information technology. He has more than 25 years of systems implementation experience with both custom-developed and packaged software using a variety of underlying technologies. Prior to Wipfli, Steve worked for a major insurance company and leverages that experience to better serve his clients.