Let’s be honest: Operational motivations are about speed and efficiency, not security. For manufacturing organizations to effectively manage cyber risk, they first need to understand that the global digital transformation making businesses run smarter and more efficiently is also creating a widening security gap that must be addressed.
Creating Industry 4.0
In manufacturing, investments are largely motivated by the pursuit of increased operational effectiveness and efficiency: doing more for a lower per-unit cost. Often, these investments manifest as new operational technology (OT), for instance to enable higher degrees of automation, accelerated assembly timelines and improved real-time insights. New OT gets added to a large information technology (IT) stack, which has often been built over several decades; in that time, the IT stack has become a complex mix of legacy, aging and modern solutions held together by vulnerable protocols and a “don’t touch what isn’t broken” stability strategy.
Industry 4.0, driven by the pursuit of OT, connects industrial equipment so that centralized operational data can be accessed and analyzed. In essence, this is the next industrial revolution in advanced manufacturing and smart, connected, collaborative factories. This new paradigm is characterized by the physical world becoming a type of information system through sensors and actuators embedded in objects and linked through networks. Beyond having the potential to completely change material and manufacturing processes, Industry 4.0 is expected to contribute to more efficient operations by aggregating data across all facilities, letting companies monitor, measure and improve performance.
This digital transformation introduces new generations of intelligent solutions and integrates them into existing manufacturing processes and technologies, including supervisory control and data acquisition/industrial control systems (SCADA/ICS) and programmable logic controllers (PLCs). In many cases, this collection is controlled by a manufacturing execution system (MES), which is tightly integrated into the manufacturing organization’s ERP system.
Unfortunately, this pursuit of improved operations comes with an unintended consequence: a widening security gap. As manufacturing has become more connected, the threat surface—the collection of points an attacker can use to try to gain access—has increased substantially and now extends from endpoints and networks into cloud services. In fact, the entire manufacturing process (and, by extension, the company that depends on that process running effectively) is more vulnerable to cyberattacks. From opportunistic attacks using commodity malware as a service, to sophisticated hands-on-keyboard attacks that surgically evade defenses, to advanced persistent threats that can operate for years undetected, to industrial espionage using legitimate credentials harvested from phishing campaigns—the list is long, and the consequences can be devastating.
Modern threats can readily bypass legacy antivirus solutions and take advantage of vulnerability windows. Organizations need solutions that can harden endpoints, prevent polymorphic malware and fileless attacks, mitigate malicious code execution and provide investigation and remediation capabilities with dynamic response to security incidents.
As awareness of the growing threat landscape solidifies, tension develops between two core factions: OT and IT. Security was a distant priority when vendors created their new OT solutions; IT, by contrast, understands the security risks and best practices and wants to take the time to do things as safely as possible. OT is under pressure to hit targets and can feel like IT is slowing it down by unnecessarily overstating the risks. On top of that, manufacturers must grapple with systemic vulnerabilities in operating systems and control systems. For instance, many industrial communication standards don’t consider security at all because they are based on the old firewall model of complete trust within the network.
But from the shadows comes a third party: attackers. These bad actors see highly connected, unprotected systems built by vendors that know very little about system security and that are content to pass risk to their customer—the manufacturing organization.
Additionally, the supply chain is vulnerable. As trusted partners, third-party vendors often become the overlooked or unwitting accomplice in criminal activities. A Spiceworks survey of 600 IT and security decision-makers that asked about supply chains highlights this risk.
While the majority of respondents felt confident in their vendors to keep data safe, nearly half (44%) of firms had experienced a significant, business-altering data breach caused by a vendor. Human error and stolen passwords accounted for 26% of the breaches, while malware played a key role in half of the attacks.
While past attacks against major manufacturers and industrial facilities were espionage believed to be sponsored by nation states and based on ideology, many of the latest attacks are the work of cyber criminals motivated purely by profit. Of course, criminals don’t need to shut down a facility to extract payment. In many cases they exfiltrate sensitive information (trade secrets, proprietary data and intellectual property, financial details, private emails, account credentials) and then threaten to release it publicly if a ransom isn’t paid. In some cases, attackers have even weaponized regulations like GDPR, which impose fines when breaches compromise personal information.
As operation and information technologies converge following an almost predictable path of profit-driven natural selection, the leaders of each group have yet to attain a similar level of integration. The operational groups lack the security expertise of their IT counterparts, and IT experts are often excluded from operational decisions, creating an inherent vulnerability that reaches to the top of the organization.
Cybersecurity is not an IT problem to solve; it’s a business risk to manage. Until manufacturers realize that OT and IT are not in competition with each other, they will remain easy prey for cybercriminals who recognize this philosophical flaw and are willing to exploit it.
Deloitte opines that we’ll be lucky to fill 1.5 million of the projected manufacturing job openings, leaving a gap of 2 million unfilled jobs. This potential shortfall didn’t go unnoticed by Daimler Trucks North America (DTNA), a manufacturer of class 5-8 commercial vehicles, school buses, and heavy-duty to mid-range diesel engines. The company saw this bullet coming years ago.
To those in the know, the skilled workforce shortage conundrum isn’t new. As far back as 1990, the National Center on Education and the Economy identified this job shortfall in its report, “The American Workforce – America’s Choice: High Skills or Low Wages,” stating large investments in training were needed to prepare for the slow workforce growth.
If you look at the burgeoning skills gap, coupled with vanishing high school vocational programs, how, as an employer, do you recruit potential candidates?
DTNA addresses the issue by reaching out to high schools throughout the U.S. via the Daimler Educational Outreach Program, which focuses on giving to qualified organizations that support public high school educational programs in STEM (science, technology, engineering and math), CTE (career technical education) and skilled-trades career development.
Daimler also works in concert with school districts to conduct week-long technology schools in one of the manufacturing facilities, all in an effort to encourage students to consider manufacturing (either skilled or technical) as a vocation.
Like all forward-looking companies, Daimler must address the needs of millennials, who, among their other priorities, want to make the world a better place. Jamie Gutfreund, chief strategy officer for the Intelligence Group, notes that 86 million millennials will be in the workplace by 2020, representing 40 percent of the total working population.
To not address the millennials’ employer predilections is to miss an opportunity to tap into a vast resource of potential talent. To that end, Daimler has always emphasized research in renewable resources and community involvement as well as a number of philanthropic endeavors. Not only is it the right thing to do, but it also appeals to the much-needed next generation who will fill the boots of the exiting boomers.
Just because a company manufactures heavy-duty commercial vehicles doesn’t mean it can’t give back to the environment and the community at large. And, in the end, that will help make the world a better place.
A friend of mine asked me if the cyber-risk threat was a bit of flimflam designed to sell more insurance policies. He compared cyber-risk to the Red Scare of the 1950s, when families scrambled to build bomb shelters to protect them from a war that never came. The only ones who got rich back then were the contractors, he concluded.
I found his question incredible. But I realized that he didn’t work in the commerce stream, per se, which quelled my impulse to slap him around.
I shared with him some statistics that sobered him up quickly. I explained that cyber-crime costs the global economy more than $400 billion per year, according to estimates by the Center for Strategic and International Studies. Each year, more than 3,000 companies in the U.S. have their systems compromised by criminals. IBM reports more than 91 million security events per year. Worse yet, the Global Risks 2015 report, published in January by the World Economic Forum (WEF), included this rather stark warning: “90% of companies worldwide recognize they are insufficiently prepared to protect themselves against cyber-attacks.”
Cyber protection is not just about deploying advanced cyber threat technology to manage risk; you also have to educate your employees so they don’t fall victim to deceptively simple scams like “phishing,” the theft of private information via e-mail or text messages. Phishing remains the most popular con for stealing company data because it’s so painfully simple: just pretend to be someone else and hope a few people fall for it.
While most people understand the threat to data privacy for retailers, hospitals and banks and other financial institutions, few realize that manufacturers are also vulnerable in terms of property damage and downtime. In 2014, a steel manufacturing facility in Germany lost control of its blast furnace, causing massive damage to the plant. The cause of the loss was not employee error, but rather a cyber-attack. While property damage resulting from a cyber-attack is rare, the event was a wake-up call for manufacturers worldwide.
According to The Manufacturer newsletter, “the rise of digital manufacturing means many control systems use open or standardized technologies to reduce costs and improve performance, employing direct communications between control and business systems.” This exposes vulnerabilities previously thought to affect only office computers. In essence, according to The Manufacturer, cyber attacks can now come from both inside and outside of the industrial control system network.
Manufacturers also need to be concerned about cyber attacks that would a) interrupt their physical supply chain or b) allow access to their systems via a third-party vendor, and they must take steps to mitigate those risks. When Target and Home Depot were hacked several years ago, it wasn’t a direct attack on them but an attack on one of their third-party vendors. By breaching the vendors’ weak cyber security, the criminals were able to access the larger prize.
To circle back to my friend’s weird fallout-shelter theory, it’s certainly a good idea to have a backup plan in case one is hit by a proverbial “cyber-bomb.” But rather than hunker down and wait for the attack to occur, it’s critical to educate employees, vet vendors’ cyber-security and adopt — and continuously optimize — a formal cybersecurity program.
Although it may not seem like it, in the second quarter of this year the U.S. economy passed into the beginning of its seventh year of expansion. In the 158 years that the National Bureau of Economic Research (the arbiters of “official” U.S. economic cycles) has been keeping records, ours is now the fifth-longest economic cycle, at 75 months. For fun, when did the longest cycles occur, and what circumstances characterized them? Is there anything we can learn from historical perspective about what may lie ahead for the current cycle?
The first cycle longer than the current, by only five months, is the 1938-1945 U.S. economic expansion cycle. Of course, this was the immediate post-Depression recovery cycle. What preceded this cycle, from 1933-1937, was the bulk of FDR’s New Deal spending program, a program that certainly rebuilt confidence and paved the way for a U.S. manufacturing boom as war on European and Japanese lands destroyed their respective manufacturing capabilities for a time. More than anything, the war-related destruction of the industrial base of Japan and Europe was the growth accelerant of the post-Depression U.S. economy.
In historically sequential order, the U.S. economy grew for 106 months between 1961 and 1970. What two occurrences surrounded this economic expansion that are unique in the clarity of hindsight? A quick diversion: in 1946, the first bank credit card, called the “Charge-It” card, was issued by the Bank of Brooklyn. Much like American Express today, the balance needed to be paid in full monthly. We saw the same thing when the Diners Club Card became popular in the 1950s. But in 1958, both American Express and Bank of America issued credit cards to their customers broadly, and we witnessed the beginning of the modern-day credit culture in the U.S. economic and financial system. A support to the follow-on 1961-1970 economic expansion? Without question.
Once again in the 1960s, the influence of a major war on the U.S. economy was also apparent. Lyndon Johnson’s “guns and butter” program increased federal spending meaningfully, elongating the U.S. expansion of the time.
The remaining two extended historical U.S. economic cycles of magnitude (1982-1990, at 92 months, and 1991-2001, at 120 months) both occurred under the longest bull market cycle for bonds in our lifetime. Of course, a bull market for bonds means interest rates are declining. In November 1982, the 10-year Treasury sported a yield of 10.5%. By November 2001, that number was 4.3%. Declining interest rates from the early 1980s to the present constitute one of the greatest bond bull markets in U.S. history. The “credit cycle” spawned by two decades of continually lower interest rates very much underpinned these elongated growth cycles. The question being, at the generational lows in interest rates that we now see, will this bull run be repeated?
So fast-forward to today. What has been present in the current cycle that is anomalous? Pretty simple. Never in any U.S. economic cycle has federal debt doubled, but it has in the current cycle. Never before the last seven years has the Federal Reserve “printed” more than $3.5 trillion and injected it into U.S. financial markets. Collectively, the U.S. economy and financial markets were treated to more than $11 trillion of additional stimulus, a number that totals more than 70% of current annual U.S. GDP. No wonder the current economic cycle is pushing historical extremes in terms of longevity. But what lies ahead?
As we know, the U.S. Fed has stopped printing money. Maybe not so coincidentally, in recent months macroeconomic indicators have softened noticeably. This is happening across the globe, not just in the U.S. As we look forward, what we believe most important to U.S. economic outcomes is what happens outside of the U.S. proper.
Specifically, China is a key watch point. It is the second-largest economy in the world and is undergoing not only economic slowing, but the very beginning of the free floating of its currency, as we discussed last month. This is causing the relative value of its currency to decline against global currencies. This means China can “buy less” of what the global economy has to sell. For the emerging market countries, China is their largest trading partner. If China slows, they slow. The largest export market for Europe is not the U.S., it’s China. As China slows, the Euro economy will feel it. For the U.S., China is also important in being an end market for many companies, crossing industries from Caterpillar to Apple.
In the 2003-2007 cycle, it was the U.S. economy that transmitted weakness to the greater global economy. In the current cycle, it’s exactly the opposite: weakness from outside the U.S. is our greatest economic watch point as we move toward the end of the year. You may remember that in past editions we have mentioned the Atlanta Fed’s GDPNow model as a quite good indicator of macroeconomic U.S. tone. For the third quarter, the model recently dropped from 1.7% estimated growth to 0.9%. Why? Weakness in net exports. Is weakness in the non-U.S. global economy the real reason the Fed did not raise interest rates in September?
As you are fully aware, the Fed again declined to raise interest rates at its meeting last month, making it now 60 Fed meetings in a row since 2009 that the Fed has passed on raising rates. Over the 2009-to-present cycle, the financial markets have responded very positively in post-Fed-meeting environments where the Fed has either voted to print money (aka “quantitative easing”) or voted to keep short-term interest rates near zero. Not this time: markets swooned despite the seemingly positive news of no rate increase, something completely different in terms of market behavior in the current cycle. Why?
We need to think about the possibility that investors are now seeing the Fed, and really global central bankers, as to a large degree trapped. Trapped in the web of intended and unintended consequences of their actions. As we have argued for the past year, the Fed’s greatest single risk is being caught at the zero bound (0% interest rates) when the next U.S./global recession hits. With declining global growth evident as of late, this is a heightened concern, and that specific risk is growing. Is this what the markets are worried about?
It’s a very good bet that the Fed is worried about and reacting to the recent economic slowing in China along with Chinese currency weakness relative to the U.S. dollar. Not only are many large U.S. multi-national companies meaningful exporters to China, but a rising dollar relative to the Chinese renminbi is about the last thing these global behemoths want to see. As the dollar rises, all else being equal, it makes U.S. goods “more expensive” in the global marketplace. A poster child for this problem is Caterpillar. Just a few weeks ago, it reported its 33rd straight month of declining world sales. After releasing that report, it announced that 10,000 employees would be laid off over the next few years.
As we have explained in past writings, if the Fed raises interest rates, it would be the only central bank on Earth to do so. Academically, rising interest rates support a higher currency relative to those countries not raising rates. So the question becomes, if the Fed raises rates will it actually further hurt U.S. economic growth prospects globally by sparking a higher dollar? The folks at Caterpillar may already have the answer.
Finally, we should all be aware that debt burdens globally remain very high. Governments globally have borrowed, and continue to borrow, profusely in the current cycle. U.S. federal debt has more than doubled since 2009, and, again, we will hit yet a U.S. government debt ceiling in December. Do you really think the politicians will actually cap runaway debt growth? We’ll answer as soon as we stop laughing. As interest rates ultimately trend up, so will the continuing interest costs of debt-burdened governments globally. The Fed is more than fully aware of this fact.
In conjunction with all of this wonderful news, as we have addressed in prior writings, another pressing issue is the level of dollar-denominated debt that exists outside of the U.S. As the Fed lowered rates to near zero in 2008, many emerging market countries took advantage of low borrowing costs by borrowing in U.S. dollars. As the dollar now climbs against the respective currencies of these non-dollar entities, their debt burdens grow in absolute terms in tandem with the rise in the dollar. The message? As the Fed raises rates, it increases the debt burden of all non-U.S. entities that have borrowed in dollars. It is estimated that an additional $7 trillion in new dollar-denominated debt has been borrowed by non-U.S. entities in the last seven years. Fed decisions now affect global borrowers, not just those in the U.S. So did the Fed pass on raising rates in September out of concern for the U.S. economy, or over issues specific to global borrowers and the slowing international economies? For investors, has the Fed introduced a heightened level of uncertainty into its decision-making?
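The mechanics of that squeeze are simple enough to sketch in a few lines. The figures below are purely hypothetical (a borrower with $100 million of dollar debt and an invented exchange-rate move), not drawn from the article:

```python
def local_debt_burden(debt_usd: float, fx_rate: float) -> float:
    """Local-currency value of dollar-denominated debt, where fx_rate
    is the number of local currency units per U.S. dollar."""
    return debt_usd * fx_rate

# Hypothetical emerging-market borrower with $100M of dollar debt.
debt_usd = 100_000_000

before = local_debt_burden(debt_usd, 3.00)  # local units per USD
after = local_debt_burden(debt_usd, 3.75)   # after a 25% dollar rise

# The borrower still owes the same $100M, but servicing it now costs
# 25% more in the currency it actually earns revenue in.
print(f"Burden before: {before:,.0f}  after: {after:,.0f} "
      f"({after / before - 1:.0%} increase)")
```

Nothing about the loan itself changed; the dollar's appreciation alone grew the burden, which is why Fed policy now reaches borrowers far outside the U.S.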
Prior to the recent September Fed meeting, Fed members had been leading investors to believe the process of increasing interest rates in the U.S. was to begin. So in one very real sense, the decision to pass left the investment world confused. Investors covet certainty. Hence a bit of financial market turbulence in the aftermath of the decision. Is the Fed worried about the U.S. economy? The global economy? The impact of a rate decision on relative currency values? Is the Fed worried about the emerging economies and their very high level of dollar-denominated debt? Because Fed members never clearly answer any of these questions, they have now left investors confused and concerned.
What this tells us is that, from a behavioral standpoint, the days of expecting a positive Pavlovian financial market response to the supposedly good news of a U.S. Fed refusing to raise interest rates are over. Keeping rates near zero is no longer good enough to support a positive market sentiment. In contrast, a Fed further refusing to raise interest rates is a concern. Let’s face it, there is no easy way out for global central bankers in the aftermath of their unprecedented money printing and interest rate suppression experiment. This, we believe, is exactly what the markets are now trying to discount.
The U.S. Stock Market
We are all fully aware that increased price volatility has characterized the U.S. stock market for the last few months. It should be no surprise as the U.S. equity market had gone close to 4 years without having experienced even a 10% correction, the third-longest period in market history. In one sense, it’s simply time, but we believe the key question for equity investors right now is whether the recent noticeable slowing in global economic trajectory ultimately results in recession. Why is this important? According to the playbook of historical experience, stock market corrections that occur in non-recessionary environments tend to be shorter and less violent than corrections that take place within the context of actual economic recession. Corrections in non-recessionary environments have been on average contained to the 10-20% range. Corrective stock price periods associated with recession have been worse, many associated with 30-40% price declines known as bear markets.
We can see exactly this in the following graph. We are looking at the Dow Jones Global Index. This is a composite of the top 350 companies on planet Earth. If the fortunes of these companies do not represent and reflect the rhythm of the global economy, we do not know what does. The blue bars marked in the chart are the periods covering the last two U.S. recessions, which were accompanied by downturns in major developed economies globally. As we’ve stated many a time, economies globally are more linked than ever before. We live in an interdependent global world. Let’s have a closer look.
If we turn the clock back to late 1997, an emerging markets currency crisis caused a 10%-plus correction in global stock prices but no recession. The markets continued higher after that correction. In late 1998, the blowup at Long Term Capital Management (a hedge fund management firm implosion that caused a $3.6 billion bailout among 16 financial institutions under the supervision of the Fed) really shook the global markets, causing a 20% price correction, but no recession, as the markets continued higher into the early 2000 peak. From the peak of stock prices in early 2000 to the first quarter of 2001, prices corrected just more than 20% but then declined yet another 20% that year as the U.S. did indeed enter recession. The ultimate peak to trough price decline into the 2003 bottom registered 50%, quite the bear market. Again, this correction was accompanied by recession.
The experience from 2003 to early 2008 is similar. We saw 10% corrections in 2004 and 2006, neither of which were accompanied by recession. The markets continued higher after these two corrective interludes. Late 2007 into the first quarter of 2008 witnessed just shy of a 20% correction, but being accompanied by recession meant the peak-to-trough price decline of 2007-2009 totaled considerably more than 50%.
We again see similar activity in the current environment. In 2010, we saw a 10% correction and no recession. In 2011, we experienced a 20% correction. Scary, but no recession meant higher stock prices were to come.
So we now find ourselves at yet another of these corrective junctures, and the key question remains unanswered. Will this corrective period for stock prices be accompanied by recession? We believe this question needs to be answered from the standpoint of the global economy, not the U.S. economy singularly. For now, the jury is out, but we know evidence of economic slowing outside of the U.S. is gathering force.
As you may be aware, another U.S. quarterly earnings reporting season is upon us. Although the earnings results themselves will be important, what will be most meaningful is guidance regarding 2016, as markets look ahead, not backward. We’ll especially be interested in what the major multinationals have to say about their respective outlooks, as this will be a key factor in assessing where markets may be moving from here.
You’ve heard it before: “It’s not the tip of the iceberg that costs you so much; it’s what you can’t see. It’s what’s below the water level that costs you real money.” We hear that the total loss to a company from a workers’ comp claim is six to 10 times the claim’s direct value. But risk managers have neither the right tools to understand and measure the loss, nor the right tools to improve productivity and capture the cash flow that comes from preventing that loss.
During my initial journey into lean sigma consulting, a seasoned Japanese colleague shared an important concept. While this principle was developed to improve the quality and efficiency of output in manufacturing, it has many other applications, including in improving safety and reducing workers’ comp costs. Understanding and applying the rule has improved the profitability of many companies.
Dr. Genichi Taguchi, a Japanese engineer, theorized (and ultimately proved mathematically) that loss within any process or system grows quadratically, not linearly, as output moves away from the ideal customer specification or target value.
An example of Taguchi’s Loss Curve is shown below:
Another way to look at it is this: anything delivered just outside the target (labeled as LTL and UTL in the diagram above) creates an opportunity for outsized financial improvement as we move toward the center of the U-shaped curve. And the farther away from the target we are, the greater the opportunity.
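Taguchi's loss function is usually written L(y) = k(y − T)², where T is the target value and k converts squared deviation into cost. A minimal sketch, using an entirely hypothetical target and cost constant, shows how loss balloons as output drifts from the target:

```python
def taguchi_loss(y: float, target: float, k: float) -> float:
    """Taguchi quadratic loss L(y) = k * (y - target)**2: cost grows
    with the square of the deviation from the target, not linearly."""
    return k * (y - target) ** 2

# Hypothetical numbers for illustration: a target of 10 units and a
# cost constant of $2 per squared unit of deviation.
for y in (10, 12, 20, 46):
    print(f"output {y:>2} -> loss ${taguchi_loss(y, 10, 2):,.0f}")
```

Doubling the deviation quadruples the loss, which is why small, steady moves toward the center of the curve pay off disproportionately.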
I explain Taguchi’s principle using an example from a kaizen event that dramatically improved machine setup times within a CNC shop.
For years, our client assumed it took 46 minutes to set up and change over machinery. After all, for 10 years, it did take 46 minutes. But our kaizen team was hired to challenge this thinking.
If the CEO and his team were right, setup times couldn’t be completed any faster. But if setup times could be better, loss had been occurring beneath the water line, which meant the iceberg was growing, but no one knew.
Machine setup time is loss because no value is produced during the setup process. And setup times can represent 35% of the total labor burden, so there’s a lot at stake. Employers can compute labor and overhead costs easily, but when their assumptions about setup times are incorrect, they’re losing big money. And they rarely know it, or how much.
Here’s our client’s story:
Our client used people and machinery to produce aircraft parts. Machines were not dedicated to product families or cycle times. In other words, the client could build a Mack Truck or Toyota Corolla on the same machinery. And because setup times were slow, the client built large batches of products. When defects struck, they struck in large quantities, and, financially, it was too late to find causes. The costs were already sunk.
Our client borrowed capital to purchase nine machines, leased the appropriate space to house them and paid for electricity, water and cutting fluids as well. Each machine had its affiliated tooling and dies, and mechanics to service them. In other words, when you own nine machines, you need the gear, people and money required to operate and maintain nine machines. And all of this cost was based on 46-minute setups.
Think about that for a moment.
If the client didn’t need nine machines, it wouldn’t have had to spend all of that money for all of those years! And a wrong assumption about setup times could be leading to loss that never appeared on any income statement. What would show would be the known labor, materials, machinery and overhead costs. What wouldn’t show would be everything that wasn’t needed once the team could complete a setup in less than 46 minutes.
After videotaping, collaborating and measuring cycle times on the existing operations and processes, it was evident: The team had ideas that would challenge the 46-minute setups.
After some 5S housekeeping, the team produced a 23-minute setup. One more day of tweaking, and the team got it down to 16 minutes. By the last day, the team was consistently producing 10-minute results.
Now let’s talk about the impact.
Under the better state, the client could indeed produce parts faster. It also needed far less capital, insurance, labor, gear, electricity, fluids, tooling, floor space, etc. And because our client’s customer would now get parts faster, the company would get paid faster.
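A quick sketch makes the capacity arithmetic concrete. The 46- and 10-minute setup times and the nine machines come from the story above; the six setups per machine per day and 250 working days per year are invented for illustration:

```python
# Back-of-the-envelope capacity math for the setup-time reduction.
# Assumed figures (not from the case study): six setups per machine
# per day and 250 working days per year.
MACHINES = 9
SETUPS_PER_DAY = 6
DAYS_PER_YEAR = 250

def annual_setup_hours(minutes_per_setup: float) -> float:
    """Hours per year the whole shop spends in non-value-adding setup."""
    return MACHINES * SETUPS_PER_DAY * DAYS_PER_YEAR * minutes_per_setup / 60

before = annual_setup_hours(46)
after = annual_setup_hours(10)
recovered = before - after

print(f"Setup hours/year at 46 min: {before:,.0f}")
print(f"Setup hours/year at 10 min: {after:,.0f}")
print(f"Machine capacity recovered: {recovered:,.0f} hours/year "
      f"({recovered / before:.0%})")
```

Under these assumptions, roughly three-quarters of the shop's setup time becomes productive machine capacity, which is exactly the loss that never showed up on the income statement.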
While banks may not like these facts, clients and employees do. Employees can do their jobs more efficiently, and the company makes more money while borrowing less.
Here’s an explanation of the 5S tool the team used to make their setup times faster. This tool, when used properly, not only improves operating efficiency but also removes or reduces safety hazards such as tripping, standing, walking, reaching, handling, lifting and searching for lost items.
In addition, the kaizen event itself creates an opportunity for employees to improve their own job conditions and use their curiosity and creativity to solve production-related problems. The event also creates a more engaged employee, one less likely to file future work comp and employment-related claims.
The 5S Process consists of five steps.
Sort: clear the work area of anything that isn’t needed.
Straighten: put everything that remains in its right place.
Shine: clean the entire area; scrub floors, create aisle ways with yellow tape, wash walls, paint, etc.
Standardize: create standardized, written work processes.
Sustain: keep the process going over time.
Using tools like 5S, I continue to improve my thinking about identifying and managing workers’ comp risks. But during each kaizen event, I also gain perspective on why stakeholders rarely change their ways. What I’ve learned is this: clients typically need one of two conditions met for good change to occur.
They need to have something to motivate them––which often means facing a crisis.
They need to physically see and experience things to believe them.
The lean assessment helps find improvement opportunities because assessments study and measure cycle times, customer demand, and value-adding and non-value-adding activities. The assessment helps everyone, including the executive team, see how people are physically required to do their work and understand why they are required to do it the way they are.
In the week-long assessment process, we’re no longer studying just the costs of safety; we’re studying all of the potential causes that drive productivity and loss away from the nominal value. Safety is not necessarily why we measure outcomes. Safety is the beneficiary of learning how and why the company adds value, and precisely where it creates loss.
That is the power of good change. And good change comes from the power of lean.
“The best approach is to dig out and eliminate problems where they are assumed not to exist.” – Shigeo Shingo