Do you notice the built-in contradiction in the title of this article? A naïve approach to Continuous Improvement (CI) would be to benchmark Toyota and just do what they do. That would be fine if: 1) you make automobiles, and 2) you have fifty years of Toyota management experience. Without those advantages, you would do best to consider a modified strategy.
It’s not easy to change, and usually not fruitful to change precipitously. An anecdote from an unnamed country, several wars ago: the army decided to mechanize the cavalry. An inspector, reviewing the prototype of the new battalion, marveled at the shiny new personnel carriers and tanks. However, he then noticed two soldiers standing off to the side, apparently doing nothing. When he inquired, he was told: “They are there to hold the horses.”
What I’ll do in this article is extract (cherry-pick) from fairly recent Quality Continuous Improvement articles in which the authors offer opinions on what to do, and how, to maximize the benefits of CI knowledge and activities. It will be difficult to reduce the fifty or so pages of the four selected articles to fit the scope of this piece, so I’ll name the sources and dates for the reader’s further reference.
As an ongoing process, CI cannot be reduced to a moment in time, but is a potentially potent source of knowledge and especially, action.
One of the first articles that appeared in a text search for “continuous improvement” in the Harvard Business Review’s on-line library, [Richard Nolan and F. Warren McFarlan, “Information Technology and the Board of Directors”, (HBR, Oct 2005)], introduced the idea that not all environments can use the same improvement processes. Figure 1 shows the differentiation they found:
Modes | Defensive | Offensive |
 | Factory (Low cost, High failure impact) | Strategic (High cost, High impact) |
 | Support (Low cost, Low impact) | Turnaround (High cost, Low impact) |
 | (less) IT Technical Needs (more) | |
Figure 1. Environments for Continuous Improvement.
The left side of the grid can function with less technology but generally can’t tolerate outages. The right side is less vulnerable to short-term outages, but requires high consistency for improvement in development. Each quadrant has its special needs.
The next article, [“The Hidden Traps in Decision Making”, J. S. Hammond, R. L. Keeney, and H. Raiffa, (HBR, Jan 2006)], looks at impediments to continuous-improvement thinking and action, where residues of irrational behavior are projected onto the improvement possibilities. The six traps cited in the article are: “Anchoring,” starting from an initially interjected number; “Status Quo,” avoiding any change; “Sunk Costs,” defending past numbers; “Confirming Evidence,” noting that others are doing it; “Framing,” how the facts are presented; and “Estimating and Forecasting,” which places too much emphasis on a single outcome. All of these are elements that cloud thinking. Also noted were “Overconfidence” and its opposite, excessive “Prudence.” While all of these are potential traps in every sphere of action, the authors’ solution for good decision-making is active “Awareness”: the review of the hidden assumptions in one’s decisions. It seems that CI depends as much on avoiding pitfalls as on doing the right thing.
Perhaps the most useful find in this search for CI wisdom is the article, [“Learning to Lead at Toyota,” S. Spear, (HBR, May 2004)]. This article emphasizes learning and metrics. It asks managers to compare preplanned expectations against ongoing performance through constant experimentation. The Toyota way focuses on the “overburden” placed on the worker instead of looking for “waste”; that is, it deals with the person and the team by making work simpler. Most of the examples are from the production floor rather than office work, but the key principle is “direct observation.” The goal isn’t to make the manager more productive, but to make the workers more efficient in their actions. A team leader spends 20% of his time on innovation; a manager, 70%. Managers coach; they don’t fix. The key to mastering the Toyota way is to apply principles, not tools.
The oldest article I included from the search was [“Building a Learning Organization”, D. A. Garvin, (HBR, Nov/Dec 1993)]. The article picks up on Peter Senge’s 1990 book, “The Fifth Discipline”, and looks at five skill areas: systems thinking, personal mastery, mental models, shared vision, and team learning. It is interesting that the recommendations are already in wide use in PM; for example, in the sub-section “Meaning, Management, and Measurement,” the vital importance of metrics in the process of creating, acquiring, and transferring knowledge is emphasized as the way to modify behavior and reach consensus. He lists tools grouped by stages of project development. See Figure 2, below:
Steps | Tools |
Generate Ideas | Brainstorming, Interviews, Surveys |
Reach Consensus | List Reduction, Rating, Weighted Voting |
Display Data | Cause and Effect, Force-Field |
Plan Action | Gantt and Flowcharts |
Figure 2. Tools Appropriate to Stages of Project Management Activity
I found it interesting that the article above provided early “Thought Leadership” (which I noted in an earlier article *): its five activities, 1) Systematic Problem Solving, 2) Experiments, 3) Lessons Learned, 4) Information Transfer, and 5) Knowledge Transfer, each supported through metrics, have permeated our practices, although I didn’t read the article in 1993.
But with hindsight, it’s gratifying that this knowledge does finally trickle down into our practices.