Copyright © 2010, Emerald Group Publishing Limited
Strategy in the media
Article Type: CEO advisory From: Strategy & Leadership, Volume 38, Issue 6
Using triangulation to overcome modeling myopia
Some of us supplement whatever heuristics and lessons we have learned from our own experiences with statistical or modeling capabilities. These have had formidable success in the worlds of design and engineering, but as decision aids they often create as many problems as they solve. Those of us who are trained in rational decision-making models are familiar with the Law of Large Numbers, which is at the heart of modern organizations’ reliance on statistical analysis. An offshoot of the Law of Large Numbers is the Central Limit Theorem. It specifies that, given enough observations, the averages of many distributions approach what we call a “normal” distribution, the familiar bell-shaped curve beloved of college statistics professors. Get enough observations, and the differences among them fade away, yielding the curve for the population as a whole. This insight leads many to build systems whose premise is that normal distributions will yield predictable results.
The problem, of course, is that many phenomena do not conform to the Central Limit Theorem. People constantly forget that the percentages in the bell curve apply to the whole population, and are therefore not good predictors of individual outcomes. In many cases, there are not enough observations for the bell curve to take shape. In still others, distributions are not bell-shaped, but take some other form. And finally, for many situations, our interest is in the outliers, not in the central tendencies … Why did the models underlying financial meltdowns, from the Long Term Capital fiasco to today’s financial crisis, not prevent disaster? Because reality acted in ways that were beyond the parameters of the financial models used to try to understand them …
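The gap between bell-curve assumptions and heavy-tailed reality can be seen in a short simulation (a hypothetical sketch, not from the article: the distribution choices and sample sizes are illustrative assumptions). Sample means from a normal distribution settle down quickly, as the Central Limit Theorem promises; sample means from a heavy-tailed distribution stay erratic because a single outlier can dominate an entire sample:

```python
import random
import statistics

random.seed(42)

def sample_mean(draw, n):
    """Mean of n independent draws from the given sampler."""
    return statistics.fmean(draw() for _ in range(n))

# A well-behaved distribution: sample means cluster tightly (the CLT at work).
normal_means = [sample_mean(lambda: random.gauss(0, 1), 100) for _ in range(1000)]

# A heavy-tailed distribution (Pareto with shape just above 1): the mean
# exists, but a single extreme draw can swamp a whole sample of 100 -- the
# outliers the authors warn about.
heavy_means = [sample_mean(lambda: random.paretovariate(1.1), 100) for _ in range(1000)]

print("normal: spread of sample means      =", statistics.stdev(normal_means))
print("heavy-tailed: spread of sample means =", statistics.stdev(heavy_means))
```

The second spread comes out far larger than the first: averaging 100 observations tames the normal data but not the heavy-tailed data, which is why models premised on bell curves failed when reality produced the extremes behind Long Term Capital and the 2008 crisis.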
Triangulation is another simple technique that researchers sometimes rely on, and it can be helpful to organizations in managing complexity. Triangulation means that rather than modeling data from just one source, you triangulate – get information from different sources and see if it leads you to the same conclusion. Triangulation is not simply a matter of gathering more data but of gathering different types of data. It implies looking at an object from two or more perspectives. Organizational theorist William Starbuck emphasizes that triangulation is especially useful when the perspectives are based on different levels of analysis, and use data aggregated at different levels – for example, data gathered from individual employees talking to individual customers, as well as aggregated sales figures of individual stores or stores in a whole region gathered over a three-month period.
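Starbuck’s point about combining levels of analysis can be sketched in a few lines (a toy illustration with invented data and cutoffs; the technique, not the numbers, is the point). Two independent sources — individual employees’ reports of customer conversations and store-level sales aggregated by region — are only treated as a finding when they point to the same conclusion:

```python
# Hypothetical data: fraction of frontline staff in each region reporting
# unhappy customers, and quarterly sales change aggregated from store level.
employee_reports = {"north": 0.62, "south": 0.21}
quarterly_sales_change = {"north": -0.08, "south": 0.05}

def triangulate(reports, sales, report_cutoff=0.5, sales_cutoff=0.0):
    """Flag a region only when both independent sources agree it is in trouble."""
    return sorted(
        region for region in reports
        if reports[region] > report_cutoff and sales[region] < sales_cutoff
    )

print(triangulate(employee_reports, quarterly_sales_change))  # ['north']
```

Here the individual-level and aggregate-level data agree about the north region, so the conclusion is more trustworthy than either source alone; a region flagged by only one source would prompt further digging rather than action.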
Gökçe Sargut and Rita Gunther McGrath, “Managing under complexity: where is Einstein when you really need him”, Ivey Business Journal, May/June 2010.
Innovation inspired by nature
One of IDEO’s ten faces of innovation is the anthropologist: the one who observes human behaviors and actions to discover wasted effort that could be turned into an innovation challenge. In the past decade, an eleventh face has been quietly but steadily rising to prominence on the innovation team: the biomimetist, who observes animal and plant characteristics to discover supreme efficiency that could be turned into an innovation breakthrough.
The kingfisher’s beak isn’t just a fashionable accessory that the bird has picked off the shelves of nature’s supermarket. It is the result of millions of years of evolution and natural selection. The biomimetist starts from the humble assumption that, even if it is not obvious at first, there may be a good reason why nature has designed animals and plants as we see them. The kingfisher’s beak turns out to be supremely efficient at crossing the air-water interface with the minimum amount of turbulence, thus making the bird more successful at catching fish by surprise.
It was the source of inspiration for the design of the Shinkansen, the Japanese bullet train. Obviously the train does not dive into water, but it has many tunnels to pass through. Tunnels tend to create an air-air interface between the inside and the outside, which, when crossed, generates turbulence and noise. The efficiency of the design has enabled engineers to create a train that is the quietest of its kind.
Likewise, attentive and questioning observation of the lotus leaf inspired glass manufacturer Saint-Gobain. How does the lotus leaf manage to remain clean in a muddy environment? Electronic microscope observation of the surface of the leaf revealed a hydrophobic nano-structure on which mud does not stick. It then lets the lotus maximize its exposure to sunlight for photosynthesis.
This surface structure that enables the plant to yield maximum benefits from natural resources, namely the sun and the rain, became the inspiration for Saint-Gobain’s design of a self-cleaning glass. The company designed a hydrophobic surface structure on which dirt that is decomposed by sunlight is washed away by the rain.
Observing nature, humbly, questioningly, letting it fill us with wonder, can be a fantastic source of inspiration. Not only for its beauty, but also its efficiency.
Yann Cramer “The rise of the biomimetist”, Blogging Innovation, July 30, 2010, www.business-strategy-innovation.com/wordpress/2010/07/innovation-by-observation
CEO skills and company evolution
Q: You say companies need different breeds of leaders at different stages. How are CEOs like dogs?
RK: I call the first CEO the retriever – the leader who has to go out and assemble the resources. They have to go out and find the people, the money and the partners. That person is a really great salesperson – they have to sell the vision every day. They’re asking people to believe in something that doesn’t exist and take a substantial leap of faith.
The next is the bloodhound CEO. You’ve got to find out where that value proposition is going to find paydirt so you can actually build a business around it. You’ve got something now, but how do you optimize it? You’ve got to sleuth that out.
The husky is the next one. Now you’ve got a product, a value proposition, and you’ve figured out your business model. Now you’ve got to pull this sled as it gets heavier with people, products and customers up a hill, which is essentially the hill of building a big successful business.
The one dog you never really want pulling your company is the St Bernard.
Q: The rescue dog.
RK: Right. Because at that point you know you’ve got big trouble.
Q: Even a great leader, if the wrong breed at the wrong time, can be a mismatch?
RK: Absolutely. There are different talents in the creation of businesses and the running of businesses that need to be taken into consideration. A mistake often made in the venture investment business is rushing to bring a big CEO into what is still a small venture. The mismatch of skills is severe. The big CEO needs resources, needs a strong sense of direction and momentum, and is not very effective day-to-day with a bunch of people putting bits and bytes together. The other mismatch that’s harder to foresee is the small company with momentum. You say, great, let’s bring in the guy who can grow it to $100 million and take it public. The problem is that you may face yet another significant right or left turn in your business, which that CEO may be completely unqualified to make.
“What breed is your CEO? Interview with Randy Komisar of Kleiner Perkins Caufield & Byers”, Fast Company, July 27, 2010.
Skills matter; does education?
Eduzealots have done a truly awful thing to serious human capital conversations and analyses around employment. By vociferously championing higher education as key to economic success, they’ve distorted important public policy debates about how and why people get hired and paid well. They’ve undermined useful arguments about “street smarts” versus “book smarts.” Treating education as the best proxy for human capital is like using patents as your proxy for measuring innovation – its underlying logic shouldn’t obscure the fact that you’ll underweight market leaders like WalMart, Google, Tata and Toyota. Dare I point out that Microsoft’s Bill Gates, Dell’s Michael Dell, Apple’s Steve Jobs, Oracle’s Larry Ellison and Facebook’s Mark Zuckerberg are all college drop-outs? The point isn’t to declare a college degree antithetical to launching a high-tech juggernaut but to observe that, perhaps, higher education isn’t essential to effective entrepreneurship.
We have a huge branding issue. Pundits and policy-makers jabber about the need to educate people to compete in knowledge-intensive industries. But knowledge doesn’t represent even half the intensity of this industrial challenge. What really matters are skills. The grievously undervalued human capital issue here isn’t quality of education in school but quality of skills in markets. Establishing correlations, let alone causality, between them is hard. (Michael Polanyi’s classic Personal Knowledge brilliantly articulates this.) A computer science PhD doesn’t make one a good programmer. There is a world of difference between getting an “A” in robotics class and winning a “bot” competition. MIT’s motto isn’t Mens et Manus (Latin for Mind and Hand) by accident. Great knowledge is not the same as great skill. Worse yet, decent knowledge doesn’t guarantee even decent skills. Unfortunately, educrats and eduzealots behave as if college English degrees mean their recipients can write and that philosophy degrees mean their holders can rigorously think. That’s not true …
Academic and classroom markets are profoundly different than business and workplace markets. Why should anyone be surprised that serious knowledge/skill gaps dominate those differences?
Higher education institutions do decently with knowledge transmission. Unfortunately, they do dismally at transmitting skills.
Michael Schrage, “Higher education is overrated; skills aren’t”, Harvard Business Review blog, July 29, 2010, http://blogs.hbr.org/schrage/2010/07/higher-education-is-highly-ove.html
Why manufacturing matters
There’s more at stake than exported jobs. With some technologies, both scaling and innovation take place overseas.
Such is the case with advanced batteries. It has taken years and many false starts, but finally we are about to witness mass-produced electric cars and trucks. They all rely on lithium-ion batteries. What microprocessors are to computing, batteries are to electric vehicles. Unlike with microprocessors, the US share of lithium-ion battery production is tiny.
That’s a problem. A new industry needs an effective ecosystem in which technology knowhow accumulates, experience builds on experience, and close relationships develop between supplier and customer. The US lost its lead in batteries 30 years ago when it stopped making consumer electronics devices. Whoever made batteries then gained the exposure and relationships needed to learn to supply batteries for the more demanding laptop PC market, and after that, for the even more demanding automobile market. US companies did not participate in the first phase and consequently were not in the running for all that followed. I doubt they will ever catch up …
Consider this passage by Princeton University economist Alan S. Blinder: “The TV manufacturing industry really started here, and at one point employed many workers. But as TV sets became ‘just a commodity,’ their production moved offshore to locations with much lower wages. And nowadays the number of television sets manufactured in the US is zero. A failure? No, a success.”
I disagree. Not only did we lose an untold number of jobs, we broke the chain of experience that is so important in technological evolution. As happened with batteries, abandoning today’s “commodity” manufacturing can lock you out of tomorrow’s emerging industry.
Andy Grove, “How America can create jobs”, BusinessWeek, July 1, 2010.
The limits of the Dell model
The “Dell model” became synonymous with efficiency, outsourcing and tight inventories, and was taught at the Harvard Business School and other top-notch management schools as a paragon of business smarts and outthinking the competition.
“Dell, as a company, was the model everyone focused on 10 years ago,” said David B. Yoffie, a professor of international business administration at Harvard. “But when you combine missing a variety of shifts in the industry with management turmoil, it’s hard not to have the shine come off your reputation.”
For the last seven years, the company has been plagued by serious problems, including misreading the desires of its customers, poor customer service, suspect product quality and improper accounting …
The problems affecting the Dell computers stemmed from an industrywide encounter with bad capacitors produced by Asian PC component suppliers. Capacitors are found on computer motherboards, playing a crucial role in the flow of current across the hardware. They are not meant to pop and leak fluid, but that is exactly what was happening earlier this decade, causing computers made by Dell, Hewlett-Packard, Apple and others to break down …
Dell’s supply chain had always stood out as one of its important assets. The company kept costs low by limiting its inventory and squeezing suppliers. If prices for components changed, Dell could react more quickly than its competitors, offering customers the latest parts at the lowest cost.
But the hundreds of Dell internal documents produced in the lawsuit show a company whose supply chain had collapsed as it failed to find working motherboards for its customers …
“Suit over faulty computers highlights Dell’s decline”, New York Times, June 28, 2010.
How our tools shape our mind
Mankind’s first maps, scratched in the dirt with a stick or carved into a stone with another stone, were as rudimentary as the scribbles of toddlers. Eventually the drawings became more realistic, outlining the actual proportions of a space, a space that often extended well beyond what could be seen with the eye. As more time passed, the realism became scientific in both its precision and its abstraction. The mapmaker began to use sophisticated tools like the direction-finding compass and angle-measuring instruments, and to rely on mathematical reckonings and formulas. Eventually, in a further intellectual leap, maps came to be used not only to represent vast regions of the earth or heavens in minute detail, but to express ideas – a plan of battle, an analysis of the spread of an epidemic, a forecast of population growth.
The historical advances in cartography didn’t simply mirror the development of the human mind. They helped propel and guide the very intellectual advances that they documented. The map is a medium that not only stores and transmits information, but also embodies a particular mode of seeing and thinking. As mapmaking progressed, the spread of maps also disseminated the mapmaker’s distinctive way of perceiving and making sense of the world. The more frequently and intensively people used maps, the more their minds came to understand reality in the maps’ terms.
The influence of maps went far beyond their practical employment in establishing property boundaries and charting routes. “The use of a reduced, substitute space for that of reality,” explains the cartographic historian Arthur Robinson, “is an impressive act in itself.” But what’s even more impressive is how the map “advanced the evolution of abstract thinking” throughout society. “The combination of the reduction of reality and the construct of an analogical space is an attainment in abstract thinking of a very high order indeed,” writes Robinson, “for it enables one to discover structures that would remain unknown if not mapped.” The technology of the map gave to man a new and more comprehending mind, better able to understand the unseen forces that shape his surroundings and his existence.
Nicholas Carr, “The map and the mind”, National Geographic Assignment blog, July 22, 2010, http://nationalgeographicassignmentblog.com/2010/07/22/the-map-and-the-mind/
How low-cost rivals steal a march on their premium competitors
When low-cost competitors appear, one of the toughest decisions facing executives in companies with premium products and brands is whether to respond. Should the company or business unit adjust its strategy to meet the low-cost threat or should it continue business as usual, with no change in strategy or tactics?
As these established companies attempt to define the nature and magnitude of the challenge, they often underestimate it. Sometimes executives are so focused on their traditional competitors, they don’t even recognize the threat developing from low-cost rivals. What executive isn’t familiar with the case of the low-cost airline Ryanair and its hugely successful entry into the European market at the expense of the region’s traditional carriers? Likewise, were the world’s leading telecommunications companies too busy competing with one another to recognize the threat from the Chinese low-cost competitor Huawei, now a leader in fixed-line networks, mobile-telecommunications networks, and Internet switches? Then there was Vizio, a little-known LCD TV supplier that overtook the premium brands in five years to become the North American market leader in large-format TVs. Complacency and arrogance produce blind spots that delay a response and leave incumbents vulnerable.
But our study of low-cost competitors suggests that they also build momentum in slower-moving and more subtle ways – factors that established players might do well to pay closer attention to. At times, low-cost challengers build their presence stealthily by competing in undeveloped segments of a market. Or they can narrow capability gaps by tapping the look, feel, and suppliers of bigger rivals. In other cases, competition between low-cost entrants can produce unintended second-level effects that escape the notice of incumbents until it’s too late to prevent a severe erosion of their market position.
Adrian Ryans, “When companies underestimate low-cost rivals,” McKinsey Quarterly, June 2010.
Dealing with risk: avoidance vs resilience
Three recent events serve as a reminder that building resilience into organizations – in effect, thinking the unthinkable and preparing to face it – may serve us better than risk avoidance.
The first is the case of volcanic ash from Iceland that shut down significant air corridors between Europe and North America for days, shutdowns which promise to recur at unpredictable intervals. Before the unavoidable ash cloud from the Eyjafjallajökull volcano appeared, standard procedure in the airline business was not to fly through ash – in other words, to avoid it completely. When the unthinkable happened – an ash cloud so large that it couldn’t be avoided – everyone was unprepared. Only now is the industry envisioning a world in which large clouds of ash may be a permanent feature.
The second, of course, is the untamed BP oil spill in the Gulf of Mexico. Whatever mistakes were made in the prevention department, it is absolutely clear that little thought went into preparing to respond if something did go wrong.
And the third was described in a recent Wall Street Journal article, “Using science against suicide bombs”. The conventional approach to combating suicide bombings has been to try to prevent them through security and vigilance. A complementary approach is being explored by US-educated Fulbright scholar Zeeshan-ul-hassan Usmani, who suggests that systems be designed to minimize the damage bombs can cause. Simulation software he’s developing indicates (among other things) that when bombs go off in mosques, where people sit in rows, fewer deaths occur than when they go off in crowded places, where people randomly move around. His research suggests that, if event organizers in threatened locations used rows rather than circular seating, it could reduce deaths and injuries by up to 25%.
Of course, we would always prefer to avoid negative outcomes if possible, and organizations should certainly invest in prevention. It may be wise to remember, though, that investing in resilience can be a complementary and essential component of preparing to face risks.
Rita McGrath, “The benefits of thinking the unthinkable,” Dynamic Strategies, July 7, 2010, http://blogs.hbr.org/hbr/mcgrath/2010/07/the-benefits-of-thinking-the-u.html
Learning from the worst cases
Disasters teach more than successes.
While that idea may sound paradoxical, it is widely accepted among engineers. They say grim lessons arise because the reasons for triumph in matters of technology are often arbitrary and invisible, whereas the cause of a particular failure can frequently be uncovered, documented and reworked to make improvements.
Disaster, in short, can become a spur to innovation.
There is no question that the trial-and-error process of building machines and industries has, over the centuries, resulted in the loss of much blood and many thousands of lives. It is not that failure is desirable, or that anyone hopes for or aims for a disaster. But failures, sometimes appalling, are inevitable, and given this fact, engineers say it pays to make good use of them to prevent future mistakes.
The result is that the technological feats that define the modern world are sometimes the result of events that some might wish to forget.
“It’s a great source of knowledge – and humbling, too – sometimes that’s necessary,” said Henry Petroski, a historian of engineering at Duke University and author of Success Through Failure, a 2006 book. “Nobody wants failures. But you also don’t want to let a good crisis go to waste.”
“Taking lessons from what went wrong”, New York Times, July 29, 2010.
Herman Miller reinvents itself
At the start of the 2000s, Michael Volkema, then the chief executive officer of Herman Miller Inc., became convinced that growth in the white-collar workforce was going to slow in the company’s main markets. That was a threat to this office-furniture maker, based in Zeeland, Mich., whose revenues depended on products sold to the white-collar workforce – products such as office desks, chairs, panels, shelves, and cabinets. Volkema’s solution was to create the Creative Office, a capability within Herman Miller for identifying adjacent markets in which the company could build businesses that would provide significant new streams of revenue.
The CEO chose Gary Miller, a 26-year company research veteran, to spearhead the effort, with the aspiration of doubling the size of the company’s business playing field in three to five years. Miller (no relation to the Miller in the company name) knew he would be exploring unfamiliar market territory. Although he would stay within the boundaries of office interiors, he would need to step beyond Herman Miller’s traditional niche making furniture and cubicles.
Still, Miller didn’t want to butt heads with incumbent companies. Why compete with giants dominating existing markets? “Gary went out and asked, ‘What are the unsolved problems out there?’” says Brian Walker, the company’s former chief financial officer, who took over as CEO in 2004. “He didn’t ask, ‘How do I respond to the market for specific products like lighting?’”
Miller’s multiyear research and development effort, which included creating a partnership with West Coast and East Coast technologists and architects, led to a burst of new concepts. In lighting, for example, GE, Philips, and Osram Sylvania were then focusing on light-emitting diodes (LEDs) as substitutes for standard incandescent light fixtures. Miller and his team saw an alternative: using the low-voltage DC power of LEDs for novel kinds of illumination – light tunnels, walls, lighted objects, wearable light. Why restrict lights to conventional overhead fixtures? Why not integrate them into office furniture and fixtures in new ways?
That effort led to a suite of product prototypes dubbed Programmable Environments, and later to a new business named Convia. Among the prototypes were illuminated, movable “visual shields” that changed color and a suspended wall with integrated LEDs. Integral to the new product suite was the notion of programmability. Office workers themselves would be able to use various devices, including their desktop computers, to reconfigure and reprogram the office environment. The new hardware and software allowed Miller and his team to redefine how people would think about personal space, office geometry, privacy, and illumination. In the end, the R&D project spawned 25 patent applications, and Convia was established as a Herman Miller subsidiary in 2006.
Bill Birchard, “Herman Miller’s design for growth”, Strategy+Business, Summer 2010.
Craig Henry, Strategy & Leadership’s intrepid media explorer, collected these sightings of strategic management in the news. A marketing and strategy consultant based in Carlisle, Pennsylvania, he welcomes your contributions and suggestions (Craighenry@aol.com).