Strategy in the media

Strategy & Leadership

ISSN: 1087-8572

Article publication date: 6 November 2009


Citation

Henry, C. (2009), "Strategy in the media", Strategy & Leadership, Vol. 37 No. 6. https://doi.org/10.1108/sl.2009.26137faf.001

Publisher: Emerald Group Publishing Limited

Copyright © 2009, Emerald Group Publishing Limited



Article Type: CEO advisory. From: Strategy & Leadership, Volume 37, Issue 6.

Craig Henry, Strategy & Leadership’s intrepid media explorer, collected these sightings of strategic management in the news. A marketing and strategy consultant based in Carlisle, Pennsylvania, he welcomes your contributions and suggestions (Craighenry@aol.com).

From problem-solving to problem-finding

This book has argued that leaders at all levels must develop their problem-finding skills … Becoming an effective problem-finder requires a different mindset, though, not simply a set of new behaviors and competencies. That mindset begins with a certain level of intellectual curiosity. You must be willing to ask questions, seeking always to learn more about both the familiar and the unfamiliar …

Systemic thinking

Successful problem-finders not only exhibit a curious mindset, but they also embrace systemic thinking. They recognize that small problems often do not occur due to the negligence or misconduct of an individual. Instead, small errors frequently serve as indicators of broader systemic issues in the organization. Effective problem-finders do not rush to find fault and assign blame when they spot a mistake being made. They step back and question why that error occurred. They ask whether more fundamental organizational problems have created the conditions that make that small error more likely to occur. Effective problem-finders recognize that you might fire the person who made an error on the front lines, but if you do not address the underlying systemic issues, the same errors will occur again and again. …

Healthy paranoia

Andy Grove, former Chairman and CEO of Intel, once wrote a book titled Only the Paranoid Survive. In the preface, he described himself as quite a worrier. He said that he worried about everything from manufacturing problems to competitive threats to the failure to attract and retain the best talent. Many concerns kept him up at night. Grove argued that he believed fervently in the “value of paranoia.” He felt that leaders must never allow themselves to get comfortable, no matter how successful they had become. They had to devise ways of staying in touch with those in the organization who were willing to challenge the conventional wisdom, and who might alert them to bad news.

Michael A. Roberto, What You Don’t Know: How Great Leaders Prevent Problems Before They Happen (Wharton School Publishing, 2009).

Studying earthquakes – real and financial

Scientists, sometimes in cooperation with economists, are taking the lead in a young field that applies complexity theory to economic research, rejecting the traditional view of the economy as a fully transparent, rational system striving toward equilibrium. The geophysics professor and earthquake authority Didier Sornette, for example, leads the Financial Crisis Observatory, in Zurich, which uses concepts and mathematical models that draw on complexity theory and statistical physics to understand financial bubbles and economic crises.

Sornette aims to predict extreme outcomes in complex systems. Many other scientists in the field of complexity theory argue that earthquakes, forest fires, power blackouts, and the like are extremely difficult or even impossible to foresee because they are the products of many interdependent “agents” and cascades of events in inherently unstable systems that generate large variations. One symptom of such a system’s behavior is that the frequency and magnitude of outcomes can be described by a mathematical relationship called a “power law,” characterized by a short “head” of frequently occurring small events, dropping off to a long “tail” of increasingly rare but much larger ones …
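The short-head, long-tail pattern the article describes can be made concrete with a small simulation. This is an illustrative sketch, not taken from Sornette's or McKinsey's work: it draws event "magnitudes" from a Pareto-type power-law distribution (the `alpha` tail exponent of 1.5 is an arbitrary choice for illustration) and counts how many events fall in the small and large bands.

```python
import random

# Illustrative sketch: sample event magnitudes from a power-law
# (Pareto) distribution and observe the short head / long tail.
# alpha is a hypothetical tail exponent; real systems estimate it from data.
random.seed(42)
alpha = 1.5  # smaller alpha => heavier tail => more extreme events

magnitudes = [random.paretovariate(alpha) for _ in range(100_000)]

# Frequent small events form the "head"; rare huge ones form the "tail".
small = sum(1 for m in magnitudes if m < 2)
large = sum(1 for m in magnitudes if m >= 10)

print("events below 2:", small)    # the vast majority
print("events of 10 or more:", large)  # rare, but each far larger
```

The point of the sketch is the asymmetry: most draws cluster near the minimum, yet the largest draws dwarf the typical one, which is exactly why point prediction of the extremes is so hard.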

This suggests that the economy, like other complex systems characterized by power law behavior, is inherently unstable and prone to occasional huge failures. Intriguing stuff, but how can corporate strategists, economists, and policy makers use it? This is still a young field of research, and the study of power law patterns may be part of the answer, but it isn’t too early to consider and discuss potential implications.

Make the system the unit of analysis. You can’t assess the behavior and performance of a specific agent – for example, a financial-services company – without gauging the behavior and performance of the system in which it is embedded.

Don’t assume stability and do take a long look back. Major systemic imbalances and corrections are highly likely, and everyone should be wary of new economic paradigms to the contrary.

Focus on early warning. The inherent uncertainty of complex systems makes point predictions unreliable. Much as earthquake scientists are developing tsunami early-warning systems, corporate strategists should monitor potential indications that economic stress might be building in their industries.

Build flexible business models. Corporate leaders might consider robust business models incorporating some slack and flexibility instead of the models most common today, which aim to optimize value in the most likely future scenario and thus leave companies exposed when conditions change dramatically.

Learn from scientists studying other complex systems. Strategists, economists, and others should consider several other potential parallels. To take one example, what economic-policy lessons could be drawn from the observation that efforts to put out small forest fires quickly may in time lead to large-scale fires, because the rapid mitigation of small ones allows burnable undergrowth to accumulate?

Michele Zanini, “‘Power curves’: What natural and economic disasters have in common,” McKinsey Quarterly, June 2009.

Knowledge flows and competitive advantage

One reason traditional measures alone don’t capture the challenges and opportunities for US companies and the national economy is that the digital infrastructure supporting the lion’s share of industries has sustained rapid performance improvements – especially in computing power, bandwidth, and storage. Previous infrastructures experienced sharp bursts of innovation in underlying technologies, such as the telephone and the internal combustion engine, and then quickly stabilized. Today, we do not yet see any signs of stabilization, which suggests not only that competitive intensity (which has more than doubled in the past 40 years) will continue to build but also that the digital infrastructure will keep boosting the potential – and necessity – for business innovation.

To help managers in this decidedly challenging time, we present a framework for understanding three waves of transformation in the competitive landscape: foundations for major change; flows of resources, such as knowledge, that allow firms to enhance productivity; and the impacts of the foundations and flows on companies and the economy. Combined, those factors reflect what we call the Big Shift in the global business environment …

Elements of the Big Shift

The first, foundational wave in the Big Shift consists of the extraordinary changes in digital infrastructure that enable vastly greater productivity, transparency, and connectivity. Consider how companies can use digital technology to create ecosystems of diverse, far-flung users, designers, and suppliers in which product and process innovations fuel performance gains without introducing too much complexity.

The second wave involves the increasing movement of knowledge, talent, and capital. Knowledge flows – which occur in any social, fluid environment where learning and collaboration can take place – are quickly becoming one of the most crucial sources of value creation. … Twentieth-century institutions built and protected knowledge stocks – proprietary resources that no one else could access. The more the business environment changes, however, the faster the value of what you know at any point in time diminishes. In this world, success hinges on the ability to participate in a growing array of knowledge flows in order to rapidly refresh your knowledge stocks. …

Knowledge flows can help companies gain competitive advantage in an age of near-constant disruption. The software company SAP, for instance, routinely taps the more than 1.5 million participants in its Developer Network, which extends well beyond the boundaries of the firm. Those who post questions for the network community to address will receive a response in 17 minutes, on average, and 85% of all the questions posted to date have been rated as “resolved.” By providing a virtual platform for customers, developers, system integrators, and service vendors to create and exchange knowledge, SAP has significantly increased the productivity of all the participants in its ecosystem.

John Hagel III, John Seely Brown, and Lang Davison, “The Big Shift: measuring the forces of change,” Harvard Business Review, July-August 2009.

Re-inventing financial economics

The finance industry is in the midst of a transformative period of evolution, and financial economists have a huge agenda to tackle. They should do so quickly, given the determination of politicians to overhaul the regulation of financial markets.

One task, also of interest to macroeconomists, is to work out what central bankers should do about bubbles – now that it is plain that they do occur and can cause great damage when they burst. Not even behavioralists such as Mr Thaler would want to see, say, the Fed trying to set prices in financial markets. He does see an opportunity, however, for governments to “lean into the wind a little more” to reduce the volatility of bubbles and crashes. For instance, when guaranteeing home loans, Freddie Mac and Fannie Mae, America’s giant mortgage companies, could be required to demand higher down-payments as a proportion of the purchase price, the higher house prices are relative to rents.
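The "lean into the wind" rule Thaler sketches, requiring larger down payments the more house prices outrun rents, can be illustrated with a toy function. All of the numbers below (the baseline 10% down payment, the "normal" price-to-rent ratio of 15, the 2% slope, the 50% cap) are invented for illustration; the article proposes only the qualitative rule.

```python
# Hypothetical illustration of a counter-cyclical down-payment rule:
# the required down payment rises with the house-price-to-rent ratio.
def required_down_payment(price_to_rent: float,
                          baseline: float = 0.10,
                          normal_ratio: float = 15.0,
                          slope: float = 0.02) -> float:
    """Return the required down payment as a fraction of purchase price.

    All parameter defaults are illustrative assumptions, not policy values.
    """
    excess = max(0.0, price_to_rent - normal_ratio)  # how bubbly is the market?
    return min(0.50, baseline + slope * excess)      # capped at 50%

print(required_down_payment(15))  # normal market: just the baseline
print(required_down_payment(25))  # frothy market: baseline plus a surcharge
```

A rule of this shape automatically tightens credit as a bubble inflates and relaxes it as prices return to fundamentals, without requiring anyone to set prices directly.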

Another priority is to get a better understanding of systemic risk, which Messrs Scholes and Thaler agree has been seriously underestimated. A lot of risk-managers in financial firms believed their risk was perfectly controlled, says Mr Scholes, “but they needed to know what everyone else was doing, to see the aggregate picture.” It turned out that everyone was doing very similar things. So when their VaR (value-at-risk) models started telling them to sell, they all did – driving prices down further and triggering further model-driven selling.

Several countries now expect to introduce a systemic-risk regulator. Financial economists may have useful advice to offer. Many of them see information as crucial. Data should be collected from individual firms and aggregated. The overall data should then be published. That would be better, they think, than a system based solely on the micromanagement of individual institutions deemed systemically significant. …

Financial economists also need better theories of why liquid markets suddenly become illiquid and of how to manage the risk of “moral hazard” – the danger that the existence of government regulation and safety nets encourages market participants to take bigger risks than they might otherwise have done. The sorry consequences of letting Lehman Brothers fail, which was intended to discourage moral hazard, showed that the middle of a crisis is not the time to get tough. But when is?

Mr Lo has a novel idea for future crises: creating a financial equivalent of the National Transport Safety Board, which investigates every civil-aviation crash in America. He would like similar independent, after-the-fact scrutiny of every financial failure, to see what caused it and what lessons could be learned. Not the least of the difficulties in the continuing crisis is working out exactly what went wrong and why – and who, including financial economists, should take the blame.

“Efficiency and beyond,” The Economist, July 16, 2009.

Financial regulation after the bailouts

Here’s a really scary thought. Now that the federal government has poured hundreds of billions of dollars into saving financial institutions deemed “too big to fail,” hasn’t it implicitly guaranteed similar largesse for all such institutions in the future? In its rush to help, has the government unwittingly created the mother of all moral hazards – implicit rescue guarantees as far as the eye can see?

No doubt about it, says HBS professor and economic historian David Moss. “The extension of implicit guarantees to all systemically significant institutions takes moral hazard in the financial system to an entirely new level,” he warns. But Moss has a fix: The federal government should slap tough new regulations on all firms that pose “systemic risk” – the risk that a failure of one institution could wreak havoc across the entire financial system.

“In too many cases, regulators chose not to use tools they already had, or they neglected to request new tools to meet the challenges of an evolving financial system.” – David Moss

Among the proposed new regulations: higher capital requirements; leverage limits; FDIC-like insurance charges; and, when all else fails, a receivership process to restructure, sell, or liquidate a failing company. Bottom line, no firm should be too big to fail. At the same time, the majority of financial firms that pose no systemic risk should face relatively light regulation, ensuring their continued dynamism and innovation …

To take the guesswork out of such judgments, HBS senior lecturer Robert Pozen, chairman of MFS Investment Management in Boston, told a Senate committee on March 4 that regulators need look no further than five factors historically associated with financial crises: inflated prices of real estate, institutions with high levels of leverage, new products falling into regulatory gaps, rapid growth in an asset class or intermediary, and mismatches of assets and liabilities.

“‘Too big to fail’: reining in large financial firms”, Knowledge@Wharton, June 22, 2009, http://knowledge.wharton.upenn.edu/article.cfm?articleid=2286

Integrating expert knowledge with the wisdom of crowds

The proponents of 2.0 thinking on user-generated content, be they fans of Web 2.0, Enterprise 2.0, Health 2.0, or Rhubarb 2.0, would have us believe that their highly participative approach is the only one that works. And indeed, there is an appeal in democratizing content creation and management. However, in almost every case there is also value in professional involvement.

Take health care, for example. A couple of months ago there was a conference in Boston (I was unable to attend) on “Health 2.0 meets Ix.” For those unfamiliar with this debate, Health 2.0 fans advocate patients taking control of their own health care and sharing information across patient communities, rather than turning it over to professionals. “Ix” refers to “information therapy,” which is shorthand for the scientific/medical establishment’s 1.0 use of research, clinical trials, and licensed practitioners to fight disease. If you were sick, would you rely on Health 2.0 or 1.0?

In all likelihood, you’d go for both. You’d listen to what the medical establishment prescribes for your ailment, but you’d probably also check out blogs, wikis, and other patient-generated content and communities. As health care analytics expert Blake Zenger notes in his blog, Health 2.0 versus Ix is a false dichotomy. We can embrace the virtues of democratized health content without throwing away the benefits of professionalism and science. Depending on the circumstances, your personal equation may be 1.3 or 1.8, but you’re going to want both medical science and the comments of those who have opined on their own health situations.

The same is true in many other content settings. … Even Wikipedia, often held out as the epitome of 2.0 content, has increasingly employed (mostly unpaid) editors to monitor, verify, and sometimes even create content.

Inside enterprises, the same 1.5 mix is often desirable. For example, I once heard Steve Schmidt, the CIO of Vertex Pharmaceuticals, describe his company’s use of wikis for capturing the results of research. He said that the most successful ones are “curated” – facilitated and edited by humans whose job it is to do so. Before blogs and wikis came along, the same was true of discussion databases. The best ones typically had facilitators and online community organizers.

Of course, it’s more romantic and revolutionary to assert that only the masses can generate useful content. It’s appealing that the hoi polloi can replace experts, editors, and experienced professionals. It just doesn’t happen to be true. The key word is “augment,” not “replace.” 1.5 is greater than either 1.0 or 2.0.

Tom Davenport, “Why 1.5 is greater than 2.0,” The Next Big Thing, June 15, 2009, http://blogs.harvardbusiness.org/davenport/2009/06/why_15_is_greater_than_20.html?loomia_ow=t0:s0:a38:g2:r2:c0.090825:b26445344:z6

How customers buy

Marketing has always sought those moments, or touch points, when consumers are open to influence. For years, touch points have been understood through the metaphor of a “funnel” – consumers start with a number of potential brands in mind (the wide end of the funnel), marketing is then directed at them as they methodically reduce that number and move through the funnel, and at the end they emerge with the one brand they chose to purchase. But today, the funnel concept fails to capture all the touch points and key buying factors resulting from the explosion of product choices and digital channels, coupled with the emergence of an increasingly discerning, well-informed consumer. A more sophisticated approach is required to help marketers navigate this environment, which is less linear and more complicated than the funnel suggests. We call this approach the consumer decision journey.

  • In the traditional funnel metaphor, consumers start with a set of potential brands and methodically reduce that number to make a purchase.

  • The decision-making process is now a circular journey with four phases.

  • Two-thirds of the touch points during the active-evaluation phase involve consumer-driven activities such as Internet reviews and word-of-mouth recommendations from friends and family.

We developed this approach by examining the purchase decisions of almost 20,000 consumers across five industries and three continents. Our research showed that the proliferation of media and products requires marketers to find new ways to get their brands included in the initial-consideration set that consumers develop as they begin their decision journey. We also found that because of the shift away from one-way communication – from marketers to consumers – toward a two-way conversation, marketers need a more systematic way to satisfy customer demands and manage word-of-mouth. In addition, the research identified two different types of customer loyalty, challenging companies to reinvigorate their loyalty programs and the way they manage the customer experience.

Finally, the research reinforced our belief in the importance not only of aligning all elements of marketing – strategy, spending, channel management, and message – with the journey that consumers undertake when they make purchasing decisions but also of integrating those elements across the organization. When marketers understand this journey and direct their spending and messaging to the moments of maximum influence, they stand a much greater chance of reaching consumers in the right place at the right time with the right message.

David Court, Dave Elzinga, Susan Mulder, and Ole Jørgen Vetvik, “The consumer decision journey”, McKinsey Quarterly, June 2009.

Branding when customers talk back

In a world where consumers trust each other more than they trust brands, we have to fix what’s really broken – the products, services, and experiences that people buy. And thank God for that. For the consumer the Internet made things better forever. It will also make things better for those brands that choose to actively shape their own destinies.

In fact we’re already seeing exactly this from brands both large and small. Here are three things that leaders seem to have in common.

  1. Innovate by leading your customer. Consumers don’t always want you to ask them what they want. Often they don’t know what they want, because it hasn’t been done yet. Just look at Flip Video from Pure Digital: on the market for three years, selling two million units, and inspiring Cisco to buy the whole firm for $590 million. The secret? Innovating a line of incredibly cheap, simple video cameras that easily put video on the Web. The product of focus groups? Absolutely not. The product of a true unmet need that Flip could meet? Absolutely.

  2. Create amazing experiences. If the experiences you create aren’t unique, you’re a commodity. In a conversation-driven world, no amount of advertising can fix that. Instead, you must focus on what your unique experience will be – for Amazon, it’s about choice, service, and the community of users; for Virgin America, it’s about style and modernity. Whatever you decide to base your experience on, you have to brutally ensure that everything the customer experiences is consistent with it.

  3. Get involved with the conversation. Social media allows you to listen to the people who are talking about you, or to you, and then engage them right back.

Paul Worthington, “To live in interesting times,” Fast Company Blog, August 3, 2009, www.fastcompany.com/blog/paul-worthington/paul-worthington-wolf-olins/live-interesting-times

Getting started with social media

If you’ve got an experienced social media team, a solid budget and an appetite for innovation, you can create an original online presence that engages your customers or supporters in an entirely new kind of online experience.

But many organizations lack the time, budget or experience to start from scratch. That doesn’t limit your social media options to a generic corporate news blog or a standard-issue Facebook page. Here are three great options for robust social media presences that let you manage cost and risk by building on existing tools and established best practices.

1. Suggestion Box

What is it: Invite your customers, supporters or employees to submit their ideas and suggestions for new products, services or improvements. Community members get to rate submissions so the best ideas rise to the top; it’s your job to ensure top suggestions get implemented.

How to do it: Build your own site using a content management system; many now offer a Digg-style submission and voting system as an add-on. Or use a pre-fab solution like the Salesforce software that powers Dell’s site, or the turn-key UserVoice, designed specifically for managing customer suggestions …

2. Widget

What is it: Create an interactive badge your customers or supporters can place on their Facebook pages or blogs. A widget can display your latest news, deals or contests, invite Twitter-style updates, or solicit donations.

How to do it: Use a service like SproutBuilder or WidgetBox to create a simple widget with content updates powered by your RSS feed; for non-standard approaches, a web developer or programmer can create something from scratch.

3. Deal-of-the-day

What is it: Create an online presence that lets people know about a special, time-limited offer. It could be a product available in limited quantities, a discounted service, or donation matching. Update your offer regularly so there’s a reason for your audience to check back frequently.

How to do it: Create a Twitter feed or Facebook page that you update once a day (or even once or twice a week) with a special offer. Promote your feed or page to fans or potential customers so they can track deals in real time.

Alexandra Samuel, “Three instantly effective social media ideas,” Conversation Starter, July 27, 2009, http://blogs.harvardbusiness.org/cs/2009/07/three_instant_social_media_sol.html

Putting customers at the center of design

Antonella is an attractive 28-year-old woman who lives in Rome. Her life is focused on friends and fun, clubbing and parties.

She is also completely imaginary.

But her influence is definitely real. It is evident in the design of the Ford Fiesta, on sale in Europe now and arriving in the United States next summer as a 2011 model.

Antonella was the guiding personality for the Ford Verve, a design study that served as the basis for the latest-generation Fiesta. A character invented by Ford designers to help them imagine cars better tailored to their intended customers, she embodies a philosophy that guides the company’s design studios these days: to design the car, first design the driver.

Antonella is the personification of a profile created from demographic research about the Fiesta’s target customer, said Moray Callum, executive director of Ford Americas design.

Ford is using characters like Antonella to bring a human element to the dry statistical research drawn from polls and interviews. Based on psychological profiles, these characters are a more modern version of the “theme boards” that designers once covered with snapshots and swatches of material to inspire a design. They are also like avatars, those invented characters used in online games and forums to symbolize a participant’s personality.

“Invented characters get everyone on the same page,” Mr. Callum said. “Personalizing gives context to the information we have. Sometimes the target demographics are difficult to relate to by, say, a 35-year-old male designer.

“We found in the past that if they didn’t understand the buyer, designers would just go off and design something for themselves,” he added.

Murat Yalman, Ford’s director of global advanced product strategy, is a strong supporter of an approach that personalizes the ideal buyer for everyone involved in a vehicle’s development.

“You get a common focus for everyone from the clay modeler to the chief executive,” he said.

The method brings statistics to life. “It creates very memorable ideas that live on after the meeting or presentation,” Mr. Yalman said.

“Before creating the car, Ford designs the driver”, New York Times, July 19, 2009.

Unexpected barriers to innovation

I know it sounds preposterous, but it is increasingly clear to me that the professionals who generate invention – engineers, scientists and mathematicians – are often the enemies of innovation. Yes, I realize that Google and other great companies are the products of mathematical minds, but I would argue that unless Google becomes more social sciency and less science sciency, it will ossify and be replaced. Perhaps that has already begun.

Which is my point. Innovation is about social applications of inventions, not about the inventions themselves. Engineers, scientists and mathematicians don’t get this. It’s not part of their culture. We see time and again, engineering-driven corporate cultures failing because they don’t address the social needs of their customers and they don’t address the social ramifications of invention.

Motorola, for example, had working touch-screen cell phones in China years before the iPhone (they worked great with the complex Chinese writing system) but wouldn’t bring them to the US because the engineering-dominant company’s leaders focused on technology and features, not use.

P&G got into serious trouble before A.G. Lafley took over as CEO because it was a chemistry-driven culture that insisted on its scientists doing everything. Lafley turned it into a consumer-driven company and opened it to innovation from all over the globe. Lafley redesigned the corporate model for faster innovation.

There are a million examples of this. It is turning out that the US is great at invention but not so great at innovation. We need more anthropologists and sociologists working with our engineers and scientists to develop services, products and experiences that people need and want. And we need managers in companies to understand what they do and enable this doing.

Bruce Nussbaum, “Are engineers, scientists and mathematicians enemies of innovation?,” Nussbaum on Design, June 16, 2009, www.businessweek.com/innovate/NussbaumOnDesign/archives/2009/06/are_engineers_s.html

When outsourcing aids innovation

I have heard many, many senior executives talk about failure. “We have to become more tolerant of failure”, they say, or “we have to learn to fail” or “fail often, fail early”. And yet, when things do go wrong their first questions are often “who is responsible for this? Who’s accountable? Who is to blame?” Well intentioned project post-mortems turn into blamestorming sessions …

Nowhere is this struggle more intense than in innovation. You must try new ideas out to see if they work. Sometimes, despite all the research, you only know that the idea will work after launching. Innovation is a risky business.

I don’t think we have to learn how to fail. I think we have to learn how to understand risk and how to mitigate it, how to manage it.

Here’s an example. Some years ago I was working with someone from Accenture. His previous assignment had been as a Product Manager with (if my memory serves me correctly) Vodafone. He had headed a new product development that upon launch was not as successful as had been hoped and was discontinued soon afterwards.

Vodafone had been very clever. They had assessed the risk of this particular idea and decided that it was high. Too high to risk assigning one of their own Product Managers to lead the development. If it was unsuccessful, it would be highly career-limiting for that person. In their company culture, a track record of success was important for building a career. So they hired a contractor instead.

It turns out this was the normal practice for their product development group. Risky new projects were handled by contractors. Less risky ones by their own staff. If a risky product ended up being successful they either hired the contractor or replaced them with one of their own staff to take it forward in the life cycle.

This was how they managed innovation risk. It seems to me to be a lot easier than trying to change their passion-for-winning culture.

Jonathan Clarks, “Innovation is a risky business”, A Wheelbarrow Full of Surprises, July 2, 2009, http://jonathanclarks.blogspot.com/2009/07/innovation-is-risky-business.html

Wal-Mart – green leader?

Wal-Mart’s announcement of its new sustainability index marks the dawning of the age of ecological transparency in the marketplace.

This is not just idle speculation; Wal-Mart has signaled that suppliers who ignore the requirements for ecological transparency will become “less relevant” to them. In other words, suppliers may one day compete for shelf space on the basis of their transparency about the ecological impacts of their products.

The retailer’s 100,000 suppliers around the world will have to calculate and disclose the total ecological costs of their products – and that data will be boiled down into a single rating that shoppers will see right next to the price tag. For consumers, this will drop to zero the “effort cost” of finding an item’s ecological impacts, which today often means digging through a confusing forest of rating systems online, then trying to recall that information while strolling the aisles of a store.

As consumer surveys have shown for years, only a small portion, maybe ten percent, of shoppers are passionate about shopping their values; around 25 percent couldn’t care less. The action is the two-thirds in the middle, who say they would value shop if they didn’t have to make any extra effort, and if prices are comparable. And Wal-Mart has the knack for keeping costs down.

The sustainability index will be built from answers to detailed questions about impacts that range from a company’s greenhouse gas emissions and solid waste reduction targets to workers’ wages and human rights – and positive contributions to the local community. Third-party certifications will be built into the system. As the 900-pound gorilla of retail presses its suppliers for greener products, it is also inviting other huge retailers like Target and Costco to adopt the same sustainability index. That will simplify things for both suppliers and consumers. And as more and more major retailers join in, we will see a growing business imperative for perpetually upgrading the ecological impacts of consumer products.

The value chain concept gauges how each step in a product’s life adds to its worth.

But value can be seen from another angle, as embodied in the index: all the environmental, health, and social impacts of a product throughout its life cycle. By creating a single standard for evaluation, Wal-Mart opens a window on products that reveals any negatives – what might be called the “devalue chain” – and puts them into competitive play.

The strategic value of these metrics is that every negative value offers a potential for upgrading, as each upgrade improves the item’s overall score. Assessing the ecological pluses and minuses throughout a product’s life cycle offers a metric for business decisions that will boost the pluses and lessen the minuses.

Daniel Goleman, “Wal-Mart exposes the de-value chain,” Leading Green, July 17, 2009, http://blogs.harvardbusiness.org/leadinggreen/2009/07/walmarts-transparency-exposes.html
