Emerald Group Publishing Limited
Copyright © 2001, MCB UP Limited
Pricing Trends for Personal Computers: Moore's Law, Wilson's Corollary, and Reality
Thomas C. Wilson
Back in the mid-1960s, we were treated to the oft-cited projection of Intel Corporation co-founder Gordon Moore, namely that the number of transistors on an integrated circuit would double every two years. Soon after, the projection was revised to transistors doubling every 18 months. Over the years, others have modified this remarkably accurate forecast in such a way as to suggest not only that the number of transistors doubles (and therefore the processing power), but also that the price is halved. This version would suggest a fourfold improvement in the CPU performance/price ratio with each generation.
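To illustrate the compounding this popular version of the law implies, here is a minimal sketch (the starting values are normalized and hypothetical, not figures from any actual product line):

```python
# Hypothetical, normalized starting values: if each 18-month generation
# doubles processing power and halves price, the performance/price
# ratio quadruples with every generation.
power, price = 1.0, 1.0
for generation in range(1, 4):
    power *= 2  # transistor count, and thus processing power, doubles
    price /= 2  # price is halved
    print(f"generation {generation}: performance/price = {power / price:.0f}x")
```

After three generations, the ratio stands at 64 times the baseline, which is why such projections promise dramatic drops in the cost of computing.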
No wonder we have heard time and again about the overall cost of computing diminishing rapidly over time. Anecdotally, this appears to be true. After all, anyone who has watched the seemingly infinite increase in the processing power of microcomputers from the late 1970s to the early 2000s must be convinced of the truth of this matter.
Well, let's look at this a bit more closely. Someone once said, "The computer you really want always costs $4,500." Here's where Wilson's Corollary kicks in (Wilson, 1998). My basic premise is that, while the processing power of CPUs has clearly doubled at least every 18 months for three decades, there are other factors to consider when discussing the overall selling price of a computer. First, the CPU is not the only cost element. There are many other parts of a computer, the cost of which is less influenced by the number of transistors that can be manufactured within a square millimeter (e.g. monitors, drives, peripherals, etc.).
The second part of Wilson's Corollary posits the effect of "application demand." In 1982, we were impressed by less computing power than we are wowed by today. In other words, what we imagined and worked on automating in 1982 was nowhere near as complex and demanding (in processing terms) as what we envision today. Over time, our expectations, and therefore our processing needs, increase substantially. Thus, it is not sufficient to compare the progress of computers solely on the basis of CPU density applied to a static application environment. To understand what has been happening, we need to admit that what sufficed for adequate computing in 1990 will no longer serve our purposes. Perhaps that seems overly obvious. It does, however, suggest that computer costs have not dropped as precipitously as some would suggest.
Case in point: At the University of Houston Libraries for a number of years we have held the amount we spend per desktop computer relatively constant. For these units, we would generally not purchase the absolute high end, but would instead select a configuration down a notch or two. This strategy permitted us to invest in larger quantities of adequate technology without paying the premium for being "first-on-the-block" users. At the same time we did not purchase two-year-old technology. Table I indicates the per-computer cost ranges over time.
Table I. Per-computer cost ranges over time
Clearly, for us, the price per unit has decreased over this 14-year period (by approximately 20 percent), but not at the rate that many projections have suggested. Why? Because our ideas about, and demands for, what a computer can and should do have concomitantly increased, nearly offsetting the cost-decreasing force of Moore's Law.
But Wait A Minute ...
As with all predictions and models, they only approximate reality. In fact, with regard to computer prices, there has been another factor at play, particularly in the past year or two. It is now possible for approximately $1,100 to purchase a significantly higher capacity computer (in terms of processor speed, RAM, and drive space) than could have been purchased for $1,600 just two years ago. What accounts for this 31 percent drop in price? Market forces!
Yes, Moore's Law is still chugging away, effecting a persistent and relatively constant decrease in the cost of computer internals, at least for those items that use integrated circuits. (This phenomenon also accounts for the substantial reduction in component size.) This effect alone, however, cannot explain CPUs in the $200 range, multi-gigabyte drives in the $150 range, CD-ROM drives for $30, and fully loaded motherboards for $125. These outcomes are also driven by competition and demand. As prices decrease, the world market for computers grows, which in turn fuels more competition that then leads to even lower prices. The cycle continues over and over, or so the thinking goes. But isn't there some hitch to this?
Another piece of the puzzle is the issue of profit margins for these devices. When a computer sells for $4,000 with a projected profit margin of 7 percent, the company makes $280. When a similar machine goes for $1,000, the company gets only $70. This difference has to be made up in volume (i.e. sell more computers at a lower price per unit). In essence, for every fractional decrease in selling price, sales volume must be multiplied by 1/(1 − fractional decrease) in order to maintain a constant revenue stream. Most companies, however, won't settle for a constant revenue stream; they wish to grow. Thus, the sales volume must be increased at an even higher rate.
This little mathematical game suggests that, in the long run, only larger manufacturers that can sustain high volume sales will survive. Meanwhile the press of Moore's Law, Wilson's Corollary, and market demand marches on. One could easily reach the conclusion that this process will continue until there are no more manufacturers who can afford to make computers so cheaply!
So, What Does This Mean for Libraries?
First, I believe it is crucially important to be intellectually honest. We operate in a political environment that is overly infatuated with technology and the promises proffered by its apostles. We cannot afford to believe that the advances in technology will eventually create powerful devices that will be so cheap to manufacture that they will be given away free. In fact, just like calculators, when computers get so cheap that they are free, those particular models won't be able to do what we need them to do.
While we have seen some amazing reductions in computer prices, we don't want to be caught buying machines that have a shortened technological or functional life, simply because we wanted to save a few bucks upfront (or worse yet because we believed the technology should be less expensive than it is).
More to the point, for technology budget planning it makes sense to select a price point neither at the low end nor at the high end. Furthermore, it might be worthwhile to consider only manufacturers that appear to be able to sustain large volume production. It would also inform our thinking if we stopped approaching any technological expenditure as a one-time investment. Computers in particular have become consumables. Ah, but that's a topic for another column.
The mathematics for this limited analysis is:
p = profit percentage (constant)
h = higher unit price
u = lower unit price
n = higher-price sales quantity
q = lower-price sales quantity
r = fractional reduction in unit price
Assuming that a constant revenue stream must be maintained, then: phn = puq, which is the same as: hn = uq. The lower unit price reflects the price reduction: u = (1 − r)h. Substituting: hn = (1 − r)hq, so: q = n/(1 − r). Therefore, for every fractional decrease r in unit price, sales volume must be multiplied by 1/(1 − r).
Applying this to the example in the text:
initial unit price = h = 4,000
initial quantity = n = 5
lower unit price = u = 1,000
drop in unit price = r = 0.75
q = n/(1 − r) = 5/(1 − 0.75) = 5/0.25 = 20
Check: hn = 4,000 × 5 = 20,000 = 1,000 × 20 = uq
Thus 20 units is the sales quantity necessary to maintain a constant revenue stream: a 75 percent decrease in unit price requires a fourfold (300 percent) increase in sales volume.
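The arithmetic above can be checked with a short script (the function name is mine; the figures are those of the example):

```python
def revenue_neutral_quantity(n, r):
    """Sales quantity needed after a fractional price cut r
    to keep total revenue unchanged: q = n / (1 - r)."""
    return n / (1 - r)

h, n = 4000, 5         # initial unit price and sales quantity
r = 0.75               # 75 percent price reduction
u = (1 - r) * h        # new unit price: 1,000
q = revenue_neutral_quantity(n, r)
print(q)               # 20.0 units: a fourfold increase in volume
print(h * n == u * q)  # True: revenue is 20,000 either way
```

Note that a fourfold volume is a 300 percent increase over the original five units, which underscores how steeply sales must climb just to stand still.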
Wilson, T.C. (1998), The Systems Librarian: Designing Roles, Defining Skills, American Library Association, Chicago, IL, pp. 137-9.
Tom Wilson (Thomas.Wilson@mail.uh.edu) is Head, Systems Department, University of Houston Libraries, Houston, Texas, USA.