Testimony of Mark P. Mills
Science Advisor, The Greening Earth Society
Senior Fellow, The Competitive Enterprise Institute
President, Mills-McCarthy & Associates, Inc.
before the Subcommittee on National Economic Growth, Natural Resources, and Regulatory Affairs – U.S. House of Representatives
Thank you, Mr. Chairman and distinguished Members of the Subcommittee for inviting me to speak about the energy implications of the Digital Economy. We live in a special time. It is perhaps not a totally unique time in historical terms, but it is a rare one. Times of major inflections in technology, infrastructure and the economy occur only episodically in history. I am not alone in the belief that we are only at the beginning of one of those powerful inflections, driven by what has been broadly termed the Information Revolution. The Internet is a central part of that revolution and it has only just begun to effect profound changes in our economy.
There have been many attempts to attach numbers to chronicle the growth of the Internet over this remarkable past decade. The number of people accessing the Web has grown from thousands to tens of millions. Web sites have grown from practically none to millions. Computers sold annually have risen from tens of thousands to tens of millions. Digital traffic is measured by prefixes formerly reserved for astronomers: not megabytes, or gigabytes, but petabytes. Still, traffic on the Web is doubling every several months. The entire telecommunications industry has been upended, rebuilt and expanded by the digital revolution. Commerce on the Web has exploded from nothing to tens of billions of dollars. New companies, new kinds of equipment, new services appear in a continual flow. Employment in Information Economy jobs has risen from thousands to millions. The real growth by any of these measures has been so astonishing that even the hyperbolic language of headline writers appears understated by comparison.
Against this backdrop, last year I put forth a simple proposition with a colleague that has created some controversy. The proposition is really quite simple. The Internet is using a lot of electricity, and it will use even more in the future.
Digital bits, the currency of the Information Economy, are themselves simply bundles of electrons. Every single one of the hundreds of millions of devices (PCs, routers, servers, transmitters and so on) has exactly two kinds of connections: one for bits and one for kilowatt-hours. Just how much electricity does the Internet use? We think something like 8% of the nation’s electric supply is absorbed by the sprawling and deeply penetrating hardware of the Internet. And when the broader array of all computers and related equipment is considered, in other words the heart of our new Information Economy, the total probably reaches 13% of all U.S. electricity consumption.
These ideas have been previously submitted to this Committee for the record. The basic concepts are set forth in my report for the Greening Earth Society, “The Internet Begins with Coal,” (available at www.fossilfuels.org) and an article published in Forbes magazine (5/31/99) with my colleague Peter Huber, a Senior Fellow at the Manhattan Institute.
Subsequently, two respected research organizations and a number of environmental activists have exhibited alarm at the proposition that the Internet uses large and rising amounts of electricity. Before addressing the counter claims, and their deep flaws, I should like to consider the broad context for my analysis to lend perspective to the energy requirements of the Internet.
The Internet’s Energy Transformation
If the U.S. Department of Commerce is correct, and I believe it is, in concluding that the Information Technology (IT) sector accounts for at least one-third of all GDP growth, then any policy issue that impacts IT must be considered with great caution. Energy policy is just such an issue. Because the explicit and implicit provisions of the Kyoto Protocol would directly impact every aspect of the nation’s energy supply, it is appropriate, in fact critical, to consider the energy implications of our emerging Digital Economy.
Energy underpins any economy, in effect because of the laws of physics. Put simplistically, you can’t get something for nothing. The Internet has not changed the laws of physics. Even cyberspace has an energy cost. Energy will continue to underpin our economy in the 21st century, just as it did in the 20th. But there will be one difference. In energy terms, the last century belonged to oil. This one belongs to electricity. Oil will not lose its prominent role, but it will take – and indeed, already has taken – second place to kilowatt-hours.
The dawn of the last century saw an explosion of economic activity in the creation of the automobile age. Investors and Wall Street rode chaotic markets investing in new companies. For technology historians, and Wall Street speculators, the dawn of the auto age has important analogs to the dawn of the Digital Age. One consequence of the rise of the automobile was the creation of an enormous and complex oil-related industrial infrastructure to fuel engines in all kinds of vehicles. The engine of the Digital Age is the microprocessor. Its fuel is electricity. Digital bits are bundles of electrons. The billions and even trillions of bits of data created and routed are, perforce, supported and energized by billions of watts. There’s no getting around it. Cyberspace, far from virtual, is very real and anchored in electrons. Thus, the Internet, the central driving force of the Digital Age, is both driving and reshaping the electric infrastructure.
The transformation is already in evidence. Our economy today spends four times as much purchasing electricity as oil. This is a profound reversal of the economic positions of oil and electricity 25 years ago. The only basic energy policy that makes sense in this new Digital Economy is to ensure an expanding supply of ever lower cost and ever more reliable electricity, especially considering the trends of the past decade which have been characterized by a dominance of the tools of the Information Economy.
During this past Digital Decade, consumption of electricity has risen by 650 billion kilowatt-hours. For perspective, this growth alone required more new U.S. electric supply than exists in all of Central and South America.
The increase in kilowatt-hour use occurred despite billions spent by federal and state governments and utilities to reduce electricity growth, and despite dramatic improvements in the efficiency of electric appliances, lights and motors. It occurred, I submit, in large part because of the new tools of the Digital Age.
Considering that coal supplied about one-half of the additional electricity over the past decade (about 10% from natural gas), it is easy to see the collision course this trend has with Kyoto-inspired energy policies which are explicitly and implicitly directed at reducing coal use as well as electric consumption.
The Internet & Electricity Demand
Just how much of the nation’s electricity demand is a direct result of equipment in the Digital Economy, and more specifically, the Internet? Truth be told, it is hard to draw a bright line between many devices used for the Internet and those that are part of the broader Digital Economy. Nonetheless, we made just such an attempt, precisely because the Internet is at the epicenter of the Digital Revolution.
It would be exceptionally challenging to catalog the wide array of devices that comprise the Internet and Digital Economy. Instead, we chose a technique known as sequential approximation. This well-established technique permits one to gain a reasonable order-of-magnitude estimate of a complex quantity without a detailed inventory. One can, for example, use sequential approximation to estimate the number of people in a stadium by considering an inventory of hot dogs and soft drinks. Some approximations are required, but the outcome will be in the right ballpark.
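The stadium example can be sketched numerically. The figures below are invented purely for illustration; only the method, not the inputs, comes from the testimony.

```python
# Illustrative (hypothetical) "stadium" estimate in the spirit of
# sequential approximation: infer crowd size from concession sales.
# Both input figures below are invented for illustration only.

hot_dogs_sold = 30_000        # assumed concession inventory consumed
hot_dogs_per_person = 1.5     # assumed average per attendee

estimated_attendance = hot_dogs_sold / hot_dogs_per_person
print(f"Estimated attendance: ~{estimated_attendance:,.0f} people")
# prints: Estimated attendance: ~20,000 people
```

The per-person figure is uncertain, but errors in it shift the answer by a modest factor, not by orders of magnitude, which is all a ballpark estimate requires.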
The ballpark estimate: the Internet in all its facets, likely consumes 290 billion kilowatt-hours, or about 8% of the U.S. electric supply system. The broader category, the entire array of all types of computers and computing-related devices (such as storage systems), in homes, businesses and factories which fuel our Digital Economy likely uses 13% of all the nation’s electricity.
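These percentages can be checked with simple division. The total-supply figure below is my assumption, chosen to be consistent with the 290 billion kWh and 8% numbers quoted in the testimony.

```python
# Back-of-envelope check on the testimony's percentages.
# Total U.S. electric supply is assumed at roughly 3,600 TWh/yr
# (an assumption; the testimony gives only the 290 TWh and 8% figures).

us_supply_twh = 3_600
internet_twh = 290                            # estimated Internet consumption

internet_share = internet_twh / us_supply_twh
digital_economy_twh = 0.13 * us_supply_twh    # the broader 13% estimate

print(f"Internet share of supply: {internet_share:.1%}")      # prints 8.1%
print(f"Digital Economy demand: ~{digital_economy_twh:.0f} TWh/yr")
```

At these round numbers the 290 TWh estimate and the 8% share are mutually consistent, and the broader 13% figure implies something under 500 TWh per year for the Digital Economy as a whole.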
These numbers encompass much more than PCs on desktops. One must include for example all the hardware behind-the-wall in the telecommunications and Internet networks which includes, but is far from limited to, such things as routers, the hardware of the dot-coms such as servers, and even the silicon and PC factories. Determining Internet and Digital Age electricity use requires collecting and assessing data across many sectors and boundaries.
It is clear that traditional data sources and methodologies are not adequate to the task of clearly tracking the electric needs of the Information Age. For example, most of the necessary data for the commercial sector is invisible in traditional Energy Information Administration energy accounting. EIA does report on PC electric use in commercial buildings, but all of the other types of information technology hardware (which comprise over three-fourths of Internet energy use) are thrown into a general grab bag category called “other.” EIA notes cryptically that “other” includes telecommunications equipment. The data lost in “other” was irrelevant two decades ago at the dawn of the Digital Age. Today, the “other” category of commercial electric use is over 300 billion kilowatt-hours and is greater than all other categories except lighting – and will soon overtake lighting.
Before addressing a few points of contention regarding my estimate of 290 billion kilowatt-hours for the entire Internet, it is useful to ask first: is such a result in the ballpark? Much of the confusion and controversy surrounding the issue arises from a key question: how much of the electric use of a PC (or any IT equipment) is directly attributable to the Internet? For example, how do you count computers used to develop software for the Internet if those PCs are not directly plugged into the Web? Clearly they are part of the bigger picture, the entire Internet-driven Digital Economy. Since the Internet is an integral subset of the Digital Economy, the easiest sanity check is to evaluate the electric needs of all Information Technology equipment.
A useful starting point for a ballpark check is in the simple fact that the U.S. Information Technology industry sold over $400 billion worth of hardware last year. Over the past three years alone, more than $1 trillion of IT hardware has been installed. This hardware represents the engine of the new Digital Economy. Much of it becomes part of the Internet, most is driven by the Internet. Every single piece of this $1 trillion in hardware gets plugged into a wall somewhere.
There’s another more specific ballpark check available from the year 1993, the Jurassic Era of the Internet. A 1995 Lawrence Berkeley Labs (LBL) study (the most recent on the subject) reported about 50 billion kWh in 1993 for commercial sector use by PCs, computers and directly related equipment such as monitors and printers.
This 50 billion kWh figure for the commercial sector from seven years ago is a good starting point for the Digital Decade. Let’s consider what’s happened since then.
* the number of PCs and related equipment in offices has exploded
* the number of PCs in homes, schools, everywhere, has also exploded
* the Internet has burst on to the scene, with all its back-office Web and telecommunications hardware
* an entirely new class of businesses has been created: the dot-coms
* the usage level for all computing and IT equipment is up everywhere
I am quite confident that these factors collectively have brought the 50 billion kWh starting point in 1993 up to my estimates for the broader Internet (i.e., beyond the commercial sector alone) and the Digital Economy today. And if we’re not quite there yet, just wait a few more months.
There are some other useful ballpark indicators. The Information Technology Industry Council’s tracking shows the total inventory of computers and computer-type equipment has jumped by at least 100 million units since 1993. The inventory is growing at over 40 million a year now. And their data set specifically does not include such Internet equipment as routers, which are functionally computers. Cisco sells about a million routers a year. Nor does the official data track the number of wireless base stations, amplifiers, ports, hubs, information appliances, and so on. All of these have grown rapidly over the past Digital Decade. All of these devices use electricity. Many are already part of the Internet, and those that are not will soon be.
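A crude cross-check on these inventory numbers: if each of the roughly 100 million units added since 1993 drew on the order of the 750 kWh per year used later in this testimony for an Internet-configured office PC (an upper-bound assumption on my part; many of these devices draw far less), the implied growth in demand is substantial.

```python
# Hypothetical upper-bound sketch; the per-unit figure is an assumption.
new_units = 100_000_000      # growth in computer-type inventory since 1993
kwh_per_unit = 750           # office-PC figure; many devices use less

added_twh = new_units * kwh_per_unit / 1e9   # kWh -> TWh
print(f"Implied added demand: ~{added_twh:.0f} TWh/yr")   # prints ~75 TWh/yr
```

Even if the true average per unit were only a third of the assumed figure, the added demand would still be tens of terawatt-hours per year on top of the 1993 baseline.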
And this is only part of the story. One must also add the electric needs of the semiconductor, PC and IT manufacturing industries. Semiconductor manufacturing alone has grown in the past half-decade to become the nation’s largest manufacturing industry. Silicon plants are the steel mills of the 21st Century. Their fuel of choice: kilowatt-hours.
When you think about it, it is inconceivable that the Digital Age and the Internet do not already account for a significant and growing share of the nation’s electric supply.
The Case for 1%
Two organizations have offered rebuttals to the 8% estimate for the Internet’s share of national electric use. I believe it important to address these ostensible rebuttals given the importance of this issue to federal energy and economic policy. There is insufficient time here to address all of the details, but a few observations are instructive.
The researchers at Lawrence Berkeley National Laboratory (LBL) have published a superficial analysis of my study “The Internet Begins with Coal.” Before addressing a couple of representative examples of the inherent failures in the LBL rebuttal, there are two overarching points that should be made. The first relates to the failure of LBL to step up and take an honest crack at estimating an answer to the core question. The second relates to the strange failure of the LBL team to seek information to clarify their misunderstandings.
First, then, is the fact that the LBL team and others seem preoccupied with rebutting details of my analysis, but are quite unwilling to make their own independent estimate to answer the central and critical question: how much electricity does the Internet use? My recommendation to the LBL team, then and now: please undertake a detailed and intellectually honest ground-up analysis of the Internet’s electric needs.
The central conclusion of the LBL paper is that 8% is an overestimate of the Internet’s use of U.S. electricity by “a factor of eight.”
On learning this, I asked the LBL team the obvious question: if you say 8% is an overestimate by a factor of eight,
“May I quote LBL as claiming/believing/estimating that the Internet uses 1% of the nation's electricity supply?”
Their answer, in full:
“You may NOT quote LBNL ‘as claiming/believing/estimating that the Internet uses 1% of the nation's electricity supply’ because your estimate just focuses on direct electricity use, and not the overall effects on the U.S. economy that result from structural changes and substitution effects due to the Internet. You may quote me as believing that your estimate of the direct electricity use associated with the Internet is too high by a factor of eight, but that the NET effect of the Internet on electricity and energy use (which is really what matters) cannot be estimated accurately without assessing the associated indirect effects of the internet on resource use in the economy.”
Given what I’ve outlined earlier, and what practically everyone who reads the news knows, the explosion in Internet equipment is quite unlikely to have led to a reduction in the use of electricity. Data contained in LBL’s own research on PCs and computers yields a figure of 2% way back in 1993 and just for the commercial sector.
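The 2% figure for 1993 follows from simple division. The total-supply number below is my assumption for that year; the 50 TWh figure is from LBL's own study.

```python
# 1993 commercial-sector PC use vs. total U.S. supply.
# The supply figure is an assumption for illustration.
commercial_pc_twh_1993 = 50      # LBL's own 1995 study
us_supply_twh_1993 = 2_900       # assumed total U.S. supply in 1993, TWh

share_1993 = commercial_pc_twh_1993 / us_supply_twh_1993
print(f"1993 commercial PC share: {share_1993:.1%}")   # prints 1.7%, i.e. ~2%
```

That was the commercial sector alone, seven years ago, before the Web's explosive growth; a 1% figure for the entire Internet today would require electric use to have fallen since then.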
Furthermore, it is disingenuous for the LBL team to state that what really matters is the “NET” effect of the Internet. Certainly it’s an interesting issue (more about this in a minute). Fax machines, for example, use electricity but displace jet fuel by replacing overnight mail. I believe I may have been the first to publish detailed analyses of this effect of electrification, in 1991, coining the term “ecowatts” to describe it and documenting and publishing widely to extol this important efficiency trend.
But here’s a simple arithmetical fact: estimating the net savings from faxing requires, a priori, knowing the amount of electricity used by faxes. Accurately calculating the net savings is actually much more difficult than accounting for the electricity used. (Consider, for example, that faxing should have been expected to reduce the use of overnight mail; in fact, overnight mail has grown.) But LBL suggests that one should not study the use of electricity by PCs, or by inference, faxes or any office equipment, “without assessing the associated indirect effects.” LBL’s own EPA-funded 1995 research on electricity used by all manner of office equipment in commercial buildings does not meet this test – nor should it have to.
The idea that we can or should only study and publish the “NET” effect is the equivalent of claiming that you can figure out the change from dinner without knowing how much money you gave the waiter.
The LBL team dodged the issue.
The second generic point I should like to make arises from the following statement in the LBL paper:
“Mills’ report does not contain enough detailed documentation to assess the reasonableness of many assumptions.” (emphasis added)
This is a fair complaint. I note for the record that the LBL team, in full possession of my e-mail, phone number and address, and despite a couple of very general e-mail exchanges with me, made absolutely no attempt to contact me for clarification or expansion on the specifics of any assumption. Considering that clarification was and is necessary for “many assumptions,” their failure to seek it leaves one wondering whether they wanted clarification at all, and whether the rebuttal was motivated by something other than the requirements of technical scholarship.
That the LBL team has, so far, dodged the central question is clear. Thus far their only contribution to this debate has been an attempt to cast doubt on my analysis. The LBL rebuttal contains numerous serious errors. Let me briefly outline two that are representative.
The first technical point: In the LBL paper, the authors take issue with the claim that the desktop for an Internet-configured PC (i.e., including necessary peripherals) is about a 1,000 Watt device. Setting aside the question of whether it is 1,000 Watts (it is), the LBL researchers know full well that the relevant number used in the calculation is NOT the peak watts, but the quantity of kilowatt-hours used in a year. In analogous terms, what really matters is how much gasoline you use in a year, not the horsepower of your engine.
In this regard, my analysis for an Internet-configured PC is based on 750 kWh/yr and is consistent with many other analyses, including their own at LBL.
* In their 1995 study, LBL finds that a PC and printer uses 650 kWh/yr
(“Efficiency Improvements in U.S. Office Equipment: Expected Policy Impacts and Uncertainties,” LBL, December 1995, p. 15.).
* In an unrelated 1995 EPA study, annual PC electric use was estimated to range from 450 to 2,000 kWh/yr.
(“The Green PC,” S. Anzovin, Windcrest, 1994, p. 5).
* A more recent National Academy of Sciences report put annual PC/workstation electric use at 1,000 to 1,800 kWh/yr.
(IEEE Spectrum, January 2000).
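The relationship between the 1,000-Watt nameplate figure and the 750 kWh/yr actually used in the calculation is straightforward duty-cycle arithmetic:

```python
# Converting between peak watts and annual kilowatt-hours.
peak_watts = 1_000        # Internet-configured PC plus peripherals
annual_kwh = 750          # figure used in the analysis

full_power_hours = annual_kwh / (peak_watts / 1_000)   # kWh / kW = hours
hours_per_day = full_power_hours / 365                 # roughly two hours a day
print(f"Equivalent full-power operation: ~{hours_per_day:.1f} h/day")
```

In other words, the 750 kWh/yr figure assumes the equivalent of only about two hours per day at full power, which is why quibbling over the peak wattage misses the point.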
Despite the readily verifiable facts noted above, the LBL paper nonetheless concludes that “With these corrections [to Mills’ assumptions], PCs in offices use about 7.2 TWh, a reduction of 84% from Mills’ estimate.”
Surely the LBL team noticed the bizarre inconsistency in this conclusion. Their own 1995 seminal study showed collective commercial sector PC electric use at 50 TWh more than five years ago. How could their “correction” to my analysis yield 7 TWh today?
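The inconsistency is a matter of arithmetic: applying LBL's own stated 84% reduction to their 7.2 TWh result recovers the estimate they were "correcting," which can then be compared with their own 1993 figure.

```python
# Working backward from the LBL "correction" as quoted.
lbl_corrected_twh = 7.2     # LBL's corrected office-PC figure
reduction = 0.84            # "a reduction of 84% from Mills' estimate"

mills_implied_twh = lbl_corrected_twh / (1 - reduction)
print(f"Implied Mills office-PC estimate: {mills_implied_twh:.0f} TWh")
# prints 45 TWh. Compare: LBL's own 1995 study put 1993
# commercial-sector PC use at ~50 TWh, roughly seven times
# their "corrected" present-day figure of 7.2 TWh.
```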
Let me turn now to a second example of poor analysis in the LBL paper, but of a slightly different ‘flavor’ of error.
One entirely new category of computer use since 1995 is Web servers. Servers are computers, ranging from PCs to workstations to mainframes, that host Web sites. Servers run 24-7 and are frequently arranged by the hundreds in enormous banks of racks, creating a “server farm” for mid-sized to large Internet Service Providers. LBL claims that we need to adjust downwards both the power used by servers and the total number of servers. The power-use issue for servers is essentially the same as I’ve just outlined for PCs.
At the time of writing my report, I used an estimate of 4 million servers for 1999, based on an extrapolation from data on the number of Web sites. The LBL team ‘adjusted’ my estimate arbitrarily, concluding that the “correct” number of servers required a downward correction of “80%,” to 1 million. LBL could have undertaken some modest additional research, as I did subsequently, to learn that there is hard data on the number of servers in operation in 1999 that requires no extrapolation. The actual number of servers last year was 4 million (Netcraft Internet Survey, www.netcraft.com/Survey/Reports/). Clearly my methodology was more accurate than theirs. As a point of interest: there were fewer than 20,000 servers in 1995. Servers are only one piece of a very big digital pie, but quite indicative of the electric trends.
In general, LBL ignored the basic methodology I used, sequential approximation, and instead clearly sought to undermine the integrity of my work without attempting their own honest analysis. They also failed to note the explicit statement in my report that we did not count the electric use of a wide variety of other relevant Internet-related devices, totaling in the millions.
The LBL researchers are right about one claim: that it is difficult to cleanly separate Internet equipment from all information technology equipment. Thus, I asked the LBL team to consider the conclusion offered in my study and the Forbes magazine article: that the microprocessors of the Digital Age, in all categories including the Internet, consume about 13% of the nation’s electricity. We have yet to receive a response.
The Case for Zero
While the LBL team dodged the specific question of how much electricity the Internet or even the Digital Economy uses, a Cool Companies study led by Joseph Romm was braver. The Cool study has two central contentions that merit brief discussion. One contention, incredibly enough, is that the Internet’s electric use is zero. And the other central contention is that the efficiency gains from the Internet offset any putative energy needs. Let me briefly address these two contentions.
The Cool study conclusion about the Internet simply and astoundingly concludes:
“The authors found that the Internet itself is not a major energy user, largely because it draws heavily on existing communications and computing infrastructure.”
This observation reflects such a deep misunderstanding of the telecommunications revolution that it is difficult to know how to respond. Just what exactly do the authors think the past half decade of several trillion dollars in new investment in telecommunications and computing equipment has been for and driven by, if not the Internet?
The exponential growth in equipment (and related Wall Street valuation) constitutes the electric-intensive infrastructure of the Internet. None of it was “existing.” Equally important, it is still rapidly expanding. The entire telecommunications industry has been visibly up-ended and expanded by the Internet. The purchase and installation of hundreds of thousands of miles of fiber optics, and the entire attendant infrastructure has been almost entirely driven by the Internet. Digital traffic now dwarfs voice traffic on the telecommunications networks. And every telecom expert forecasts traffic to grow, and for the growth to be utterly dominated by bits, not voice. The driving force for bits is the Internet.
The Cool study authors would have us believe the Digital Economy is some kind of virtual overlay on existing infrastructure. This is the equivalent of asserting, in 1950, that the several decade build-out of the nation’s Interstate Highway system, to support all the new cars and trucks moving into the economy, would not entail any investment (in dollars, materials or energy) since drivers would be using an existing highway infrastructure. It is 1950 for the digital highways.
But the Internet Improves Efficiency
It is widely recognized that the Internet is improving economic efficiency, sometimes astonishingly so. Indeed, this central fact is the very reason that the market is so rapidly consuming digital bandwidth and all of the equipment to create and serve that bandwidth. But economic and energy efficiency are not the same thing. Indeed, economic efficiency can fuel increased energy demand.
There are two aspects to the efficiency argument. One is macro-economic; is the general, overall effect of the Internet to reduce energy and material use? The second, micro-engineering; does the Internet reduce material and energy use in specific applications?
The Internet serves as a kind of economic lubricant. According to the Department of Commerce (Digital Economy II), information technologies drive at least one-third of GDP growth and, further, two-thirds of ALL investment in capital equipment. These results suggest the answer to an oft-posed question from economists and digital skeptics: “when will we see the putative economic effects of the massive investments in computers?” We’re seeing them now. Indeed, Chairman Greenspan appears to believe that the reaction is even a little overheated.
Regardless, so far the net effect of the Digital Age at the national level has been to increase energy use. In the last digital decade, total air miles flown have risen from 4.3 to 5.8 billion a year. People are flying more than ever. Planes use fuel. People are driving more than ever, and in bigger vehicles. SUV and light trucks account for one-half of all vehicle sales – doubling in the past Digital Decade. Transportation fuel use is up 12%. Similarly, the digitally-accelerated economy has driven up the size of homes and the spending on home improvements. Whether you think this is good or bad is not relevant to the fact; so far, it has all generated greater energy use.
A robust economy tends to use more energy. To be sure, we’re more efficient. But there is no evidence yet in human history, much less the past few decades, of rising economies with sustained declines in energy use. Obviously, improvements in efficiency moderate a growth rate; but the operative word is “growth.”
What about the application-specific efficiency argument? The idea, in a nutshell: the Internet is so powerful that it will improve efficiency faster than the energy consumed by the hardware on the Network.
The energy used per dollar of GDP is the favorite efficiency metric of both environmentalists and business leaders seeking environmental coverage. By this measure, the U.S. is incredibly more efficient than just a decade ago. Total U.S. energy use per dollar of GDP has dropped 16% since 1990. Today’s economy, operating at the energy efficiency of 1990, would need 15 Quads more fuel – in oil terms, that would be a 40% increase in total U.S. oil use. Interesting, but largely meaningless. The nation still uses more energy today than a decade ago. And more importantly for a Digital Economy, we use a lot more electricity. Increased supply to meet electric growth in just the past five years is equal to the total generating capacity of Italy.
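The intensity arithmetic can be reconstructed roughly as follows. The 94-Quad and 38-Quad totals below are my assumptions for current U.S. energy and oil use, not figures from the testimony; only the 16% intensity decline is quoted above.

```python
# Rough reconstruction of the energy-intensity arithmetic.
us_energy_quads = 94        # assumed total U.S. energy use, Quads/yr
us_oil_quads = 38           # assumed U.S. oil use, Quads/yr
intensity_drop = 0.16       # energy per dollar of GDP, down 16% since 1990

extra_quads = us_energy_quads * intensity_drop   # uncompounded approximation
oil_equiv_share = extra_quads / us_oil_quads
print(f"Extra fuel at 1990 intensity: ~{extra_quads:.0f} Quads")   # ~15 Quads
print(f"As a share of oil use: ~{oil_equiv_share:.0%}")            # ~40%
```

Under these assumed totals the simple product reproduces both the 15-Quad and the roughly-40%-of-oil figures cited in the testimony.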
There is a real problem with the energy-per-dollar-of-GDP metric, and it is easily illustrated. Consider, with this metric, who is the most energy-efficient person in America: Bill Gates. Despite enormous energy use in his legendary home, personal jet and so forth, Mr. Gates’ wealth yields an efficiency, measured as energy per dollar, that would shame a Sudanese hunter-gatherer – only because his wealth is so great. The economic path the U.S. is on, with the Digital Era accelerating economic gains out of proportion to relatively modest energy growth, means that the U.S. economy is following the Bill Gates method for energy efficiency: increase wealth faster than you increase energy use.
What then of the specific energy-efficiency gains of the Internet, especially the efficiencies of buying “on line” via e-commerce – what might be termed the Amazon-dot-com effect? The jury's still out on whether more or less energy and material infrastructure is used to warehouse and deliver e-commerce products. Books from Amazon, shipped via 747 and truck, may use less or more energy than driving an SUV to the book and grocery store. It is far from clear what the final, overall effect will be in retail e-commerce, especially since it is still only a tiny fraction of total retail. The 24-7, send-it-overnight e-commerce economy could increase energy use if aircraft begin to substitute for trucks and trains in product delivery. Many analysts believe that competition in e-commerce will drive business increasingly toward overnight delivery. Developers are already building new, dedicated airport hubs that can handle multiple 747s loading and unloading specifically for e-commerce. (“Developers Rush to Meet the Demands of E-Commerce,” 1/23/00, New York Times.)
Even if the net overall impact of the Amazon-effect is improved energy efficiency (it probably is in many cases, if not specifically the Amazon case), reduced transportation oil use still comes with increased use of electricity. This in fact has been the general macro-energy trend of the past decade.
The Cool study also takes up one long-sought goal of environmental activists: the paperless society. The Cool study sees the Internet saving “2.7 million tons of paper every year by 2003, as it reduces the need to print newspapers, catalogs, direct mail, and the like.” (“The Internet may give a boost to energy efficiency,” J. Romm, Yahoo.com, 01.24.00) Perhaps. But this sounds eerily like the paperless office touted as the result of word processors a decade ago. So far, paper use is up.
Then too there is the long-promised energy savings from telecommuting. Certainly telecommuting uses less fuel than driving your car. But auto and air travel is up even with the rise of telecommuting. The reasons are complex, but even the co-inventor of the Internet himself has concluded:
“The Internet has the funny effect of increasing the amount of travel.”
(Vinton Cerf, Senior VP of Internet Architecture, MCI WorldCom, actual co-inventor of the Internet, Engineering Tomorrow, IEEE Press, 2000, p.10.)
Where does Internet Electric Demand Go From Here?
Will there be continued growth in the hardware of the Digital Age? Is the Digital Age fully formed, with IT appliance invention, production and utilization fully saturated? All indicators point to the fact that we’re just at the beginning. The number of applications, the range of microprocessor-based devices, and the magnitude and extent of the communication networks needed to integrate all the devices are still at the so-called knee of the hockey-stick curve.
One hundred million computers today will become hundreds of millions in a few short years; globally, billions. As the Internet moves increasingly into a wireless mode, power use will grow disproportionately, because it is inherently less efficient to broadcast information than to pipe it. The Palm VII and similar handheld devices, with their wireless access to the Internet, are only the beginning of an explosive trend. Add to this the ever expanding appetite for faster Internet access and more broadband services. This is just the beginning.
Will the Information Economy, then, keep driving demand for electricity? Or will market saturation and efficiency gains combine to flatten load growth? These two key questions have been posed repeatedly over the past two decades with regard to electric use in general, and they are even more critical to understand today, at the dawn of the Internet era.
The old conventional wisdom was that PCs and their kin would follow the efficiency trend of all other electric appliances. In one sense they have. Certainly PC monitors are more efficient today, as are many PCs. But unlike lights, chillers and refrigerators, the number of PCs and PC-type devices has grown geometrically in a few short years.
In the past, some prominent forecasters have been confident that demand for electricity would stop growing because of efficiency gains and market saturation. We hear much the same language today, with much the same reasoning.
In 1980, a study from the Union of Concerned Scientists predicted:
“Because saturation levels for most major appliances are achieved, only minor increases in electricity consumption [will] occur.”
(Energy Strategy, Union of Concerned Scientists, 1980)
In 1981, a study from the then Solar Energy Research Institute, since renamed the National Renewable Energy Laboratory, concluded:
“It appears that the demand for electricity is unlikely to increase significantly during the next two decades.”
(A New Prosperity: Building a Sustainable Energy Future, Solar Energy Research Institute, 1981)
What has happened since 1980? Electric demand grew nearly 60%. What went ‘wrong’? The analysts completely misunderstood the technology trend toward ever-greater applications for electricity, uses that more than offset improved efficiency. The same mindset is in place once again with regard to the information age and the Internet.
More recently, researchers at LBL concluded in 1995, just five Internet years ago:
“While total energy use for office equipment has grown rapidly in recent years, this growth is likely to slow in the next decade because the US commercial sector market is becoming saturated (especially for PC CPUs and monitors).”
(“Efficiency Improvements in U.S. Office Equipment,” LBL, December 1995)
To be charitable, forecasters even five years ago could hardly have foreseen the growth in electricity-consuming IT-type equipment. But that has not stopped the refrain from continuing. The indicators for future trends are nothing less than amazing.
There are literally trillions of objects manufactured each year. We are rapidly approaching a time when everything will be manufactured with a silicon device of some kind, and when virtually all of those devices will communicate. Even if the energy needs of this trillion-chip industry, and of trillion-petabyte bandwidth, are trivial in per-chip terms, the aggregate electric needs will no doubt be astonishing.
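The point rests on simple arithmetic: tiny per-device loads, multiplied by trillions of devices, become utility-scale demand. A minimal back-of-envelope sketch, assuming a purely illustrative 10-milliwatt average draw per device (not a measured figure):

```python
# Back-of-envelope sketch of the aggregate-demand argument above.
# Both per-unit figures are illustrative assumptions, not measurements.
chips = 1_000_000_000_000          # one trillion networked silicon devices
watts_per_chip = 0.01              # assume 10 milliwatts average draw each

total_watts = chips * watts_per_chip
plants_needed = total_watts / 1e9  # a large power plant delivers roughly 1 GW

print(f"Aggregate draw: {total_watts / 1e9:.0f} GW "
      f"(~{plants_needed:.0f} large power plants)")
# -> Aggregate draw: 10 GW (~10 large power plants)
```

Even at a per-chip draw far below that of a light bulb, the aggregate under these assumptions is on the order of ten large power plants; the real figure depends entirely on the per-device average, which is exactly the contested quantity.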
As bandwidth demand rises, power use rises, as does the market’s use of the services. Yes, efficiency will rise too. But for some time, as we build out the new infrastructure of the Digital Age, efficiency gains will be overwhelmed by sheer growth. Electricity is the fuel of the Digital Age, and the Internet is at the heart of this revolution.
No energy policy, including and perhaps especially the anti-electricity aspects of the Kyoto Protocol, should be considered without first passing it through a Digital sanity test. The integrity, reliability and low cost of the national electric infrastructure will be more, not less, important in the future. A juxtaposition of key facts illustrates a policy collision course. Kyoto Protocol advocates call explicitly for the reduction, even the elimination, of fossil fuels, especially coal, from the nation’s energy infrastructure. Yet the nation gets 70% of its electricity from fossil fuels, three-fourths of that from coal. The EIA forecasts that more fossil fuels will be needed to support economic growth. And while the EIA forecasts that natural gas will dominate the growth, it also forecasts that coal use will rise to support the economy. Clearly, energy policy and the Digital Economy are tightly linked.