
The History of Making the Grid Smart

Almost as soon as there were electrical distribution grids, there was a demand for devices to measure consumption and to help suppliers distribute, price, and monitor their service. The path from the first tentative consumption-measuring devices to today’s smart grid, with two-way metering that can turn appliances on and off in response to demand and off-peak electricity prices, has been a long one. Many obstacles had to be overcome to obtain accurate information about how the grid behaved, and some of the obstacles to the earliest attempts at monitoring electrical distribution a century or more ago are strikingly similar to those facing smart grid technologies today.

In Edison’s 1882 Pearl Street system in lower Manhattan, the pull of an electromagnet against a carefully adjusted spring closed or opened contacts that lit either a red lamp (if the line voltage rose) or a blue lamp (if it dropped). The lamp signaled an attendant to turn a hand wheel controlling the strength of the electromagnetic field in the generators, matching their output to the load.

To measure the electricity consumed, Edison devised a meter consisting of two electrodes in an electrolyte. As current passed through the cell, metal was transferred from one electrode to the other by electrolysis, and the customer’s consumption was calculated by weighing the two electrodes.
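
The principle behind weighing the electrodes is Faraday’s law of electrolysis, given here as a simplified relation (assuming an ideal electrolytic cell, an idealization not spelled out in the article): the mass of metal transferred is proportional to the total charge passed, and so, at a nominally constant supply voltage, roughly proportional to the energy consumed:

$$ m = \frac{M}{zF}\,Q = \frac{M}{zF}\int I\,dt $$

Here $m$ is the mass transferred, $M$ the molar mass of the electrode metal, $z$ its valence, $F \approx 96{,}485\ \mathrm{C/mol}$ the Faraday constant, and $Q$ the total charge drawn by the customer.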

The first known electric meter was patented in 1872 by Samuel Gardiner; in it, an electromagnet started and stopped a clock, recording how long current flowed but not how much. In 1883, Hermann Aron patented a recording meter that showed the energy used on a series of clock dials. Edward Weston’s indicating meter of 1886, which set high standards for precision, was intended to measure current rather than consumption.

In 1889, Elihu Thomson introduced a recording wattmeter. It quickly became a very popular metering technology and allowed utilities to measure the amount of electricity supplied to a customer. The road to accuracy was a long one, however. Braking magnets in the meters were sometimes weakened by the power surges that accompanied lightning storms, causing the meters to run fast, a complaint that parallels modern consumer complaints about fast-running smart meters. Older meters tended to run slow under overload conditions. In the late 1940s, General Electric ran an advertising campaign, “Time to Retire Old Watthour Meters,” demonstrating to utilities the revenue they were losing to slow-running meters.

In the years before utilities could disconnect customer devices at peak times and reconnect them during periods of low demand, problems of load management sometimes took care of themselves in an immediate and non-negotiable manner: lines simply burned out when demand exceeded their capacity.

As electricity demand on grids increased through the late twentieth century, utilities searched for ways of managing peak loads. The capital cost of building generating capacity to handle these peaks, capacity that would then sit idle through long off-peak periods, led utilities to study their demand patterns, price them accordingly, and encourage customers to shift consumption from peak to off-peak periods. Matching consumption to generation required meters that could record when electricity was consumed as well as how much. Automatic meter reading devices introduced in the 1970s were the beginning of meters that reported information back to the utility, a basic requirement of any smart grid system. The technology for monitoring sensors and relaying the data grew out of the caller-ID technology patented by Theodore Paraskevakos.
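
As a minimal sketch of the time-of-use pricing idea described above (the rates, the peak window, and the function name are invented for illustration and are not drawn from the article or any actual tariff), a bill is simply the consumption in each interval multiplied by the rate in force at that time:

```python
# Minimal time-of-use billing sketch. The rates and the peak window
# are hypothetical illustration values, not historical tariffs.

PEAK_RATE = 0.20            # $/kWh during peak hours (assumed)
OFF_PEAK_RATE = 0.08        # $/kWh off-peak (assumed)
PEAK_HOURS = range(16, 21)  # 4 pm to 9 pm treated as peak (assumed)

def time_of_use_bill(hourly_kwh):
    """Sum the cost of 24 hourly readings, applying the rate for each hour.

    hourly_kwh: list of 24 consumption readings in kWh, index = hour of day.
    """
    total = 0.0
    for hour, kwh in enumerate(hourly_kwh):
        rate = PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE
        total += kwh * rate
    return total

# Example: 1 kWh every hour, rising to 3 kWh during the evening peak.
readings = [1.0] * 24
for h in PEAK_HOURS:
    readings[h] = 3.0
print(f"Bill: ${time_of_use_bill(readings):.2f}")
```

A flat-rate meter would charge the same total regardless of when the energy was used; only a meter that records the time of consumption makes this kind of differential pricing, and the incentive to shift load off-peak, possible.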

All these technologies, and more than a century of development, were necessary foundations for building the safer, more efficient, and more reliable electricity distribution network that will eventually become the smart grid.