In 1831 Michael Faraday discovered the principles that make transformers possible.
Using an induction ring, Faraday found that an electric current flowing through one wire could create, or "induce", a current in a nearby wire. This happens only while the current in the first wire is changing, such as at the moment the power is switched on and the current rises from zero to its peak.
The reason for the induction of a current in a nearby wire is that every flow of current in a wire results in the creation of a magnetic field around the wire. A second wire placed nearby, within that field, is influenced by the magnetism. As the field expands or collapses, it acts on the electrons in the second wire and creates a new flow of current. Winding the wire into coils makes the device more compact, and wrapping the coils onto an iron bar or ring concentrates the magnetic field in a small area.
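The relationship described above is Faraday's law of induction: the voltage induced in a coil is proportional to the number of turns and to how fast the magnetic field through it changes. A minimal numerical sketch (the function name and values here are illustrative, not from the original text):

```python
def induced_emf(n_turns: int, flux_change_wb: float, interval_s: float) -> float:
    """Average voltage (EMF) induced in a coil of n_turns when the
    magnetic flux through it changes by flux_change_wb webers
    over interval_s seconds (Faraday's law: emf = -N * dPhi/dt)."""
    return -n_turns * flux_change_wb / interval_s

# A steady current makes a steady field: no change, no induced voltage.
print(induced_emf(100, 0.0, 0.1))     # 0.0 V

# Switching the power on: flux through the coil rises 0.002 Wb in 10 ms.
print(induced_emf(100, 0.002, 0.01))  # about -20 V while the field grows
```

The sign is negative by convention (the induced current opposes the change in the field); the magnitude is what matters for the discussion above.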
The device is called a transformer because, in alternating current systems, it is used to raise or lower voltages. Alternating current creates a continuously fluctuating magnetic field as it flows in a wire. If the first ("primary") coil has fewer turns than the second ("secondary") coil, a higher voltage is induced in the secondary; if the primary has more turns, the secondary voltage is lower. Electric power systems use this principle to raise the voltage produced by a generator or dynamo to a high level, such as 100,000 volts or more. At this high voltage, electricity can travel hundreds of miles along transmission wires without being significantly diminished. Near a residence, another transformer does just the opposite: it makes the voltage usable by lowering it back down to 120 volts in the United States (230 volts in Europe).
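The two ideas in this paragraph, the turns ratio and the benefit of high-voltage transmission, can be sketched numerically. In an ideal transformer the secondary voltage scales with the ratio of turns, and resistive loss in a transmission line scales with the square of the current, so raising the voltage (and thus lowering the current for the same power) cuts losses dramatically. The function names and figures below are illustrative assumptions, not from the original text:

```python
def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    """Ideal transformer: output voltage = input voltage * turns ratio."""
    return v_primary * n_secondary / n_primary

# Step up: a 12,000 V generator output raised tenfold for transmission.
print(secondary_voltage(12_000, 100, 1_000))   # 120000.0 V

def line_loss_watts(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Power dissipated in a line of the given resistance when delivering
    power_w watts at voltage_v volts (loss = I^2 * R, with I = P / V)."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

# Delivering 1 MW over a line with 5 ohms of resistance:
print(line_loss_watts(1_000_000, 12_000, 5))    # tens of kilowatts lost
print(line_loss_watts(1_000_000, 120_000, 5))   # hundreds of watts lost
```

Raising the voltage by a factor of ten reduces the current tenfold and the resistive loss a hundredfold, which is why transmission systems step up at the generator and step back down near the customer.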