An attack on a computer network shuts down a power plant, plunging New York City into darkness. A control tower at an airport is suddenly overwhelmed with false signals from nonexistent airplanes.
Could a terrorist group or foreign country mount a cyberattack that causes this sort of crisis? Secretary of Defense Leon Panetta thinks so.
In October, Panetta warned a group of business executives about a "cyber Pearl Harbor" — a massive cyberattack upon the United States that would compromise critical pieces of infrastructure.
The term wasn't Panetta's invention. Experts have been using "cyber Pearl Harbor" ever since it became clear in the mid-1990s that such attacks were possible.
More recent events have erased all doubt. The Stuxnet worm of 2010 demonstrated that a piece of malware can do real harm to an industrial system, even one not connected to the Internet.
Stuxnet, which was likely created by the U.S. or Israel, altered the rate of spin of uranium centrifuges in an Iranian nuclear facility, spinning them faster until they broke. At the same time, it fed false information to the facility operators, reporting that everything was functioning normally.
Critical infrastructure systems, such as power plants or transportation systems, are vulnerable because today's industrial processes are largely controlled by computers.
Small workhorse computers known as programmable logic controllers (PLCs) operate the machinery. PLCs, in turn, are managed through industrial control systems (ICSs) that human operators use.
Large-scale ICSs, often spanning geographically separated facilities, are called supervisory control and data acquisition (SCADA) systems.
Because both PLCs and SCADA systems run widely distributed commercial software, they are vulnerable to disruption and to hacking.
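The Stuxnet scenario described above — malware that damages a physical process while feeding "all normal" readings to the operators — can be illustrated with a toy simulation. This is a hypothetical Python sketch, not real PLC or SCADA code; the class names, speed figures and safety limit are all invented for the example.

```python
# Toy simulation of why falsified telemetry is dangerous in an ICS.
# Everything here (names, numbers) is invented for illustration.

class Centrifuge:
    """Physical process: spins at some rate; breaks above a safe limit."""
    SAFE_LIMIT = 1200  # rpm, an invented figure

    def __init__(self):
        self.rpm = 1000
        self.broken = False

    def step(self, commanded_rpm):
        self.rpm = commanded_rpm
        if self.rpm > self.SAFE_LIMIT:
            self.broken = True


def honest_readout(machine):
    """Normal reporting path: the screen shows the true speed."""
    return machine.rpm


def compromised_readout(machine):
    # Malware in the reporting path: always show a "normal" value
    # to the operator, regardless of the machine's true state.
    return 1000


def run(machine, readout, commands):
    """Drive the machine and collect what the operator's screen shows."""
    displayed = []
    for rpm in commands:
        machine.step(rpm)
        displayed.append(readout(machine))
    return displayed


# The malware ramps the speed past the safe limit while the display lies.
attack_commands = [1000, 1100, 1300, 1500]

m = Centrifuge()
screen = run(m, compromised_readout, attack_commands)
print(screen)    # the operator sees only normal values
print(m.broken)  # but the physical process has been damaged
```

The point of the sketch is that the operator's display and the physical process are two separate data paths; once software sits between them, compromising the reporting path hides damage done through the control path.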
"It's really not a challenge to find new vulnerabilities in SCADA systems, because they're designed without security in mind," said Sergey Gordeychik, chief technology officer at Positive Technologies, a London-based company that assesses IT vulnerabilities for industry.
In one sense, the comparison to Pearl Harbor might be misleading, argued Jon Stanford, director of power and utilities security at the worldwide accounting firm PricewaterhouseCoopers.
"The motives [of cyberattacks] aren't clear, and the perpetrators aren't visible," Stanford said.
The attack on Pearl Harbor had a specific objective: Cripple the U.S. Navy's Pacific fleet. It was also clear that the Japanese navy carried out the attack. Such clarity is rare in cyberspace.
While the possibility of a massive cyberattack upon American infrastructure is a real concern, it's important to put it in perspective, experts say.
Robert Graham, chief executive officer of Atlanta consulting firm Errata Security, noted that hacking is a matter of both luck and skill.
"China could easily send tourists over to make fertilizer bombs to blow up substations and long-distance power lines, and create a much greater effect" than a hacker could, Graham said.
The question, Graham said, is always how likely a given line of attack is.
So what are some of the juiciest targets for cyberattackers? Here are 10.
Industrial control systems are vital to running power plants, especially nuclear power plants. Altering the information put out by monitoring systems could lead to a series of cascading failures that could cause a nuclear plant to shut down, or, in the worst cases, to suffer a fuel-core meltdown.
The great Northeastern blackout of Aug. 14, 2003, which cut off power to Detroit, Toronto and New York, was caused by overheated power lines sagging into trees near Cleveland, according to a joint U.S.-Canadian official report.
Some security experts contend that a piece of Windows malware called the Blaster worm, which had spread rapidly through the Internet in the previous three days, may have played a role by hampering communications among dozens of different power companies.
It's still not clear whether Blaster did in fact worsen the blackout. What is clear is that malware can indeed disrupt the industrial control systems at power plants.
At the Black Hat and DEFCON security conferences in Las Vegas this past July, security researchers Andrei Costin and Brad Haines separately showed how they could "spoof" an aircraft's signal to an air-traffic controller using off-the-shelf equipment.
The hacks were done on the ADS-B system, which is meant to replace radar and uses GPS and unencrypted radio data links to locate planes in flight.
But as Costin and Haines demonstrated, it's not hard to create radio transmissions from "ghost" flights that don't exist.
As for GPS, its weak signals are easy to jam. In 2009, Newark Liberty International Airport traced air-traffic-control disruptions to local truck drivers who used illegal GPS jammers so their boss couldn't tell where they were.
ADS-B is a core part of the Next Generation Air Transportation System, which is currently being deployed by the Federal Aviation Administration and scheduled to be fully in place by 2020.
A stock-market plunge by itself won't cripple an economy, but it can certainly cause losses in the millions or even billions of dollars, or do in a big financial firm.
In 1995, the 233-year-old British investment bank Barings was destroyed by a $1.3 billion loss in the Asian derivatives markets, run up by a single Singapore-based trader, Nick Leeson, who falsified records to cover his tracks.
In 2010, a cascade of rapid-fire trades caused the "Flash Crash" that brought the Dow Jones Industrial Average down by 1,000 points in minutes.
In August of this year, a "technology breakdown" at Knight Capital Group in Jersey City, N.J., caused rapid computerized-trading glitches, causing price disruptions in dozens of stocks. Some soared while others crashed.
None of these incidents was the result of a hack, but it's not hard to imagine a hacker causing similar mayhem by breaching a trading system's security. All it would take is a surreptitious alteration of a trading algorithm.
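How little such an alteration might be can be shown with a toy example. This is a hypothetical Python sketch — the strategy, function name and thresholds are invented for illustration, and no real trading system works this simply:

```python
# Toy illustration of how a tiny, surreptitious change to a trading
# algorithm can invert its behavior. The rule and numbers are invented.

def decide(price, moving_avg, tampered=False):
    """Naive momentum rule: buy when price is above its moving average."""
    above = price > moving_avg
    if tampered:
        # A one-line edit buried in a codebase: the comparison is flipped,
        # so the algorithm now buys into slumps and sells into rallies.
        above = not above
    return "BUY" if above else "SELL"

print(decide(105, 100))                  # the original rule
print(decide(105, 100, tampered=True))   # same inputs, opposite order
```

A change this small would produce no crash and no error message — only a stream of systematically wrong orders, which at automated trading speeds could move prices before any human noticed.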
Many road-traffic control systems are now computer-controlled, and some are even being networked.
In the classic 1969 crime caper "The Italian Job," British gold thieves fed malicious software into the city of Turin's road-traffic-control computer, causing traffic jams that allowed the robbers, driving tiny Mini Coopers, to escape police.
That was fiction. But if a piece of malware were to interfere with a road-traffic-control system's proper functioning — for example, the traffic lights of greater Los Angeles at rush hour — there could indeed be gridlock.
For at least three months in early 2000, malicious commands were sent via radio transmission to sewage-treatment pump stations in southern Queensland, Australia. Raw sewage was pumped into parks, rivers, canals, a harbor and even the grounds of a local Hyatt Regency hotel.
The culprit was Vitek Boden, a man who had been denied a job by the local public water utility. He had likely helped install the radio-controlled PLCs on the pump stations when he worked for a private contractor.
In November 2011, a hacker who called himself "pr0f" posted images online of the SCADA system that managed water infrastructure in South Houston, Texas.
Pr0f had been angry over the security industry's dismissive reaction to an alleged cyberattack on an Illinois water facility, which turned out to be a false alarm.
Pr0f didn't do anything to the South Houston plant; he only wanted to point out that he could have, and that it was stupid to connect such systems to the Internet.
In the United States, water-delivery and wastewater-treatment systems are fairly decentralized, but that doesn't make them immune.
The attack in Australia came from a disgruntled employee, but it isn't hard to imagine a terrorist — or a kid out for laughs — trying the same thing in this country.
Making food in the 21st century is a big, industrial process. That means temperatures and ingredients have to be precisely measured and steps carried out in the correct sequence.
An altered control system could put food safety at risk by not cooking food thoroughly enough, or by adding the wrong amount — or none at all — of a necessary preservative.
The same kinds of concerns that affect food apply here, but even more so. A switch that doesn't respond the right way in a pharmaceutical plant could render huge amounts of any drug useless, or even dangerous.
In 1982, a still-unknown perpetrator killed seven people in the Chicago area by tainting bottles of Tylenol pain medication with cyanide before the bottles were sold. (The incident spurred the introduction of tamper-resistant seals on medicine bottles.)
Only eight bottles of tainted Tylenol were ever discovered. A malicious hacker could taint many more.
Oil and gas refineries — and the pipelines, ships and trucks that deliver their raw material and products — are huge systems. Refineries require hundreds of employees, and the machines they operate, to handle extremely flammable materials without making mistakes.
But the workers need accurate information to do so. Malicious software could provide false readings, open and close valves at the wrong times or damage industrial firmware —all with potentially lethal consequences.
"At the end of every control system is a physical thing," said Dale Peterson, founder and chief executive officer of Digital Bond, an IT security consulting company in Sunrise, Fla.
Peterson noted that fatal accidents demonstrate what can happen when something goes wrong, and that hackers who can make things go wrong are all the more dangerous.
"The BP Texas City refinery explosion," which killed 15 and injured 270 in 2005, said Peterson, "was due in part to a control system not displaying properly."
In this case, a broken gauge failed to alert workers that flammable gas was escaping. A hacker could easily replicate a broken gauge on an electronic system.
Aside from foreign countries, terrorist groups or disgruntled employees, said Peterson, there's also the danger of a prank gone horribly wrong — some lone hacker doing it for the "lulz."
"It's sort of surprising that hasn't happened more often," Peterson said.
It's unlikely a hacker could take control of the trains in a subway system such as that of New York City. But hackers could certainly cause chaos if they attacked the ticketing, signaling or rail-traffic-control systems.
At best, there would be delays and disruptions. At worst, there could be a fatal crash.
In large office or industrial buildings, computers monitor fire-alarm systems and operate the heating and lights. At Brown University in 2000, students got permission to rewire the lighting systems in a tall campus building in order to play a 10-story game of "Tetris."
Hacking into a computerized lighting system would be even easier than physically rewiring one. A hack could be done for the "lulz" — Anonymous might want to make the lights spell out a message, for example — or it could cause more serious disruption: imagine a building manager being falsely told that all the fire alarms were going off.