To some, the microchip was a wondrous invention: a high-tech helper that could increase security at nuclear plants and military bases, help authorities identify wandering Alzheimer's patients, and allow consumers to buy their groceries, literally, with the wave of a chipped hand.
To others, the notion of tagging people was Orwellian, a departure from centuries of history and tradition in which people had the right to go and do as they pleased, without being tracked, unless they were harming someone else.
Chipping, these critics said, might start with Alzheimer's patients or Army Rangers, but would eventually be suggested for convicts, then parolees, then sex offenders, then illegal aliens, until one day a majority of Americans, falling into one category or another, would find themselves electronically tagged.
The concept of making all things traceable isn't alien to Americans. Thirty years ago, the first electronic tags were fixed to the ears of cattle, to permit ranchers to track a herd's reproductive and eating habits. In the 1990s, millions of chips were implanted in livestock, fish, dogs, cats, even racehorses.
Microchips are now fixed to car windshields as toll-paying devices and embedded in "contactless" payment cards (Chase's "Blink," MasterCard's "PayPass"), Michelin tires, library books, passports, work uniforms, luggage and, unbeknownst to many consumers, a host of individual items at Wal-Mart and Best Buy, from Hewlett-Packard printers to Sanyo TVs.
But CityWatcher.com employees weren't appliances or pets: They were people made scannable.
"It was scary that a government contractor that specialized in putting surveillance cameras on city streets was the first to incorporate this technology in the workplace," says Liz McIntyre, co-author of "Spychips: How Major Corporations and Government Plan to Track Your Every Move with RFID."