"Roy Schestowitz" <newsgroups@xxxxxxxxxxxxxxx> wrote in message
news:1475230.NGcxXIZqyJ@xxxxxxxxxxxxxxxxxx
__/ [ Oliver Wong ] on Monday 24 July 2006 17:52 \__
"Mark Kent" <mark.kent@xxxxxxxxxxx> wrote in message
news:7qeep3-h9o.ln1@xxxxxxxxxxxxxxxxxxxxxxxxx
It goes on to describe [...] superior methods for taking linux
forward to make it much more immune to such difficulties. Techniques
which have been successfully used are described, including encrypting
the running code in memory to make it much more difficult to exploit a
buffer overflow (the virus wouldn't be able to encrypt itself the right
way).
Interestingly enough, this sounds a lot like a "whitehat" application
of DRM and/or Trusted Computing.
I'm aware that you are on the verge of sarcasm there, but
FWIW encryption is used very widely in computing, especially
for inter-host/client/peer communication. Think, for
example, about Internet banking (HTTPS, SSL) or PGP/GPG. The
computer also needs to be secure therein if multiple users
are involved and each possesses different privileges and
must prevent intrusion (shared memory, disks, maybe even
physical access which requires filespace encryption). There
were serious incidents of user role escalation within the
Debian servers recently. This has been resolved. Piracy and
privacy may sound similar, but they are separate
altogether. DRM/encryption has no place in combating piracy
because it entails a high cost and offers next to nothing in return.
Actually, I was serious. Trusted Computing is basically encryption, so
why did we even come up with a new name for it? Because the new twist that
TC brings is that it encrypts executable programs, rather than data. In the
old way of thinking, a user is always implicitly trusting the programs that
(s)he runs. That is, the programs will usually inherit the same permissions
that the user has. If the user is root, then the program effectively has
root access.
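(To make that implicit trust concrete: any ordinary program you launch
can touch whatever your account can touch. A trivial Java sketch, with
the ~/.ssh/id_rsa path chosen purely as an example:

    import java.io.File;

    // Runs with whatever permissions the invoking user has --
    // no prompt, no policy, just inherited authority.
    public class AmbientAuthority {
        public static void main(String[] args) {
            File secret = new File(System.getProperty("user.home"),
                                   ".ssh/id_rsa");
            System.out.println("This program can read your private key: "
                               + secret.canRead());
        }
    }

Nothing in the classic model stops that program from reading, or even
deleting, the file; only the sandbox model described below does.)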
With TC, you no longer necessarily trust the programs you're running.
The programs are run in a sandbox environment, and the permissions
given to each program are configurable at a fine-grained level. In the
"bad, corporate dictatorship" version of TC, Microsoft/Intel/RIAA/MPAA
are the ones assigning the permissions for each program. In the "good,
empowered user" version of TC, it would be the end user assigning the
permissions -- or more accurately, the OS (operating system) would be
assigning the permissions as a proxy for the user. This works best if
the OS is an open source one, so that you can be sure it is faithfully
and accurately granting and denying the permissions that you, the user,
want.
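To picture the "user assigns the permissions" version concretely,
here's roughly what it looks like with the Java 2 security model,
chosen only as a familiar example; the file name my.policy, the jar
location and the granted paths are made up for illustration:

    // my.policy -- grant the downloaded code read access to one
    // directory and one outbound connection, and nothing else.
    grant codeBase "file:/home/oliver/app.jar" {
        permission java.io.FilePermission "/home/oliver/data/-", "read";
        permission java.net.SocketPermission "example.com:443", "connect";
    };

and then launch it with the sandbox switched on:

    java -Djava.security.manager -Djava.security.policy=my.policy \
         -jar /home/oliver/app.jar

Anything the policy doesn't grant (writing files, opening other
sockets, spawning processes) fails with a SecurityException -- the "OS
assigning permissions as a proxy for the user" idea in miniature.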
There's been some progress in this direction already. For example, in
Windows XP SP2, the first time a program tries to connect to the Internet,
you'll get a pop-up saying so and asking whether you authorize the
connection. Linux probably has a similar feature as well (though I don't
know what it's called). The Java JVM had this feature a long time ago: you
could write a remotely-deployable Java application (via JWS, or Java Web
Start) that requests certain "services" from the host VM, and the user
could accept or deny each of these requests. The requests include such
things as:
(*) Storing temporary data (like a cookie)
(*) Read access to specific directories.
(*) Write access to specific directories.
(*) Connecting to an IP address other than the one from which the
application was initially downloaded.
(*) Accessing local audio devices
etc.
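(Under the hood those requests map onto the javax.jnlp service API. A
rough sketch of the "cookie" case -- storing a small "muffin" keyed by
a URL under the application's own codebase; the key name "settings"
and the stored string are just illustrative:

    import java.io.OutputStream;
    import java.net.URL;
    import javax.jnlp.BasicService;
    import javax.jnlp.FileContents;
    import javax.jnlp.PersistenceService;
    import javax.jnlp.ServiceManager;
    import javax.jnlp.UnavailableServiceException;

    public class MuffinDemo {
        public static void main(String[] args) throws Exception {
            try {
                BasicService basic = (BasicService)
                    ServiceManager.lookup("javax.jnlp.BasicService");
                PersistenceService store = (PersistenceService)
                    ServiceManager.lookup("javax.jnlp.PersistenceService");

                // Store a small piece of data keyed under our own
                // codebase -- the JWS analogue of a browser cookie.
                URL key = new URL(basic.getCodeBase(), "settings");
                store.create(key, 1024);          // ask for up to 1 KB
                FileContents fc = store.get(key);
                OutputStream out = fc.getOutputStream(true);
                out.write("theme=dark".getBytes());
                out.close();
            } catch (UnavailableServiceException e) {
                // Not running under Web Start, or the service was
                // denied by the user.
                System.err.println("Persistence not granted: " + e);
            }
        }
    }

The point being that the untrusted code has to ask, and the host VM --
on the user's behalf -- decides.)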
A lot of people who follow "tech news" automatically associate "Trusted
Computing" with "bad" or "evil" (I often do this too, at least
subconsciously). However, the description given above, encrypting a program
as it's running, immediately reminded me of TC, and it was a non-evil
application of it.
- Oliver