23 April 2004

An Academic Aside 

posted by Rob @ 10:26:00 pm Perm Link
One of my favorite people to read is Lawrence Solum. He is not only quite creative when it comes to thinking and writing, with lots of original things to say, but he also shows a depth in his blog posts beyond many others (myself included). If you want an introduction to legal theory, read Lawrence's Legal Theory Lexicon (I had my "Law and the Information Society" students read some of it this term); if you want to learn who is joining the academy in the U.S., read Lawrence; if you want to learn (and think) about copynorms, read Lawrence.

Well, in the midst of all this (he teaches, too, by the way), Lawrence has decided to include all of us in his newest research project, which is just getting off the ground. In a series of posts that will appear irregularly, organized in the "Legal Scholar's Journal," he'll let us in on his process from the ground floor up. Here's just part of what he has to say in his first Journal entry:

"The journal will follow my progress as I take an article from a blank page (or "screen") today through the early drafts of summer to the submission of the final draft to law reviews in February of 2005. Some of the early posts will discuss the origins of the project. I'll say something about the parts of the article writing process that are rarely discussed in public—the pragmatic reasons for picking one project over another--about gaming the law reviews and taking into account the opinion leaders in the legal academy. But my next article is really being written to satisfy must one person--me. And I will also have a good deal to say about why I feel passionate about the project that I am starting today."
As a new (potential?) scholar myself, I can't wait. I'll be reading, hoping that my own projects, in various states of disarray, can benefit as much from Lawrence's sharing of his experience as they have from Eugene Volokh's "Legal Academic Writing."

I can't wait . . .

18 April 2004

The Amplification Effect 

posted by Rob @ 12:42:00 pm Perm Link
Reading my E-mail this morning, I was struck by how the struggle against viruses -- specifically E-mail viruses -- provides a concrete example of the "Trusted Technology Fallacy" (about which I've written here before) and shows how this fallacy not only affects our thinking about technology, but also has real effects on us.

Here's the situation: in responding to the onslaught of E-mail viruses, both individual E-mail systems and virus-prevention vendors have implemented automatic stop-and-respond systems that react to messages with certain types of attachments (or, in specific cases, certain named attachments). Once implemented (turned on), these systems are largely automated. They stop the messages, and they send a message to the alleged sender notifying that person that the message has not been delivered. This seems reasonable. If the message was wrongly stopped -- if the stop was a "false positive" for virus detection -- the sender likely thinks it has been delivered and ought to be told otherwise. If it was a real virus, then the sender should be alerted that his or her system has been compromised.

The problem comes when new types of E-mail viruses change their behavior, as has happened with the Klez virus and with others such as "MyDoom." The virus infects a system, but instead of sending messages that clearly come from the infected machine, it sends messages with the "from" field filled in with another address found on that system (more info on E-mail spoofing here). That means that if anyone who has ever exchanged E-mail with me becomes infected, I might receive an infected file from them that appears to come "from" some other address, or someone else might receive an infected file that appears to have been sent by me. In the latter case, the automated system kicks in and sends me a notice that "my" message was stopped from reaching its destination.
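To make the mechanics concrete, here is a minimal sketch in Python -- my own toy illustration, not the code of any real filter or virus; the addresses, file extensions, and function names are invented -- of a gateway that blocks suspicious attachments and sends its warning to whatever address the "from" field claims. When that field is spoofed, the warning lands on an innocent third party:

BLOCKED_EXTENSIONS = (".pif", ".scr", ".exe")   # hypothetical block list

def handle_incoming(message, send_notice):
    """Block messages with suspicious attachments and notify the apparent sender."""
    if message.get("attachment", "").endswith(BLOCKED_EXTENSIONS):
        # The notice goes to whatever the "from" field claims. For a spoofing
        # virus, that is an address harvested from the infected machine,
        # not the machine that actually sent the message.
        send_notice(to=message["from"],
                    text="Your message was blocked: possible virus.")
        return "blocked"
    return "delivered"

# An infected machine (not mine) sends a virus with my address in the "from" field.
notices = []
handle_incoming(
    {"from": "rob@example.org", "attachment": "photos.scr"},
    send_notice=lambda to, text: notices.append((to, text)),
)
print(notices)  # the warning goes to rob@example.org, who never sent anything

The gateway does exactly what it was built to do; the problem is the trust it places in the "from" field.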

Why does this make a difference? Well, let's formalize it. The equations look like this:

"Standard" E-mail Virus
Infects a system and sends n infected messages from that system. Each message is either accepted (and leads to another infected machine) or deleted without further effect. The message traffic thus multiplies with the number of infected machines, m:

   n x m = t1

where t1 is the total number of messages caused by the virus (though n varies with characteristics of the infected machine: time connected to the Internet, type of connection, number of E-mail addresses in the address book, and so on). In any event, t1 is directly related to the number of infected systems.
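
To put purely illustrative numbers on it (invented for the arithmetic, not drawn from any real outbreak): if each infected machine sends n = 50 messages and m = 1,000 machines are infected, then t1 comes to

   50 x 1,000 = 50,000 messages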

"From field" spoofing virus
Infects a system and sends infected messages to found addresses; each message is accepted and leads to another infected machine, as above, or is handled in one of two ways: deleted either automatically or manually by the user, or, is deleted and a response sent to the spoofed E-mail address. Thus messages are sent by:
   1) infected machines m
   and
   2) non-infected machines that will never become infected r
Let's add the non-infected by responding machines r to the equation:

   n x m x r = t2

In any situation in which r > 1, it follows that t2 > t1.
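
To see the difference with the same made-up numbers -- reading r here as the average number of extra messages (bounce notices and the like) that each virus message provokes, which is one way to make the multiplication concrete -- a quick sketch:

# Purely illustrative numbers plugged into the two formulas above.
n = 50      # messages sent per infected machine (varies widely in practice)
m = 1000    # infected machines
r = 3       # extra messages (bounce notices, warnings) triggered per virus message

t1 = n * m       # "standard" virus: 50 x 1,000 = 50,000 messages
t2 = n * m * r   # "from field" spoofing virus: 50 x 1,000 x 3 = 150,000 messages

print(t1, t2)    # whenever r > 1, t2 exceeds t1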

This is the amplification effect. Virus writers have learned how to use automated responses to amplify the effects of their infiltrations into compromised systems, sometimes to the extent that the collateral effects added by amplification are greater than the effects of the virus itself. Without ever having been infected -- having taken all the appropriate steps and avoided the possible missteps -- I now receive more messages telling me that a message I never sent failed to reach a person I never sent it to than I receive actual virus messages. In every case the message is useless to me, but it amplifies the effect of the virus by turning the very systems designed to combat it into a means of spreading its effects further. This is the nature of the "new" viruses.

And so long as there are technological systems, my guess is that the amplification effect will be in play. The notion that static, or even flexible, automated systems can somehow cope with this -- that we can trust them to handle this kind of human ingenuity -- is at the core of the trusted technology fallacy. Turnkey computer security? I don't think so . . .