A Taste of Computer Security

© Amit Singh. All Rights Reserved. Written in August 2004

Unix vs. Microsoft Windows

Microsoft Windows is widely regarded as the epitome of insecure platforms — a platform fraught with innumerable security problems. Windows systems top the charts in perhaps every major vulnerability and incident list. In contrast, Unix systems are perceived to be considerably more secure, because __________ (your favorite reason here).

How Did Windows Become "So Insecure"?

I posed this question to a few people: technology experts and non-technical users alike. I found that very few had ever given it any serious thought. They "just knew" that Windows is "the most insecure platform." Those willing to think on the fly ascribed their beliefs to gut feeling, to the perpetual digital disasters crashing down upon Windows (experienced first-hand or reported by the media), to statistics (incidents and vulnerabilities published by respectable organizations), to folklore, to an inherent (sometimes inexplicable) hatred of all things Microsoft, to an inherent affinity for Unix, and so on. Some conjectures took Microsoft's monopoly into account, concluding that Microsoft "doesn't really care" about security because it can afford not to and still be successful.

Many of these people are Windows users.


Nevertheless, Windows NT was designed to be a secure system, with provisions for even more security than initially implemented. It provides an impressive set of security mechanisms (Windows 2000 Security Services, Windows Server 2003 Security Services), with more being worked on (Windows XP Service Pack 2, Next-Generation Secure Computing Base).

Current Windows systems hold some of the highest security ratings among general-purpose operating systems. Note that this is factual information, regardless of how much sectarian laughter it induces.

However, the number of documented security issues and the real-life rampant insecurity of Windows are not speculations either! The problems are real, both for Microsoft, and for Windows users.

There is no single incontrovertible explanation of this paradox, and it is not our goal to conclude "which is better" or "which is more secure." Perhaps the best we could do is to attempt a brief objective (and dispassionate) discussion on the topic.

I chose to put my personal overall standpoint on Windows in a separate, digressive section: Windows: My Personal View.

Defining Windows

Microsoft has dabbled in numerous operating systems and environments, much like Apple (refer to A Technical History of Apple's Operating Systems). However, unlike Apple, whose trial-and-error process was rather excruciating, Microsoft has had considerable success in most cases.

Various incarnations of Microsoft Windows can be broadly classified into two families: the MS-DOS-based line (Windows 1.x through 3.x, and Windows 95/98/Me) and the NT-based line (Windows NT, 2000, XP, and Server 2003).

Perhaps the most palpable failure Microsoft had with an operating system (or related effort) was with the "Bob" user-interface product for Windows 3.x.

We use the term "Windows" only to refer to NT-based systems. We use the term "Unix" in a collective sense to refer to Unix, Unix-derived, and Unix-like systems.

The "Official" Security of Windows

"Windows XP: The Rock of Reliability."

- Microsoft

As we saw in Defining Computer Security, NT-based Windows systems are classified at the C2 (Orange Book) or EAL 4+ (Common Criteria) levels — the highest among existing general-purpose operating systems. In fact, Windows even meets a few requirements of the next more secure division (Orange Book B2), such as the provision of separate roles for separate administrative functions (trusted facilities management) and the Ctrl+Alt+Delete Secure Attention Sequence (SAS). Thus, Windows officially meets the critical security-related requirements of most businesses and government agencies.

The unfortunate part is that such security ratings do not necessarily represent a system's security strength in real life. For example, an EAL level only indicates the level of confidence in how well a system meets its stated security requirements. Thus, if the required security feature set is minimal, or perhaps even empty, and the system satisfies it well, the system can still achieve a high EAL rating. Alternatively, consider the C2 rating of Windows NT. It is not the operating system itself that achieves such a certification: it is a specific configuration, which includes the hardware and a typically excruciatingly long checklist of what to enable, disable, add, remove, and so on. Changing even one aspect could invalidate the certification. The C2 rating of Windows NT 3.5 became a subject of great controversy, as it applied only to a stand-alone machine — one unplugged from the network.

However, it would not be entirely fair to single out Microsoft Windows when it comes to "unrealistic" (from a daily-life standpoint) requirements for effective security. The 100-page Mac OS X "Security Configuration Guide" from the NSA details several such steps. Its suggestions include physically disabling devices that may leak information, such as AirPort, Bluetooth, the microphone and other audio devices, and so on.


On a Windows system, the Winlogon service responds to the SAS, a sequence of keystrokes that begins the logon or logoff process. The default SAS is Ctrl+Alt+Delete. A dynamic-link library (DLL) called the GINA (for Graphical Identification and Authentication), implemented by default in msgina.dll, gathers and marshals information provided by the user and sends it to the Local Security Authority (LSA) for verification.

The GINA is replaceable, say, if you want to use an alternate authentication mechanism, such as a smart card.

The SAS provides a level of protection against Trojan horse login prompts, but not against driver-level attacks.

As we have seen in various sections earlier, the types of inflictions normally associated with Windows (such as worms, viruses, Trojan horses, and so on) existed before Windows did, and are not technically limited to Windows.

Points to Ponder

There are several points to consider — intertwined, and often subtly related — in our attempts to understand Windows' situation.


Consider some historical aspects of Microsoft's pre-NT platforms: MS-DOS and the Windows 3.x/9x line were single-user systems with no real notion of access control or privilege separation, and security was simply not a design goal.

Windows NT, in contrast, was based on a new design, focusing on numerous modern features: portability, reliability, and security. The resounding success of Windows 3.x was instrumental in shifting Microsoft's focus away from many of these goals and toward native backwards compatibility. Similarly, the graphical user interface of Windows 95 was considered a critical feature to be passed on to NT-based systems.

Consider the following two "big transitions": Microsoft's, from the Windows 95 family to the NT family, and Apple's, from Mac OS 9.x to Mac OS X.


Microsoft used Virtual DOS Machines (VDMs) to run MS-DOS applications under Windows NT. A VDM is a virtual MS-DOS system that runs on a virtual x86 computer. For 16-bit Windows applications, Microsoft used an environment called Windows on Windows (WOW) — essentially a multi-threaded VDM. Thus, backwards compatibility with MS-DOS and 16-bit Windows was achieved via virtual machines.

However, for compatibility between 95- and NT-based systems, Microsoft chose the native route, with applications from the old family running on the new family courtesy of backwards binary compatibility.


Apple used the Classic virtual environment for binary compatibility, as well as Carbon, an overhaul of the "classic" Mac OS APIs — pruned, extended, and modified to run in the more modern Mac OS X environment. Carbon provided source-level compatibility, with a Carbon application running natively under both Mac OS 9 and Mac OS X.


Apple's eschewing of legacy/backwards compatibility in Mac OS X was fairly rapid, and had different logistics (a considerably smaller, and unique, user-base).

In comparison, Microsoft had a considerably harder task at hand. Note that the 95 family was vastly inferior to the NT family from a security standpoint. For this and various other reasons, the Win32 API implementations on the two families would be very different.

Thus, an application from a 95-family system (single-user, weak access control, little to no security) ran on an NT-family system (multi-user, strong access control, much greater security) without any changes to the application itself. Ensuring that application-level security is maintained on the newer system under these circumstances is a difficult task.

That said, Microsoft's transition was spread over several more years than Apple's (considering that Windows NT 3.1 was released to manufacturing in mid-1993). Apple's switch, being rather abrupt, has its own set of problems. The security "philosophy" of the Mac platform, and of the Mac community, is still immature. While Mac OS X enjoys a good amount of circumstantial immunity against malware, its security paraphernalia lags significantly behind the cutting-edge feature sets of its competitors. The difference is starker on the server side, where the competition is stiffer.


It is one thing to come up with a modern, or even great, design. However, design alone, and even its subsequent implementation, do not magically change real-life scenarios. What about the mind-set of the users? What about the philosophy associated with the platform? An extensive array of security functions in a system is quite ineffective if the system mostly operates in a context where these functions are not used properly, or perhaps are bypassed entirely. While the presence of such mechanisms, and their correct functioning in an evaluative setting, would win security ratings, real life introduces numerous weakening factors: historical, contextual, and even imaginary.

"Security" is hard to formalize, hard to design (and design for), hard to implement, hard to verify, hard to configure, and hard to use. It is particularly hard to use on a platform such as Windows, which is evolving, security-wise, along with its representative user-base.

The primary environment in which a typical Windows system exists has traditionally been hostile, especially after the advent of the Internet. While Unix systems share the same environment today, their traditional environments were comparatively trusted: research labs and universities. Similarly, Unix users have had backgrounds differing from Windows users.

We stated earlier that UNIX was not even designed with security in mind. Several technologies that originated on Unix, such as NFS and the X Window System, were woefully inadequate in their security.

One could argue that security is often absorbed (or retrofitted) into Unix with less effort than Windows, and more importantly, it is put into use faster, owing to both Unix's developer-base(s) and user-base(s). Enumerating reasons for this particular statement is beyond the scope of this document, but it would involve hackers (in the good sense), computer science research, Open Source, no single product representing Unix, etc.


Windows has more execution environments than a typical Unix system: for macro processing, email attachment handling, and so on. A related point is that Windows tries to do things for the user implicitly. Consequently, there are more situations in which code (such as code embedded in a document) can be executed, often without the user being asked first.

As non-Windows systems, perhaps in their evolution to being "mainstream", perform more operations on the user's behalf (an aspect considered by many to be a facet of user-friendliness), operational and situational differences might not remain as pronounced. The recent spate of "URI-related security flaws" in Mac OS X (KDE shared some of the same problems) was an example of "bad things happening when the system does things for you". A more detailed description of this issue can be found in URL-based Security Holes in Mac OS X.

Another example is the sudo command on Mac OS X. After you successfully run sudo, the default system behavior is to not prompt for your password again for the next five minutes. If, within that window, you inadvertently execute, say, a shell script that invokes sudo to do something malicious, you would have a problem. For example, a virus could spread this way.
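This grace period is configurable. As a sketch (assuming a stock sudo installation, where the timestamp_timeout option below is standard sudoers syntax), the window can be shortened or eliminated:

```shell
# /etc/sudoers fragment -- always edit via `visudo`, never directly.

# Require a password on every sudo invocation (no grace period):
Defaults timestamp_timeout=0

# Or keep a short one-minute window instead:
# Defaults timestamp_timeout=1
```

Alternatively, running `sudo -k` after a privileged command invalidates the cached credentials immediately, closing the window by hand.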

In many situations, security can be "improved" simply by "turning things off." This especially applies to network services. Many Unix systems, particularly recent ones, emphasize security by default: services are turned off out of the box. In many cases, the user does not require most of these services, so any vulnerabilities in the corresponding daemons are inapplicable to such a system.

Windows (again, possibly driven by the "less work for the end-user" tenet) has traditionally shipped with various services enabled by default. Such services increase the attack surface of Windows.
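The notion of attack surface here is concrete: every service listening on the network is a potential entry point. As a rough illustration (the function name and port range are ours, not from any particular tool), a simple TCP connect scan of the local machine enumerates that surface:

```python
import socket

def listening_tcp_ports(host="127.0.0.1", ports=range(1, 1025)):
    """Return the subset of `ports` accepting TCP connections on `host`.

    Each open port corresponds to a running service -- one more
    component whose vulnerabilities apply to this machine.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.1)  # fail fast on unresponsive ports
            if s.connect_ex((host, port)) == 0:  # 0 means connected
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    print(listening_tcp_ports())
```

A default-off system prints an empty (or nearly empty) list; each additional entry is one more daemon whose flaws the machine inherits.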

Windows XP Service Pack 2

The upcoming Windows XP Service Pack 2 will introduce a wide variety of security improvements in the areas of networking, memory (including data-execution prevention), email handling, web browsing, and some miscellaneous areas. Among these are changes to turn certain services off by default, and to turn others (such as the Windows Firewall) on by default.

Security and Ease of Use

I said earlier that a common, though not inevitable, side-effect of enhancing a system's security is that the system becomes harder to program and harder to use. Security-related steps that end-users must perform should be easy to understand and easy to carry out. If not, users may bypass especially frustrating steps altogether.

Windows is supposed to be an easy-to-use platform, while Unix is supposed to be cryptic and hard to use. Historically, the average Unix user has been an academician, a researcher, or somebody who is either proficient in, or willing to spend time and energy figuring out, the details of a computer system. In contrast, the average Windows user wants things to "just work", and is not particularly interested in the inner workings of the system. This is in conformance with the purported philosophies of the two systems. With time, both Unix and Windows have become less extreme: the average Windows user is more aware of (and interested in) the system, while not all Unix users want to dissect their systems anymore.

Now, configuring and using security can be extremely difficult on Windows. This is not to say that security is easy on Unix. However, consider that the barrier to entry for using Unix is such that if somebody uses Unix as his primary system, chances are he can manage it reasonably well (owing to his interest in the system itself, his willingness or ability to read and understand man pages and HOWTOs, and so on). Compare this with Windows. There are too many "knobs." The exposed interfaces are either too complicated, even with documentation, or too weak and limited. Security on Windows is hard to configure correctly (try setting up IPsec). As such, expecting an average Windows user to administer his machine competently is unfair. Thus, we have a detrimental interplay between the platform's philosophy and qualities and its representative user-base.

On a related note, attackers have a better chance of succeeding against an average Windows user. Who do you think is more likely to innocently open a malicious email attachment: the average Windows user, or the average Unix user? The latter might not even have an appropriate application to handle the attachment.

Market Share

Microsoft's market-share is perhaps the most obvious, and the most controversial point raised when discussing Windows vs. Unix, malware-wise. Windows has over 95% of the desktop market-share, though the server market is far less lopsided.

Microsoft's success, as reflected in their incredible market share, amplifies their security problems.

A potentially relevant issue is the phenomenal amount of resentment against Microsoft and Microsoft products that is seen in many circles.

Implementation Issues

While we have emphasized that technical differences alone do not account for Windows' security situation, software quality (or the lack thereof) is a problem in Windows, although not as extreme a problem as it is often portrayed to be ("Windows is poorly written. Period.").

Consider some randomly chosen examples: the buffer overflow in the RPC/DCOM interface that the Blaster worm exploited, the LSASS overflow that Sasser exploited, and the IIS flaws that Code Red and Nimda exploited.

When people talk of the security of an "operating system," they are invariably talking of the entire environment: the kernel, the applications, and everything else in user-space. This is indeed the correct interpretation. However, it is possible (and is ostensibly the case with Windows) that while the core operating system, especially the kernel, is well-designed, the overall system behaves poorly with respect to security. This is why Windows can earn high security ratings with its impressive array of security mechanisms, yet set dubious records in insecurity, with those same mechanisms often rendered ineffective in real life.

Late for (Net)work?

A well-publicized Microsoft oversight is its initial underestimation of the Internet. Once Microsoft realized it had been late in climbing onto the Internet bandwagon, it attached paramount importance to coming up with a web browser, a web server, and related technology. In comparison, Unix had networking much earlier.

Now, it is one thing to incorporate networking into a system. It is another to do so as quickly as possible, make existing applications benefit from networking support, and maintain high standards of software quality and security, all in a highly competitive arena that inexorably demands quick time-to-market (for example, Netscape was a major threat to Microsoft at one point). Perhaps the questionable implementation of some core network-related components in Windows, as indicated by the raw number of reported flaws in them, could be attributed to this apparent rush.

Is Popularity Really An Issue?

Regarding the relation between the success (popularity) of Windows and the amount of malware for it, a few points are frequently raised: that Windows is attacked simply because it is the biggest target, and that Unix, were it equally popular, would attract just as much malware.

Well, the issue is perhaps too subjective to address satisfactorily, but one must realize that even though Windows malware might use bleeding-edge flaws (which may be discovered on a daily basis), the apparent marriage of malware to Windows is not new, and it did not happen overnight. For nearly two decades, PC (DOS and Windows) viruses have thrived: there is a long-standing viral infrastructure, both real and philosophical, in place.

End-users often play a critical role in spreading malware. As mobile users travel, so does the malware on their laptop computers. A naturally vulnerable end-user does not use a server (if he uses one at all) the same way he uses a client computer (for example, by downloading and running random software). Moreover, servers are better monitored and protected. For these and related reasons, servers often have a higher natural immunity against malware. Exceptions arise when a vulnerability is discovered in a widely-deployed server (the Morris worm, various IIS flaws). In such cases, servers can act as very effective attack portals.

The key to arriving at an "answer" to the "Why" question (whether it be about the insecurity of Windows or the security of Unix) is to consider all that we have discussed as a whole, rather than attempting to discount individual points. The situation as it exists today really is a result of complex technical and non-technical interactions over a long period of time.

Abundance and Homogeneity

Now, it is perfectly feasible, technically and otherwise, for malware to breed on Unix, say, if Unix becomes more popular. However, why does it have to happen simply because it can? Perhaps it will; perhaps not. While Windows has the misfortune of decades of malicious momentum, Unix might have the advantage of decades of inactivity in this area: no rampant viral activity (even though technically feasible), no existing momentum, no traditionally tarnished image, elitism (including snobbery against Windows), and, in general, inertia.

There are other factors in favor of Unix.

If you were to enumerate what constitutes "Windows" today, you would get a handful of systems providing essentially the same execution environment. "The" Windows environment is abundant and homogeneous.

Recall that we defined "Unix" as a family of systems. If you were to enumerate what constitutes "Unix" today, you would find maddening diversity: in architectures, interfaces, flavors, distributions, and more. Even apparently similar Unix systems, such as two Linux distributions, might differ enough to require considerable extra "work" if an attacker were to create (the easy part) and deploy (the hard part), say, a virus. Creating malware, as we have seen, is a technical problem, easily solved on any platform. Spreading malware involves operational and situational issues, which are apparently less of an obstacle on Windows than on any other platform.
