"... it is easy to be blinded to the essential uselessness of them by the sense of achievement you get from getting them to work at all. In other words ... their fundamental design flaws are completely hidden by their superficial design flaws."
-- The Hitchhiker's Guide to the Galaxy, on the products of the Sirius Cybernetics Corporation.
Let's be honest: there's no such thing as bug-free software. Initial
versions of programs may occasionally crash, fail to de-allocate memory, or
encounter untested conditions. Developers may overlook security holes, users
may do things nobody thought of, and not all systems are identical.
All software developers are human, and they make mistakes now and then.
It happens. But of all software vendors, Microsoft has the worst record
by far when it comes to the quality of their products in general.
Microsoft boasts a rather extensive product range, but in fact there's
less here than meets the eye. Microsoft has forever been selling essentially
the same software over and over again, in a variety of colorful new
wrappers.
Microsoft products can be divided into three categories: applications,
operating systems, and additional server products. The applications include
the Microsoft Office suite, but also Internet Explorer, Media Player, Visio,
Frontpage, etc. The operating systems involve desktop and server versions of
Windows. On the desktop we find Windows 9x/ME, NT Workstation, Windows 2000
and Windows XP, and on the server end we have Windows NT Server and Windows
2000 varieties such as Windows 2000 Datacenter. The additional server products,
e.g. Internet Information Server (IIS) and SQL Server, run on top of one of
the Windows server products. They add services (e.g. webserver or database
server functions) to the basic file, print and authentication services that
the Windows server platform provides.
Windows on the desktop comes in two flavors: the Windows 9x/ME product line,
and the Windows NT/2000/XP product line. The different versions within one
product line are made to look a bit different, but the difference is in the
details only; they are essentially the same. Windows '95, '98 and ME are
descended from DOS and Windows 3.x, and contain significant portions of old
16-bit legacy code. These Windows versions are essentially DOS-based, with
32-bit extensions. Process and resource management, memory protection and
security were added as an afterthought and are rudimentary at best. This
Windows product line is totally unsuited for applications where security
and reliability are an issue. It is completely insecure, e.g. it may ask for
a password but it won't mind if you don't supply one. There is no way to
prevent the user or the applications from accessing and possibly corrupting
the entire system (including the file system), and each user can alter the
system's configuration, either by mistake or deliberately. The Windows 9x/ME
line primarily targets consumers (although Windows '95 marketing was aimed
at corporate users as well).
The other Windows product line includes Windows NT, 2000 and XP, and the
server products. This Windows family is better than the 9x/ME line; at least
these versions use new (i.e. post-DOS) 32-bit code. Memory protection,
resource management and security are a bit more serious than in Windows
9x/ME, and they even have some support for access restrictions and a secure
filesystem. That doesn't mean that this Windows family is as reliable and
secure as Redmond's marketeers claim, but compared to Windows 9x/ME its
additional features at least have the advantage of being there at
all. But even this Windows line contains a certain amount of 16-bit legacy
code, and the entire 16-bit subsystem is a direct legacy from Microsoft's
OS/2 days with IBM. In short, all 16-bit applications share one 16-bit
subsystem (just as with OS/2). There's no internal memory protection, so
one 16-bit application may crash all the others and the entire 16-bit
subsystem as well. This may create persistent locks from the crashed 16-bit
code on 32-bit resources, and eventually bring Windows to a halt. Fortunately
this isn't much of a problem anymore now that 16-bit applications have all
but died out.
Of course Windows has seen a lot of development over the years. But in fact
very little has really improved. The new features in new versions of Windows
all show the same half-baked, patchy approach. For each fixed problem, at
least one new problem is introduced (and often more than one).
Windows XP for example comes loaded with more applications and features than
ever before. While this may seem convenient at first sight, the included
features aren't as good as those provided by external software. For example,
XP insists on supporting DSL ("wideband Internet" networking,
scanners and other peripherals with the built-in Microsoft code instead of
requiring third-party code. So you end up with things like DSL networking that
uses incorrect settings (and no convenient way to change that), scanner
support that won't let you use your scanner's photocopy feature, or a digital
camera interface that will let you download images from the camera but you
can't use its webcam function. Also, applications (e.g. Internet Explorer and
Outlook) are integrated in the system more tightly than ever before, and more
formerly separate products have been bundled with the operating system.
All versions of Windows share a number of structural design flaws.
Application installation procedures, user errors and runaway applications may
easily corrupt the operating system beyond repair, networking support is poorly
implemented, inefficient code leads to sub-standard performance, and both
scalability and manageability leave a lot to be desired. (See also
appendix A.) In fact, NT and its successors (or any
version of Windows) do not really offer the functionality,
robustness or performance that the UNIX community has been used to for
decades. They may work well, or they may not. On one system Windows will run
for weeks on end, on another it may crash quite frequently. I've attended
training courses at a Microsoft Authorized Education Center, and I was told:
"We are now going to install Windows on the servers. The installation
will probably fail on one or two systems [They had ten identical systems
in the classroom] but that always happens - we don't know why and neither
does Microsoft." I repeat, this from a Microsoft Authorized Partner.
Be that as it may... Even without any installation problems or serious crashes
(the kind that require restore operations or reinstallations) Windows doesn't
do the job very well. Many users think it does, but they generally haven't
experienced any alternatives. In fact Windows' unreliability has become
commonplace and even proverbial; the dreaded blue screen has featured in
cartoons, screen savers and t-shirts, it has appeared at airports and on
buildings, and there has even been a Star Trek episode in which a
malfunctioning space ship had to be switched off and back on in order to get
it going.
And even if Windows stays up it leaves a lot to be desired. On a fairly
serious desktop PC (e.g. a 450MHz PII CPU with 256MB RAM, something we could
only dream of a few years ago) four or five simultaneous tasks are enough to
tax Windows' multitasking capabilities to their limits, even with plenty of
core memory available. Task switches will start to take forever, applications
will stop responding simply because they're waiting for system resources to
be released by other applications (which may have crashed without releasing
those resources), or kernel and library routines lock into some unknown wait
condition. Soon the whole system locks up entirely or becomes all but
unusable. In short, Windows' process management is as bad a joke as its
memory protection and resource management are, and an operating system that
may crash entirely when an application error occurs should not be sold as a
serious multi-tasking environment. Granted, it does run several processes at
once - but not very well. Recent versions of Windows (i.e. 2000 and
XP) are a little better in this respect than their predecessors, but not much.
Although they have been kludged up to reduce the impact of some of the most
serious problems, the basic flaws in the OS architecture remain; a crashing
application (e.g. a video player) can still lock up the system or throw it
into a sudden and spontaneous warm reboot.
Windows is quite fragile, and the operating system can get corrupted quite
easily. This happens most often during the installation of updates, service
packs, drivers or application software, and the problem exists in all versions
of Windows so far. The heart of the problem lies in the fact that Windows
can't (or rather, is designed not to) separate application and operating
system code and settings. Code gets mixed up when applications install
portions of themselves between files that belong to the operating system
(occasionally replacing them in the process). Settings are written to a
central registry that also stores vital OS settings. The registry database
is basically insecure, and settings that are vital to the OS or to other
applications are easily corrupted.
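To make the point concrete, here is a minimal sketch (Python, Windows only,
with a purely hypothetical vendor and application name) of what a typical
installer does: it drops its own settings into the same central registry hive,
HKEY_LOCAL_MACHINE, that also holds boot-critical OS configuration. Nothing
separates the two.

    # Hypothetical application settings written into the central registry
    # (Windows only; writing to HKLM requires administrative rights).
    import winreg

    APP_KEY = r"SOFTWARE\ExampleVendor\ExampleApp"   # hypothetical key

    # The application's settings end up in the same hive that contains e.g.
    # SYSTEM\CurrentControlSet, where boot-critical OS settings live.
    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, APP_KEY) as key:
        winreg.SetValueEx(key, "InstallDir", 0, winreg.REG_SZ,
                          r"C:\Program Files\ExampleApp")

One corrupted hive, and both the application settings and the OS settings next
door are gone.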
Even more problems are caused by the limitations of Windows' DLL subsystem.
A good multi-tasking and/or multi-user OS utilizes a principle called
code sharing. Code sharing means that if an application is running
n times at once, the code segment that contains the program code
(which is called the static segment) is loaded into memory only once,
to be used by n different processes which are therefore instances of
the same application. Apparently Microsoft had heard about something called
code sharing, but obviously didn't really understand the concept and the
benefits, or they didn't bother with the whole idea. Whatever the reason,
they went and used DLLs instead. DLL files contain Dynamic Link Libraries
and are intended to contain library functions only.
Windows doesn't share the static (code) segment - if you run 10 instances of
Word, the bulk of the code will be loaded into memory 10 times. Only a
fraction of the code, e.g. library functions, has been moved to DLLs and
may be shared.
The main problem with DLL support is that the OS keeps track of DLLs by name
only. There is no adequate signature system to keep track of different DLL
versions. In other words, Windows cannot see the difference between one
WHATSIT.DLL and another DLL with the same name, although they may contain
entirely different code. Once a DLL in the Windows directory has been
overwritten by another one, there's no way back. Also, the order in which
applications are started (and DLLs are loaded) determines which DLL will
become active, and how the system will eventually crash.
So there is no distinction between different versions of the same DLL, or
between DLLs that come with Windows and those that come with application
software. An application may put its own DLLs in the same directory as the
Windows DLLs during installation, and may overwrite DLLs by the same name
if they exist.
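A minimal sketch (Python on Windows, using the hypothetical WHATSIT.DLL from
above) shows just how little information Windows has to go on when it loads a
DLL: the file name is all there is, so whichever file by that name turns up
first on the search path wins, regardless of version or origin.

    # DLLs are resolved purely by name; there is no version or signature check.
    import ctypes

    try:
        lib = ctypes.WinDLL("WHATSIT.DLL")   # loads whichever WHATSIT.DLL is found first
        print("Loaded:", lib._name)          # the handle says nothing about which copy we got
    except OSError as err:
        print("Load failed:", err)           # WHATSIT.DLL is hypothetical, so expect this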
What it boils down to is that the application may add portions of itself to
the operating system. (This is one of the reasons why Windows needs to be
rebooted after an application has been installed or changed.) That means that
the installation procedure introduces third-party code (read: uncertified
code) to the operating system and to other applications that load the
affected DLLs. Furthermore, because there is no real distinction
between system level code and user level code, the code in DLLs that has been
provided by application programmers or the user may run at system level, i.e.
unprotected. This corrupts the integrity of the operating system and other
applications. A rather effective demonstration was provided by Bill Gates
himself who, during a Comdex presentation of the Windows 98 USB Plug-and-Play
features, connected a scanner to a PC and caused it to crash into a Blue
Screen. "Moving right along," said Gates, "I guess this is why
we're not shipping it yet." Nice try, Mr. Gates, but of course the
release versions of Windows '98 and ME were just as unstable, and in Windows
2000 and XP new problems have been introduced. These versions of Windows use a
firmware revision number to recognize devices, so an update of a peripheral's
firmware may cause that device to be 'lost' to PnP.
Another, less harmful but mostly annoying, side-effect of code confusion is
that different language versions of software may get mixed up. A foreign
language version of an application may add to or partially overwrite Windows'
list of dialog messages. This may cause a dialog window to prompt "Are
you sure?" in English, followed by two buttons marked "Da" and
"Nyet".
Peripheral drivers also use a rather weak signature system and suffer from
problems similar to those of DLLs, albeit to a lesser degree. For example, it's quite
possible to replace a printer driver with a similar driver from another
language version of Windows and mess up the page format as a result. Printer
drivers from different language versions of Windows sometimes contain entirely
different code that generates different printer output, but Windows is unaware
of the difference. This problem has been addressed somewhat with the release
of Windows 2000, but it's still far from perfect.
Designing an OS to deliberately mix up OS and application code fits
Microsoft's strategy of product bundling and integration. (See below.) But the
results are obvious: each file operation that involves executable code puts
the entire OS and its applications at risk, and application errors often mean
OS errors (and crashes) as well. This leads to ridiculous 'issues' such as
Outlook Express being able to crash Windows NT 4.0 if NT is a 'high encryption'
version with the locale set to 'France'; replying to a message may crash the
entire system, a problem which has been traced to one of the DLLs that comes
with Outlook. (Are you still with me?)
In a well-designed and robustly coded OS something like this could never
happen. The first design criterion for any OS is that the system, the
applications, and (in a multi-user environment) the users all be separated
and protected from each other. Not only does no version of Windows do that by
default, it actively prevents you from setting things up that way. The DLL
fiasco is just the tip of the problem. You can't maintain or adequately
restore OS integrity, you can't maintain version control, and you can't
prevent applications and users from interfering with each other and the
system, either by accident or on purpose.
Then there's Windows' lack of an adequate repair or maintenance mode. If
anything goes wrong and a minor error or corruption occurs in one of
the (literally) thousands of files that make up Windows, often the only real
solution is a large-scale restore operation or even a reinstallation of
the OS. Yes, you heard me. If your OS suddenly stops working properly and the
files which you need to restore are unknown or being locked by Windows, the
standard approach to the problem (as recommended by Microsoft) is to do a
complete reinstallation. There's no such thing as single user mode or
maintenance mode to do repairs, nor is there a good way to find out which file
has been corrupted in the first place, let alone to repair the damage. (The
so-called 'safe mode' merely swaps configurations and does not offer
sufficient control for serious system repairs.)
Windows has often been criticized for the many problems that occur while
installing it on a random PC, which may be an A-brand or clone system in
any possible configuration. This criticism is not entirely justified; after
all it's not practically feasible to foresee all the possible hardware
configurations that may occur at the user end. But that's not the point. The
point is that these problems are often impossible to fix, because most of the
Windows operating system is beyond the users' or administrators' control. This
is of course less true for Windows 9x/ME. Because these are essentially DOS
products, you can reboot the system using DOS and do manual repairs to a
certain degree. With Windows NT this is completely impossible.
Windows 2000 and XP come with an external repair console utility on the CD,
that allows you to access the file system of a damaged Windows installation.
But that's about it.
This leads to ridiculous situations. For example, I tried to install a bugfix
from Microsoft for Windows NT, which included a DLL that replaced a previous
version in the system directory. However, since Windows keeps this DLL open
and locked, this cannot be done while the operating system is running.
The README that came with the patch blithely suggested that I reboot my
computer using DOS, and manually replace the DLL from the DOS prompt. Apart
from the fact that NTFS system partitions cannot be accessed from DOS,
which I will ignore for the moment, this inability to allow for its own
maintenance is a good demonstration of Windows' immaturity.
The inability to make repairs has been addressed, to a certain degree, in
Windows XP. XP comes with a 'System Restore' feature that tracks changes to
the OS, so that administrators may 'roll back' the system to a previous state
before the problem occurred. Also, the 'System File Check' feature attempts to
make sure that some 1000 system files are the ones that were originally
installed. If a "major" system file is replaced by another version
(for example if a Windows XP DLL file is overwritten by a Windows '95 DLL with
the same name) the original version will be restored. (Of course this also
prevents you from removing things like Outlook Express, Winfile.exe or
Progman.exe, since the specification of what is an important system file is
rather sloppy.)
These workarounds are automated processes, and they are largely beyond the
user's control. Disabling System File Check requires major surgery, and neither
feature can be stopped from undoing modifications that it thinks are
incorrect but that may be intentional. There's still no maintenance mode. Even
so this may be an improvement over the previous situation: a certain amount of
recovery is now possible. On the other hand, this illustrates Microsoft's
kludgey approach to a very serious problem: instead of implementing changes in
the architecture to prevent OS corruption, they perpetuate the basic design
flaw and try to deal with the damage after the fact. They don't fix the hole
in your roof, they sell you a bucket to put under it instead.
The inefficiency of Microsoft's program code is of course disastrous to
Windows' performance figures. (Where did you think the name 'Windoze' comes
from?) Today you need at least a 600 MHz Pentium III to do the same kind of
typical office work under Windows XP that you used to do on a 12 MHz 286
under DOS... in about the same amount of time. Productivity has only been
fractionally increased at a hugely inflated Total Cost of Ownership. Can you
say "Return On Investment"?
Try this for a laugh: start the performance monitor in NT 4.0 SP3 and have
it keep track of the CPU load. Now open the Control Panel. Don't do anything,
just open it and leave it there... and enjoy your constant CPU load of 100%
on your production server or work station. (To be honest, this doesn't always
happen. This 'issue' of CPU cycles being used up somewhere is caused by a
combination of factors that MS has never been able to explain. Other
applications have been known to trigger this bug as well.) There are also
many other ways to waste CPU cycles and other resources on a Windows system.
For example, check your system's performance before and after you install
Internet Explorer 5.x or later. You don't have to run it, you don't have to
make it your default browser... just install it. And watch your performance
plummet.
Only 32 kilobytes of RAM in the Apollo capsules' computers was enough to put
men on the moon. The Voyager deep space probes have on-board computers based on
a 4-bit CPU. An 80C85 CPU with 176 kilobytes of PROM and 576 kilobytes of RAM
controlled the Sojourner rover that drove across the surface of Mars and sent
us a wealth of scientific data and high-resolution images in full-color stereo.
Today I have a 233MHz Pentium II with 64 Megabytes of RAM and 15 Gigabytes of
disk space, but if I try to type a letter to my grandmother using Windows and
Office XP, the job will take me forever because my computer is
underpowered!!
Server-based or network-based computing is no solution either, mainly because
Windows doesn't have any real code sharing capability. If you were to shift
the workload of ten 450MHz/512MB workstations to an application server (using
Windows Terminal Server, Citrix Server or another ASP-like solution) you
would need a theoretical 4.5GHz CPU speed and 5GB of RAM at the server end
to maintain the same performance, not counting the inevitable overhead
which could easily run up to 10 or 20 percent.
And then there's the incredible amount of inefficient, or even completely
unnecessary code in the Windows file set. Take the 3D Pinball game in Windows
2000 Professional and XP Professional, for example. This game (you'll find it
under \Program Files\Windows NT\Pinball) is installed with Windows
and takes up a few megabytes of disk space. But most users will never know
that it's sitting there, wasting storage and doing nothing productive at all.
It doesn't appear in the program menu or control panel, and no shortcuts point
to it. The user isn't asked any questions about it during installation. In
fact its only conceivable purpose would be to illustrate Microsoft's
definition of 'professional'. No wonder Windows has needed more and more
resources over the years. A few megabytes doesn't seem much, perhaps, but
that's only because we've become used to the enormous footprints of Windows
and Windows applications. Besides, if Microsoft installs an entire pinball
game that most users neither need nor want, they obviously don't care about
conserving resources (which are paid for by the user community). What does
that tell you about the resource-efficiency of the rest of their code? Let me
give you a hint: results published in PC Magazine in April 2002 show that the
latest Samba software surpasses the performance of Windows 2000 by about
100 percent under benchmark tests. In terms of scalability, the results show
that Unix and Samba can handle four times as many client systems as Windows
2000 before performance begins to drop off.
Paradoxically, though, the fact that Microsoft products need humongous piles
of hardware in order to perform decently has contributed to their commercial
success. Many integrators and resellers push Microsoft software because it
enables them to prescribe the latest and heaviest hardware platforms in the
market. Unix and Netware can deliver the same or better performance on much
less. Windows 2000 and XP however need bigger and faster systems, and are
often incompatible with older hardware and firmware versions (especially the
BIOS). This, and the fact that hardware manufacturers discontinue support for
older hardware and BIOSes, forces the user to purchase expensive hardware with
no significant increase in return on investment. This boosts hardware sales,
at the expense of the "dear, valued customer". Resellers make more
money when they push Microsoft products. It's as simple as that.
Apart from the above (and other) major flaws there's also a staggering
number of minor flaws. In fact there are so many minor flaws that their sheer
number can be classified as a major flaw. In short, the general quality of
Microsoft's entire set of program code is sub-standard. Unchecked buffers,
unverified I/O operations, race conditions, incorrectly implemented protocols,
failures to deallocate resources, failures to check environmental parameters,
et cetera ad nauseam... You name it, it's in there. Microsoft products contain
some extremely sloppy code and bad coding practices that would give an
undergraduate some well-deserved bad marks. As a result of their lack of
quality control, Microsoft products and especially Windows are riddled with
literally thousands and thousands of bugs and glitches. Even most of the
error messages are incorrect!
Some of these blunders can be classified as clumsy design rather than as mere
sloppiness. A good example is Windows' long filename support. In an attempt to
allow for long filenames, Microsoft deliberately broke the FAT file system.
They stored the extension information in deliberately cross-linked directory
entries, which is probably one of their dirtiest kludges ever. And if that
wasn't enough, they made it legal for filenames to contain whitespace. Because
this is incompatible with Windows' own command line parsing (it still assumes
the old FAT notation) another kludge was needed, and whitespace had to be
enclosed in quotation marks. This confuses (and breaks) many programs,
including many of Microsoft's own that come with Windows.
Another good example is Windows' apparent case-sensitivity. Windows
seems to make a distinction between upper and lower case when handling
filenames, but the underlying software layers are in fact case-insensitive. So
Windows only changes the case of the files and directories as they are
presented to the user. The names of the actual files and directories may be
stored in uppercase, lowercase or mixed case, while they are still presented
as capitalized lower case files. Of course this discrepancy causes no
problems in a Windows-only environment. Since the underlying code is
essentially case-insensitive, case is not critical to Windows' operation.
However as soon as you want to incorporate Unix-based services (e.g. a
Unix-based webserver instead of IIS) you discover that Windows has messed
up the case of filenames and directories.
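You can see the discrepancy with a trivial sketch like the one below: under
Windows' case-insensitive filesystem layers both spellings find the file, on a
case-sensitive Unix filesystem only the exact spelling does, and that is
exactly where files migrated from Windows with mangled case start to hurt.

    # Case sensitivity check: run this on Windows and on a Unix system and
    # compare the output. The file names are arbitrary examples.
    import os

    os.makedirs("WebRoot", exist_ok=True)
    open(os.path.join("WebRoot", "Index.HTML"), "w").close()

    for candidate in ("WebRoot/Index.HTML", "webroot/index.html"):
        # Windows: both print True. Unix: only the exact spelling prints True.
        print(candidate, "->", os.path.exists(candidate))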
But most of Windows' errors and glitches are just the result of sloppy work.
Of course there is no such thing as bug-free software, but the amount of bugs
found in Windows is, to put it mildly, disproportionate. For example, Service
Pack 4 for Windows NT 4.0 attempted to fix some 1200 bugs (yes, one thousand
two hundred). But there had already been three previous service packs at the
time! Microsoft shamelessly admitted this, and even boasted about having
"improved" NT on 1200 points. Then they had to release several
subsequent service packs in the months that followed, to fix remaining
issues and of course the additional problems that had been introduced by the
service packs themselves.
An internal memo among Microsoft developers mentioned 63,000 (yes: sixty-three
thousand) known defects in the initial Windows 2000 release. Keith
White, Windows Marketing Director, did not deny the existence of the document,
but claimed that the statements therein were made in order to "motivate
the Windows development team". He went on to state that "Windows
2000 is the most reliable Windows so far." Yes, that's what he said.
A product with 63,000 known defects (mind you, that's only the known
defects) and he admits it's the best they can do. Ye gods.
All these blunders have of course their effects on Windows' reliability
and availability. Depending on application and system load, most Windows
systems tend to need frequent rebooting, either to fix problems or on a
regular basis to prevent performance degradation as a result of Windows'
shaky resource management.
On the desktop this is bad enough, but the same flaws exist in the Windows
server products. Servers are much more likely to be used for mission-critical
applications than workstations are, so Windows' limited availability and its
impact on business become a major issue. The uptimes of typical Windows-based
servers in serious applications (i.e. more than just file and print services
for a few workstations) tend to be limited to a few weeks at most. One or
two server crashes (read: interruptions of business and loss of data) every
few months are not uncommon. As a server OS, Windows clearly lacks
reliability.
Windows server products aren't even really server OSes. Their architecture is
no different from the workstation versions. The server and workstation
kernels in NT are identical, and changing two registry keys is enough to
convince a workstation that it's a server. Networking capabilities are still
largely based on the peer-to-peer method that was part of Windows for
Workgroups 3.11 and that Microsoft copied, after it had been successfully
pioneered by Apple and others in the mid-eighties. Of course some code in the
server products has been extended or optimized for performance, and
domain-based authentication has been added, but that doesn't make it a true
server platform. Neither does the fact that NT Server costs almost three times
as much as NT Workstation. In fact we're talking about little more than Windows
for Workgroups on steroids.
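The registry value usually cited in this context is ProductType under
ProductOptions. Reading it is trivial (a minimal Python sketch, Windows only;
take the meaning of the possible values, 'WinNT' for a workstation and
'ServerNT' or 'LanmanNT' for the server variants, as the commonly documented
convention rather than gospel):

    # Read the value that tells an NT-family Windows whether it is a
    # workstation or a server (Windows only).
    import winreg

    key_path = r"SYSTEM\CurrentControlSet\Control\ProductOptions"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        product_type, _ = winreg.QueryValueEx(key, "ProductType")
        print("This installation thinks it is a:", product_type)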
In November 1999, Sm@rt Reseller's Steven J. Vaughan-Nichols ran a test
to compare the stability of Windows NT Server (presumably running Microsoft
Internet Information Server) with that of the Open Source Linux operating
system (running Samba and Apache). He wrote:
Conventional wisdom says Linux is incredibly stable. Always skeptical, we decided to put that claim to the test over a 10-month period. In our test, we ran Caldera Systems OpenLinux, Red Hat Linux, and Windows NT Server 4.0 with Service Pack 3 on duplicate 100MHz Pentium systems with 64MB of memory. Ever since we first booted up our test systems in January, network requests have been sent to each server in parallel for standard Internet, file and print services. The results were quite revealing. Our NT server crashed an average of once every six weeks. Each failure took roughly 30 minutes to fix. That's not so bad, until you consider that neither Linux server ever went down.
Interesting: a crash that takes 30 minutes to fix means that something
critical has been damaged and needs to be repaired or restored. At least it
takes more than just a reboot. This happens once every six weeks on a
server, and that's considered "not so bad"... Think about
it. Also note that most other Unix flavors such as Solaris, BSD or AIX are
just as reliable as Linux.
But the gap between Windows and real uptime figures is even wider than
Vaughan-Nichols describes above. Compare that half hour of downtime every six
weeks to Netware's record, in the following article from Techweb on 9 April
2001:
Server 54, Where Are You?
The University of North Carolina has finally found a network server that, although missing for four years, hasn't missed a packet in all that time. Try as they might, university administrators couldn't find the server. Working with Novell Inc. (stock: NOVL), IT workers tracked it down by meticulously following cable until they literally ran into a wall. The server had been mistakenly sealed behind drywall by maintenance workers.
Although there is some doubt as to the actual truth of this story, it's a
known fact that Netware servers are capable of years of uninterrupted
service. I recently brought down a Netware server at our head office. This is
a Netware 5.0 server that also runs software to act as the corporate
SMTP/POP3 server, fax server and virus protection for the network, next
to regular file and print services for the whole company. It had been
up and running without a single glitch for more than a year, and the only
reason we shut it down was because it had to be physically moved to
another building. Had the move not been necessary, it could have run on
for years and years - after all there's no reason why its performance
should be affected, as long as nobody pulls the plug or rashly loads
untested software. The uptimes of our Linux and Solaris servers (which act
as mission-critical web servers, database servers and mail servers, and also
run basic file and print services) are measured in months as well. Uptimes
in excess of a year are not uncommon for Netware and Unix platforms, and
uptimes of more than two years are not unheard of either. Most OS updates
short of a kernel replacement do not require a Unix server to reboot, as
opposed to Windows that needs a complete server reboot whenever a DLL in
some subsystem is replaced. But see for yourself: check the
Netcraft Uptime statistics and
compare the uptimes of Windows servers to those of Unix servers. The figures
speak for themselves.
Microsoft promises 99.999% availability with Windows 2000. That's a little
over 5 minutes of downtime per year. Frankly I can't believe this is a
realistic target for Windows. Microsoft products have never even approached
such uptime figures. Even though most of the increased availability of
Windows 2000 must be provided through third-party clustering and redundancy
solutions (something that the glossy ads neglect to mention) it's highly
unlikely that less than five minutes of downtime per year for the entire
Windows cluster is practically feasible.
Perhaps even more serious is the fact that, short of clustering, there is no
adequate solution for the many software glitches that jeopardize the
availability of a typical Windows server. A typical NT or 2000 server can
spontaneously develop numerous transient problems. These may vary from network
processes that seem healthy but ignore all network requests, to runaway
server applications that lock up the entire operating system. Usually the only
solution in these cases is to power cycle and restart the system. I remember
having to do that three times a week on a production server. Believe me, it's
no fun. Perhaps it's understandable that some network administrators feel
that the best way to accelerate a Windows system is at 9.81 meters per second
squared.
Does this make Windows an entirely unusable product that cannot run in a
stable configuration anywhere? No, fortunately not. There are situations
where Windows systems (both workstations and servers) may run for long
periods without crashing. A vendor-installed version of Windows NT or 2000 on
an HCL-compliant, A-brand system, with all the required service packs and
only certified drivers, should give you relatively few problems (provided
that you don't use it for much more than basic file and print services, of
course). The rule of thumb here is to use only hardware that is on Microsoft's
Hardware Compatibility List (HCL), to use only vendor-supplied, certified
drivers and other software, and to use third-party clustering solutions for
applications where availability is a critical issue.
To be honest, Windows 2000 is somewhat better (or rather less bad) than NT4
was. It's less prone to spontaneous crashes and some of NT's most awkward
design blunders have been fixed. (For example, the user home directories are
no longer located under the WINNT directory.) On most systems (especially on
notebook computers) it is considerably less shaky. Which goes to show that a
few relatively minor improvements go a long way, I suppose. But still, given
the general quality of Microsoft products and Windows in particular, there
are absolutely no guarantees. And of course Microsoft introduced a whole new
set of glitches and bugs in Windows XP, which largely undid many of the
improvements in Windows 2000, so that Windows XP is less stable than Windows
2000. But that's innovation for you, I suppose.
The most frightening aspect about all this is that Microsoft doesn't seem
to realize how serious these problems are. Or rather, they probably realize it
but they don't seem to care as long as sales hold up. Their documents on
serious problems (which are always called 'Issues' in Microsoft-speak) are
very clear on that point. Take the 'TCP/IP Denial Of Service Issue' for
example: a serious problem that was discovered a few years ago. It caused NT
servers to stop responding to network service requests, thus rendering
mission-critical services unavailable. (This should not be confused with
deliberate Denial Of Service attacks to which most operating systems are
vulnerable; this was a Windows issue only.) At the time there was no real
solution for this problem. Microsoft's only response at the time was to state
that "This issue does not compromise sensitive data in any way. It
merely forces a server to become unavailable for a short time, which is easily
remedied by rebooting the server." (NT administrators had to wait
for the next service pack that was released several months later before this
problem was addressed.)
And Microsoft thinks that this stuff can compete with Unix and threaten the
mainframe market for mission-critical applications?
Uh-huh.
I don't think so.
In September 2001 Hewlett-Packard clustered 225 PCs running the Open Source
Linux operating system. The resulting system (called I-cluster) benchmarked
itself right into the global top-500 of supercomputers, using nothing but
unmodified, out-of-the-box hardware. (A significant number of entries in that
top-500, by the way, runs Linux, and more and more Unix clusters are being
used for supercomputing applications.) Microsoft, with a product line that is
descended solely from single-user desktop systems, can't even dream of such
scalability - not now, not ever. Nevertheless Microsoft claimed
on a partner website with Unisys
that Windows will outperform Unix, because Unisys' server with Windows 2000
Datacenter could be scaled up to 32 CPUs. This performance claim is of course
a blatant lie: the opposite is true and they know it. Still Microsoft would
have us believe that the performance, reliability and scalability of the
entire Windows product line is on par with that of Unix, and that clustered
Windows servers are a viable replacement option for mainframes and Unix
midrange systems. I'm not kidding, that's what they say. If you're at all
familiar with the scalability of Unix midrange servers and the requirements of
the applications that mainframes are being used for, you will realize how
ludicrous this is. And if you're not, consider this: when Microsoft acquired
the successful Hotmail free Email service, the system had roughly 10 million
users, and the core systems that powered Hotmail all ran Unix. Today the
number of Hotmail users exceeds 50 million, but in spite of Microsoft's
claims about the power of Windows servers and their previous intentions to
replace Hotmail's core systems with NT servers, Hotmail's core systems still
run Unix. This is discussed thoroughly in a leaked
internal paper by Microsoft's Windows 2000 Server
Product Group member David Brooks. He mentions the proverbial stability of the
Unix kernel and the Apache web server, the system's transparency and
combination of power and simplicity. Windows on the other hand is considered
to be needlessly GUI-biased (Brooks writes: "Windows 2000 server products
continue to be designed with the desktop in mind") and also complex,
obscure, needlessly resource-hungry (Brooks: "It's true that Windows
requires a more powerful computer than Linux or FreeBSD") and involving
"reboot as an expectation" (sic).
Of course Hotmail is not the only example of Microsoft refusing to eat their
own dog food. The "We have the way out" anti-Unix website that
Microsoft (along with partner Unisys) put up in the first months of 2002,
initially ran Unix and Apache (and was ported to IIS on Windows 2000 only
after the entire ICT community had had a good laugh). And Microsoft's own
accounting division used IBM's AS/400 midrange platform for critical
applications such as the payroll system, until the late nineties. Food for
thought.
It should also be mentioned that Microsoft doesn't know the first thing
about networking. A Windows system in a TCP/IP environment still uses a
NetBIOS name. Microsoft networking is built around NetBEUI, which is an
extended version of NetBIOS. This is a true Stone Age protocol which is
totally unroutable. It uses lots of broadcasts, and on a network segment
with Windows PCs broadcasts indeed make up a significant portion of the
network traffic, even for point-to-point connections (e.g. between a Microsoft
mailbox and a client PC). If it weren't for the fact that it is possible to
encapsulate NetBIOS/NetBEUI traffic in a TCP/IP envelope, connecting Windows
to the real world would be totally impossible. (Note that Microsoft calls the
IP encapsulation of NetBEUI packets 'native IP'. Go figure.)
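You can observe the legacy layer directly: NetBIOS session traffic
encapsulated in TCP uses port 139, so simply probing that port on a Windows
machine (the host name below is a placeholder) shows the Stone Age protocol
sitting right underneath the file and print services.

    # Probe the NetBIOS session service, i.e. NetBIOS encapsulated in TCP/IP.
    import socket

    host = "windows-box.example.local"   # hypothetical Windows machine on the LAN
    try:
        with socket.create_connection((host, 139), timeout=5):
            print("NetBIOS session service (TCP port 139) is listening on", host)
    except OSError as err:
        print("No NetBIOS-over-TCP/IP session service reachable:", err)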
In any case Windows PCs tend to generate an inordinate amount of garbage on
the network. I remember reading a document from Microsoft that stated that a
typical PC network consists of ten or at most twenty peer-to-peer workstations
on a single cable segment, all running Microsoft operating systems. And that
explains it, I suppose. If you want anything more than that, on your own
head be it.
By the way: don't take my word for it -- try it for yourself. Take a good, fast FTP
server (i.e. one that runs on Unix). Upload and download a few large files
(say, 50MB) from and to a Windows NT or 2000 workstation. (I used a 233MHz
Pentium-II.) You will probably see a throughput in the order of 1 Mbit/s for
uploads and 2 to 3 Mbit/s for downloads. Then boot Linux on the same
workstation and repeat. You will now see your throughput limited only
by the bandwidth of your network connection or by the capacity of your FTP
server, whichever comes first. On 10 Mbit/s Ethernet, 5 Mbit/s upload
and download throughput are the least you may expect. To further test
this, you can repeat it with a really slow client (e.g. a 60 or 75MHz Pentium)
running Linux. The throughput limit will still be network-bound and not
system-bound. (Note: this is not limited to FTP but also affects other network
protocols. It's a performance problem related to the code in Windows' IP stack
and other parts of the architecture involved with data throughput.)
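If you want to put a number on it, a timing sketch along the following lines
will do; the host, credentials and test file are hypothetical placeholders,
and the figure it prints is the effective application-level throughput of the
client you run it on.

    # Time an FTP download and report the effective throughput in Mbit/s.
    import time
    from ftplib import FTP

    HOST, USER, PASSWORD = "ftp.example.com", "anonymous", "guest@example.com"
    REMOTE_FILE = "testfile-50MB.bin"          # hypothetical ~50 MB test file

    chunks = []
    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        start = time.time()
        ftp.retrbinary("RETR " + REMOTE_FILE, lambda data: chunks.append(len(data)))
        elapsed = time.time() - start

    total_bytes = sum(chunks)
    print("%.1f Mbit/s over %.1f s" % (total_bytes * 8 / elapsed / 1e6, elapsed))

Run it once from a Windows workstation and once from a Linux boot on the same
hardware, and draw your own conclusions.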
New Windows versions bring no relief here. Any network engineer who uses PPPoE
(Point-to-Point Protocol over Ethernet) with ADSL will tell you that the MTU
(a setting that limits packet size) should be set to 1492 or less. In XP it's
set by default to 1500, which may lead to problems with the routers of many
DSL ISPs. Microsoft is aware of the problem, but XP nevertheless persists in
setting up PPPoE with an MTU of 1500. There is a registry hack for PPPoE
users, but there is no patch, and XP has no GUI-based option which enables the
user to change the MTU conveniently. (This problem is rather typical for XP.
Previous versions of Windows needed third party code to support such features.
This was inconvenient, but at least there was a chance to obtain code written
by developers who knew what they were doing. In Windows XP Microsoft insisted
on doing the job themselves, and botched it.)
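For what it's worth, the registry work-around boils down to setting an
explicit per-interface MTU by hand, since XP offers no dialog for it. A
minimal Python sketch (Windows only) is shown below; the MTU value under
Tcpip\Parameters\Interfaces is the documented place for a per-interface MTU,
the interface GUID is a placeholder you would have to look up on your own
system, and you fiddle with it entirely at your own risk.

    # Set a PPPoE-safe MTU of 1492 on one network interface via the registry
    # (Windows only; needs administrative rights, typically a reboot to apply).
    import winreg

    IFACE_GUID = "{00000000-0000-0000-0000-000000000000}"   # placeholder GUID
    key_path = (r"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\Interfaces"
                + "\\" + IFACE_GUID)

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "MTU", 0, winreg.REG_DWORD, 1492)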
On top of all this, readers of this paper report that according to John Dvorak
in PC Magazine, the Windows kernel maxes out at 483 Mbps. He remarks that as
many businesses are upgrading to 1 Gigabit Ethernet, Windows (including XP)
just can't keep up.
Now go read what Microsoft writes about Windows 2000 and XP being the
ideal platform for Internet applications...
The sloppy nature of Windows' networking support code and protocol stacks
also makes the system more vulnerable to Denial of Service attacks. A DoS
attack is nothing but a form of computer vandalism, with the intention to
crash a system or otherwise render it unavailable. In a typical DoS attack
a deliberately malformed network packet is sent to the target system, where
it triggers a known flaw in the operating system to disrupt it. In the case
of Windows, though, there are more ways to bring down a system. For example,
the kernel routines in Windows 2000 and XP that process incoming IPsec (UDP
port 500) packets are written so badly that sending a stream of regular IPsec
packets to the server will cause it to bog down in a CPU overload. And of
course Windows' IPsec filters cannot block a 500/udp packet stream.
Another way to render a system unavailable is a Distributed Denial of Service
attack. A DDoS attack involves many networked systems that send network
traffic to a single target system or network segment, which is then swamped
with traffic and becomes unreachable. There's very little that can be done
against DDoS attacks, and all platforms are equally vulnerable.
With all these DoS and DDoS vulnerabilities, it's a worrying development that
Windows 2000 and XP provide new platforms to generate such attacks. The only
real 'improvement' in Windows 2000's and XP's IP stacks is that for no good
reason whatsoever, Microsoft has extended the control that an application has
over the IP stack. This does not improve Windows' sub-standard networking
performance, but it gives applications the option to build custom IP packets
to generate incredibly malicious Internet traffic. This includes spoofed
source IP addresses and SYN-flooding full scale Denial of Service (DoS)
attacks.
So far we have concentrated on Windows. Most of the problems with Microsoft
products originate here, since Windows is by far the most complex Microsoft
product line, and there are more interactions between Windows and other
products than anywhere else. But unfortunately most server and desktop
applications are cut from the same cloth as Windows is. The general quality
of their code and design is not much better.
The additional server products generally run on a Windows server. This means
that all the disadvantages of an insecure, unstable platform also apply to
the server products that run on those servers. For example, Microsoft SQL
Server is a product that has relatively few problems. It's basically a
straightforward implementation of a general SQL server. It's not very
remarkable or innovative perhaps, but it's not a bad product as far as it
goes, certainly not by Microsoft standards. But a database service can never
be more stable than the platform it's running on. (This goes of course for any
software product, not just for an SQL database server.) So all vulnerabilities
that apply to the Windows server directly apply to the database service as
well.
Other additional server products come with their own additional problems,
though. Microsoft's webserver product, Internet Information Server (IIS),
is designed not just to serve up web pages written in the standard HTML
language, but also to provide additional authentication and links to content
databases, to add server and client side scripting to web pages, to generate
Dynamic HTML and Active Server Pages, et cetera. And it does all these
things, and more, but often not very well. IIS is outperformed by all other
major webserver products (especially Apache). IIS' authentication is far from
robust (the lack of security in MS products is discussed below) and the
integration of an IIS webserver with a content database server is far from
seamless. Dynamic HTML, ASP and scripting require the webserver to execute
code at the server end, and there Microsoft's bad process management comes
into play: server load is often excessive. Running code at the server end
in response to web requests creates a lot of security issues as well, and
on top of all that the web pages that are generated do not conform to the
global HTML standards; they are rendered correctly only in Microsoft's own
web browser products.
Microsoft's mail server product, Exchange, has a few sharp edges as well.
To lose a few days' worth of corporate E-mail in an Exchange crash is not
uncommon. Exchange is designed to integrate primarily with other Microsoft
products (especially the Outlook E-mail client) and it doesn't take the
Internet's global RFC standards too seriously. This limits compatibility
and may cause all kinds of problems. Outlook Express also has a strange
way of talking IMAP to the Exchange server. It makes a bazillion IMAP
connections; each connection logs in, performs one task, sets the connection
to IDLE-- and then drops the connection. Since OE does not always close the
mailbox properly before dropping the connection, the mailbox and Outlook do
not necessarily sync up. This means that you may delete messages in OE that
magically return to life in a few minutes because those deletions did not get
disseminated to the mailbox before the connection terminated.
Just like other Microsoft applications, the additional server products are
tightly integrated into Windows during installation. They replace DLLs
belonging to the operating system, and they run services at the system level.
This does not improve the stability of the system as a whole to begin with,
and of course most of the code in the additional server products is of the
same doubtful quality as the code that makes up Windows. The reliability and
availability of any service can never be better than the server OS it runs
on. However most of Microsoft's additional server products add their own
complexity, bugs and glitches to the system, which only makes it worse.
The resulting uptime and reliability figures are rather predictable. The
inefficiency that exists in Windows is also present in the additional server
products, so as a rule of thumb each additional service needs its own server
platform. In other words: if you need a file and print server, a web server
and a mail server, you need three separate systems whereas Unix or Netware
could probably do the job on only one system.
Microsoft desktop applications (like Word and Excel) are largely more of
the same. They're in the terminal stages of feature bloat: they're full of
gadgets that don't really add anything useful to the product, but that ruin
productivity because of their complexity, and that introduce more errors,
increase resource demands, and require more code which in turn leads to
increased risks. After years of patching and adding, the code base for these
products has become very messy indeed. Security, if any, has been added as an
afterthought here, too. For example, a password-protected Word document is not
encrypted in any way. Inserting a 'protected' document into another
non-protected document (e.g. an empty new document) is enough to get around
the 'protection'.
Animated paper clips don't really make Word a better word processor. We'd be
better off with other things, such as a consistent behavior of the auto-format
features, the ability to view markup codes, or a more complete spell checking
algorithm and dictionary. But in spite of all the "new" versions of
Office and persistent feature requests from their users, Microsoft still
hasn't gotten around to that. Instead we have multi-language support that
tends to 'forget' its settings occasionally, and an 'auto-correct' feature
that's limited to the point of being more annoying than useful. Word documents
have become excessively large and unwieldy, and occasionally they are
corrupted while being saved to disk so that Word will crash while trying to
read them at a later time.
We can be brief about Excel: it has similar problems, and calculation errors
in formula-based spreadsheets on top of that. Excel is full of frills and
spiffy graphics and animations, but essentially it's still a spreadsheet that
cannot count and that requires that many formulas and macros be rewritten for
each new version of Excel.
Menu interfaces in all Microsoft applications, even in those that are bundled
in MS Office, are inconsistent and non-intuitive. For example, the menu
option to set application preferences, which may be titled 'Preferences' in
some products but 'Options' in others, may be found under the 'File' menu,
under the 'Edit' menu, under 'View' or somewhere else, depending on what
product you're currently using. To create even more confusion, the same options
in different applications do not work identically. For example the 'Unsorted
list' button (to create a bullet list) handles indentation correctly in Word
but ignores it completely in PowerPoint (PowerPoint adds bullets but messes
up the left outline of the text). And the 'F3' key activates the 'search
again' function for string searches in practically all Microsoft products,
except in Internet Explorer where it brings up a 'Find: all files'
window for no apparent reason.
Microsoft's most important application outside MS Office is without doubt
Internet Explorer. In its first incarnation IE was a very unremarkable web
browser; e.g. version 2.0 as it was shipped with Windows NT 4 was so backward
that it didn't even have frame capability. This soon changed as Microsoft
began integrating the web browser with Windows as a part of their integration
and bundling strategies (which are discussed in detail below).
In all honesty it must be said that recent versions of IE (starting with
version 6) aren't bad web browsers. They do what they're supposed to do, and
they do it pretty well. At least they display standards-compliant HTML as
correctly rendered web pages at a speed that is by all means acceptable.
Previous versions weren't so good and even contained deliberate deviations
from the global HTML standards that were intended to discourage the use of
standardized HTML in favor of Microsoft's own proprietary and restrictive
ideas.
The main drawbacks of Internet Explorer lie in the fact that it tries to be
more than just a web browser. It adds scripting support (with the ability to
run Visual Basic or Jscripts that are embedded in web pages) and it hooks
directly into the Windows kernel. I've seen web pages that would shut down the
Windows system as soon as the page was viewed with Internet Explorer. Microsoft
doesn't seem to have bothered very much with basic security considerations,
to put it mildly. And of course the installation of a new version of Internet
Explorer replaces (overwrites) significant portions of the Windows operating
system, with all the drawbacks discussed above.
Similar problems are found in Outlook, Microsoft's E-mail client. Outlook is
in fact a separate application, but it isn't shipped separately. There are two
versions: one is bundled with Internet Explorer (this version is called Outlook
Express) and the other is part of MS-Office (this version is named 'Outlook'
and comes with groupware and scheduler capabilities). In itself Outlook is an
acceptable, if unremarkable, E-mail client; it allows the user to read and
write E-mail. It comes with a few nasty default settings, but at least these
can be changed, although the average novice user of course never does that.
(For example, messages are by default sent as HTML file attachments instead
of as readable text, when a user replies to an E-mail the quoting feature
sometimes uses weird formatting that won't go away without dynamite, and
there's often a lot of junk that accompanies an outgoing E-mail message.)
More serious is the fact that both Outlook and its server-end companion
Exchange tend to strip fields from E-mail headers, a practice that is
largely frowned upon. This also makes both network administration and
troubleshooting more difficult.
Outlook comes with a lot of hooks into Internet Explorer. IE code is being
used to render HTML file attachments, including scripts that may be embedded
into an HTML-formatted E-mail message. Again Microsoft seems to have been
completely unaware of the need for any security here; code embedded in inbound
E-mail is by default executed without any further checking or intervention
from the user.
And that brings us to another major weakness of all Microsoft products:
security, or rather the lack thereof. The notorious insecurity of Microsoft
software is a problem in itself.
It all begins with Windows' rather weak security. The number of reports on
security holes has become downright embarrassing, and it just keeps growing.
On the other hand, Windows security holes have become so common
that they hardly attract attention anymore. Microsoft usually downplays the
latest security issues and releases another patch... after the fact.
If Microsoft really wanted to resolve these software problems, they would
take greater care to ensure that such problems were fixed before their
products went on sale-- and thus reverse the way they traditionally conduct
business. Doing so would mean fewer resources wasted by their customers each
year on patching and re-patching their systems in an attempt to clean up after
Microsoft's mistakes, but it would also decrease the customers' dependency on
what Microsoft calls 'software maintenance'.
In the meantime, hackers are having a ball with Microsoft's shaky security
models and even weaker password encryption (such as simple XOR bitflip
operations, the kind of 'encryption' that just about every student reinvents
in school). Hackers, script kiddies and other wannabees get to take their pick
from the wide choice of elementary security weaknesses to exploit. Some recent
and highly virulent worms, for example, spread so rapidly because they could
crack remote share passwords in about twenty seconds. (This did not stop
Microsoft from running an advertising campaign in spring 2003 that centered
on hackers becoming extinct along with the dodo and the dinosaur, all because
of Microsoft's oh so secure software. Unsurprisingly, this violated a few
regulations on truth in advertising, and the campaign had to be
withdrawn.)
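To illustrate just how weak the XOR-style 'encryption' mentioned above is,
here is a minimal C sketch of a single-byte XOR scheme (an illustrative toy,
not Microsoft's actual password-handling code): applying the same operation
twice restores the plaintext, and with only 256 possible key values an
attacker can simply try them all.

    #include <stdio.h>
    #include <string.h>

    /* Toy single-byte XOR "cipher": encrypting and decrypting are the
       same operation, and only 256 keys exist, so brute force is instant. */
    static void xor_buffer(unsigned char *buf, size_t len, unsigned char key)
    {
        for (size_t i = 0; i < len; i++)
            buf[i] ^= key;
    }

    int main(void)
    {
        unsigned char secret[] = "hunter2";        /* a stand-in "password"  */
        size_t len = strlen((char *)secret);

        xor_buffer(secret, len, 0x5A);             /* "encrypt"              */
        xor_buffer(secret, len, 0x5A);             /* the same call decrypts */
        printf("recovered: %s\n", (char *)secret); /* prints: hunter2        */
        return 0;
    }

Anything along these lines offers obfuscation at best; it bears no resemblance
to real cryptography.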
Another large part of the problem is Windows' lack of adequate separation
between code running at various system and user levels. Windows essentially
assumes that all code runs with the highest privilege level, so it can do
almost anything -- including malicious damage. This makes it impossible to
keep malicious code from invading the system. Users may (inadvertently or
deliberately) download and run code from the Internet, and there is no way to
protect system-level resources from damage by that user-level code.
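As a minimal sketch of the missing principle (the file name is hypothetical;
the point is the permission check, not the path), the following C fragment
simply tries to create a file inside the Windows system directory. Under a
proper privilege model an ordinary user's code gets a 'permission denied'
error; where user-level and system-level code are not separated, the call
just succeeds and the system directory is anyone's to modify.

    #include <stdio.h>
    #include <string.h>
    #include <errno.h>

    int main(void)
    {
        /* Hypothetical demo path: try to write into a system location. */
        const char *path = "C:\\WINNT\\system32\\user_code_was_here.tmp";
        FILE *f = fopen(path, "w");

        if (f == NULL) {
            /* What a secure OS should do for unprivileged code. */
            printf("blocked: %s\n", strerror(errno));
        } else {
            /* What happens without real privilege separation. */
            printf("user-level code just wrote into the system directory\n");
            fclose(f);
            remove(path);   /* clean up after the demonstration */
        }
        return 0;
    }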
The tight integration between the various Microsoft products does little to
improve overall security. All software components are loaded with features,
and all components can use each other's functions. Unfortunately this means
that all security weaknesses are shared as well.
For example, the Outlook E-mail client uses portions of Internet Explorer to
render HTML that is embedded in E-mail messages, including script code. And
of course IE and Outlook hook into the Windows kernel with enough privileges
to run arbitrary malicious code that happens to be embedded in a received
E-mail message or a viewed web page. Since Outlook uses portions of
IE's code, it's vulnerable to IE's bugs as well. So a scripting
vulnerability that exists in Outlook also opens up IE and vice versa, and if
IE has a hook into certain Windows kernel functions, those functions can also
be exploited through a weakness in Outlook. In other words, a minor security
leak in one of the components immediately puts the entire system at
risk. Read: a vulnerability in Internet Explorer means a vulnerability in
Windows Server 2003! A simple Visual Basic script in an E-mail message has
sufficient access rights to overwrite half the planet, as has been proven by
E-mail virus outbreaks (e.g. Melissa, ILOVEYOU and similar worms) that have
caused billions of dollars' worth of damage.
A good example is Word macro viruses: essentially Visual Basic routines
embedded in Word documents as macros. Creating even a relatively simple macro
requires more programming skill than the average office employee can be
expected to have, yet the total lack of even basic security features leaves
all Word users exposed to malicious code in the documents they receive.
Because of the integrated nature of the software components, a Word macro is
able to read Outlook's E-mail address book and then propagate itself through
the system's E-mail and/or networking components. If Windows' security
settings prevent this, the virus can easily circumvent the protective measure
by the simple expedient of changing the security settings. How's that for
security?
Similarly, VBS scripts embedded in web pages or E-mail messages may exploit
weaknesses in IE or Outlook, so that viewing an infected web page or receiving
an infected E-mail is enough to corrupt the system without any further action
from the user -- not even downloading a file or opening an attachment.
Through those weaknesses the malicious code may access data elsewhere on the
system, modify the system's configuration or even start processes. In March
2000, a hacker wrote (of course anonymously) on ICQ:
21/03/2k: Found the 1st Weakness: In Windows 2000 [...] there is a Telnet daemon service, which is not started by default. It can be remotely started by embedding a COM object into HTML code that can be posted on a web page, or sent to an Outlook client. Following script will start the Telnet service:
<SCRIPT LANGUAGE=VBScript> CreateObject("TlntSvr.EnumTelnetClientsSvr")</SCRIPT>
We've tried it and it really works. Only a Problem... we've put it into a html page. When opening the page... our best friend "IE5" shows an alert msg saying that "you're going to run some commands that can be dangerous to your PC...Continue?" We must fix it! No problem using Outlook... [sic]
It's interesting to note that after patching no fewer than seven different
security holes in the Windows 2000 telnet code (yes, that's seven security
leaks in telnet alone!) Microsoft released yet another patch in February 2002,
to fix security issue number eight: another buffer overflow vulnerability.
Somehow I don't think this patch will be the last. If at first you don't
succeed, try seven more times -- then try, try (and try some more) again.
It's not surprising that J.S. Wurzler Underwriting Managers, one of the
first companies to offer hacker insurance, has begun charging clients 5
to 15 percent more if they use Microsoft's Windows NT software in their
Internet operations.
Microsoft knows exactly how bad their own product security is. Nevertheless
they wax lyrical about new features rather than accept their responsibility
for their actions. To quote Tom Lehrer:
"The rockets go up,
who cares where they come down?
That's not my department,
says Wernher von Braun."
Microsoft can't be unaware of the risks and damages they cause. They have
relied on a third-party product to secure their own mail servers for many
years. They always claimed to use Exchange for their own corporate E-mail,
but neglected to mention they were using Interscan Viruswall NT (by Trend
Micro, Inc.) to provide the essential security that Microsoft's own products
lacked. Another interesting detail is that the Microsoft servers were still
running Viruswall version 3.24 in the summer of 2001. This is a version for
NT that is not Windows 2000 'ready'. In other words, Microsoft had prudently
decided not to run Windows and Exchange 2000 on their own mail servers yet.
Only by the end of 2001 had Microsoft migrated to Windows and Exchange 2000
themselves.
This is not the only example of Microsoft's lack of trust in their own
products. Microsoft's SQL Labs, the part of the company that works on
Microsoft's SQL Server, is using NetScreen's 500-series security appliance to
defend its network against Code Red, Nimda and other worm attacks. Apparently,
the labs' choice was made despite the fact that Microsoft already sells its
own security product touted as a defense against worms. The Microsoft ISA
[Internet Security and Acceleration] Server was introduced in early 2001 and
was hailed by Microsoft as their first product aimed entirely at the security
market. In fact, the most important reason businesses ought to switch to ISA
Server, according to Microsoft, was that "ISA Server is an [...]
enterprise firewall and secure application gateway designed to protect the
enterprise network from hacker intrusion and malicious worms". Still,
Microsoft's SQL Labs prudently decided to rely on products other than their
own to provide basic security.
And it gets even better, thanks to the sloppy code found in many Microsoft
products: the many buffer overrun vulnerabilities can be combined with
scripting weaknesses. You don't need to open E-mail attachments or even read
an incoming E-mail message to risk the introduction of malicious code on your
system. Just receiving the code (e.g. downloading E-mail from a POP3 server or
viewing a web page) is enough. Yes, I know, stories like this have long been
urban legend, but Outlook has made it reality. Microsoft explains: "The
vulnerability results because a component used by both Outlook and Outlook
Express contains an unchecked buffer in the module that interprets E-mail
header fields when certain E-mail protocols are used to download mail from
the mail server. This could allow a malicious user to send an E-mail that,
when retrieved from the server using an affected product, could cause code
of his choice to run on the recipient's computer." This vulnerability
has been successfully exploited by Nimda and other malicious worm programs.
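For readers unfamiliar with the jargon, an 'unchecked buffer' boils down to
something like the following generic C sketch (illustrative only, not the
actual Outlook code): a fixed-size buffer is filled from attacker-supplied
input without a length check, so an overlong E-mail header overwrites
adjacent memory -- in the classic case including the function's return
address, which is exactly how 'code of his choice' ends up running.

    #include <stdio.h>
    #include <string.h>

    /* Generic illustration of an unchecked buffer; the header text is
       assumed to come straight from the network, i.e. from the attacker. */
    static void parse_header(const char *header_from_network)
    {
        char field[64];

        /* No length check: anything longer than 63 characters overruns
           'field' and tramples the stack, including the saved return
           address. A bounded copy (strncpy/snprintf) would prevent this. */
        strcpy(field, header_from_network);

        printf("parsed: %s\n", field);
    }

    int main(void)
    {
        parse_header("Subject: hello");   /* a well-formed header is harmless */

        char evil[512];                   /* an overlong, crafted header...   */
        memset(evil, 'A', sizeof(evil) - 1);
        evil[sizeof(evil) - 1] = '\0';
        parse_header(evil);               /* ...corrupts memory; don't expect
                                             graceful behaviour here          */
        return 0;
    }

Replace the run of 'A's with carefully chosen bytes and the overwritten return
address can be made to jump into attacker-supplied code -- which is all that
'code of his choice' really means.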
Other worm programs (e.g. Code Red) combine vulnerabilities like this with
creatively constructed URLs that trigger buffer overruns in IIS. Even without
the Frontpage extensions installed it is relatively easy to obtain unencrypted
administration passwords and non-public files and documents from an IIS
webserver. Furthermore, this "E-commerce solution of the future"
contains a prank (a hardcoded passphrase deriding Netscape developers as
"weenies") in the code section concerned with the access
verification mechanism for the whole system. And there are many more
weaknesses like this. The list goes on and on and on.
IIS is supposed to power multi-million dollar E-commerce sites, and it has
many back-end features to support such applications. But each and every time
we hear about a large online mailorder or E-commerce website that has spilled
confidential user data (including credit card numbers) it turns out that that
website runs IIS on Windows NT or 2000. (And that goes for adult mailorder
houses too. I'm not quite sure what kind of toy a Tarzan II MultiSpeed
Deluxe is, but I can probably tell you who bought one, and to which address
it was shipped. Many E-commerce websites promise you security and discretion,
but if they run IIS they can only promise you good intentions and nothing
more. Think before you order...)
The Code Red and Nimda worms provided a nice and instructive demonstration
of how easy it is to infect servers running IIS and other Microsoft products,
and use them for malicious purposes (i.e. the spreading of malicious code and
DDoS attacks on a global scale). Anyone who bothers to exploit one of the many
documented vulnerabilities can do this. Some of the vulnerabilities exploited
by Code Red and Nimda were months old, but many administrators just can't keep
up with the ridiculous amount of patches required by IIS. Nor is patching always
a solution: the patch that Microsoft released to counter Nimda contained bugs
that left mission-critical IIS production servers non-operational.
On 20 June 2001, Gartner vice president and analyst John Pescatore wrote:
IIS security vulnerabilities are not even newsworthy anymore as they are discovered almost weekly. This latest bug echoes the very first reported Windows 2000 security vulnerability in the Indexing Service, an add-on component in Windows NT Server incorporated into the code base of Windows 2000. As Gartner warned in 1999, pulling complex application software into operating system software represents a substantial security risk. More lines of code mean more complexity, which means more security bugs. Worse yet, it often means that fixing one security bug will cause one or more new security bugs.
The fact that the beta version of Windows XP also contains this vulnerability raises serious concerns about whether XP will show any security improvement over Windows 2000.
On 19 September 2001, Pescatore continued:
Code Red also showed how easy it is to attack IIS Web servers [...] Thus, using Internet-exposed IIS Web servers securely has a high cost of ownership. Enterprises using Microsoft's IIS Web server software have to update every IIS server with every Microsoft security patch that comes out - almost weekly. However, Nimda (and to a lesser degree Code Blue) has again shown the high risk of using IIS and the effort involved in keeping up with Microsoft's frequent security patches.
Gartner recommends that enterprises hit by both Code Red and Nimda immediately investigate alternatives to IIS, including moving Web applications to Web server software from other vendors, such as iPlanet and Apache. Although these Web servers have required some security patches, they have much better security records than IIS and are not under active attack by the vast number of virus and worm writers. Gartner remains concerned that viruses and worms will continue to attack IIS until Microsoft has released a completely rewritten, thoroughly and publicly tested, new release of IIS. Sufficient operational testing should follow to ensure that the initial wave of security vulnerabilities every software product experiences has been uncovered and fixed. This move should include any Microsoft .NET Web services, which requires the use of IIS. Gartner believes that this rewriting will not occur before year-end 2002 (0.8 probability).
In all honesty it must be said that Microsoft has learned to react
generally well to newly discovered security holes. Although the severity of
many security problems is often downplayed and the underlying cause (flawed or
absent security models) is glossed over, information and patches are generally
released promptly and are available to the user community without cost. This is
commendable. But then the procedure has become routine for Microsoft, since
new leaks are discovered literally several times a week, and plugging leaks
has become part of Microsoft's core business. The flood of patches has become
so great that it's almost impossible to keep up with it. This is illustrated
by the fact that most of today's security breaches successfully exploit leaks
for which patches have already been released. In fact the sheer volume of
patchwork eventually became sufficient to justify the automated distribution
of patches. For recent versions of Windows there is an automatic service to
notify the user of required "critical updates" (read: security
patches) which may then be downloaded with a single mouseclick. This service
(which does work fairly well) has become very popular. And for good
reason: in the year 2000 alone MS released about 100 (yes, one hundred)
security bulletins - that's an average of one newly discovered security-related
issue every three to four days! The number of holes in Microsoft
products would put a Swiss cheese to shame. And the pace has increased rather
than slowed down. For example, once you install a "recommended
update" (such as Media Player 7.1) through the Windows Update service,
you discover immediately afterward that you must repeat the whole exercise
in order to install a "critical update" to patch the new security
leaks that were introduced with the first download! It's hardly reasonable to
expect users to keep up with such a rat race, and not surprising that most
users can't. As a result, many E-mail viruses and worms exploit security
holes that are months or years old. The MSBlaster worm that spread in the
summer of 2003 managed to infect Windows Server 2003 using a vulnerability
that was already present in NT4! In an age where smokers sue the tobacco
industry for millions of dollars, all Microsoft products had better come with a
warning on the package, stating that "This insecure product will cause
expensive damage to your ICT infrastructure unless you update frequently and
allocate time every day to find, download and install the
patch-du-jour". Unfortunately they don't, and Windows-based macro and
script viruses emerged at a rate of 200 to 300 a month in 2002.
An interesting side effect of the ridiculous rate at which patches need to
be released is that some users now get the impression that Microsoft takes
software maintenance very seriously and that they are constantly working to
improve their products. This is of course rather naive. If they'd bought a
car that needed serious maintenance or repairs every two weeks or so, they
probably wouldn't feel that way about their car manufacturer.
And of course no amount of patching can possibly remedy the structural design
flaws in (or absence of) Microsoft products' security. A patch is like a
band-aid: it will help to heal a simple cut or abrasion, but for a broken leg
or a genetic deficiency it's useless, even if you apply a few thousand of
them. The obvious weak point in the system is of course integration. Microsoft
likes to call Windows "feature-rich" but when they have to release
an advisory on a serious vulnerability involving MIDI files it becomes obvious
that the set of features integrated in Windows has long since passed the
point of maximum usefulness.
Lately, Microsoft lobbyists have been trying to promote the idea that free
communication about newly discovered security leaks is not in the interest
of the user community, since public knowledge of the many weaknesses in their
products would enable and even encourage malicious hackers to exploit those
leaks. Microsoft's Security Response Center spokesman Scott Culp blamed
security experts for the outbreak of worms like Code Red and Nimda, and in an
article on Microsoft's website in October 2001 he proposed to restrict
circulation of security-related information to "select circles".
And it's all for our own good, of course. After all, censorship is such a
nasty word.
In August 2002, during a court hearing discussing a settlement between
Microsoft and the DoJ, Windows OS chief Jim Allchin testified how cover-ups
are Microsoft's preferred (and recommended) course of action:
"There is a protocol dealing with software functionality in Windows called message queueing, and there is a mistake in that protocol. And that mistake, if we disclosed it, would in my opinion compromise a company who is using that particular protocol."
In the meantime things are only getting worse with the lack of security
in Microsoft products. The latest incarnation of Office (Office XP) provides
full VBA support for Outlook, while CryptoAPI provides encryption for
messages and documents, including VBS attachments and macros. In other
words, anti-virus software will no longer be able to detect and intercept
viruses that come with E-mail and Word documents, rendering companies
completely defenseless against virus attacks.
Clearly this is a recipe for disaster. It's like a car manufacturer flooding
the market with cars that have no brakes.
Another worrying development is that leaky code from products such as IIS is
often installed along with other software (and even with Windows XP) without
system administrators being aware of it. For example: SQL
Server 2000 introduced 'super sockets' support for data access via the
Dnetlib DLL. It provides multi-protocol connectivity, encryption, and
authentication; in other words a roll-up of the different implementations of
these technologies in past versions of the product. A system would only have
this DLL if SQL Server 2000, the client administration tools, MSDE, or a
vendor-specific solution was installed on the box. However, with XP this DLL
is part of the default installation-- even on the home edition. One has to
wonder how a component goes from "installed only in specialized machines
on a particular platform" to "installed by default on all flavors
of the OS." What other components and vulnerabilities are now
automatically installed that we don't know about?
And the Windows fileset is getting extremely cluttered as it is. Looking
through the WINNT directory on a Windows 2000 or XP system, you'll find lots
of legacy executables that are obsolete and never used: Media Player 5,
16-bit legacy code from previous Windows versions as far back as version 3.10
(in fact the bulk of the original Windows 3.10 executable code is there),
files that belong to features that are never used in most cases (e.g. RAS
support) or accessibility options that most users fortunately don't need
(such as the Narrator and Onscreen Keyboard features). Dormant code means
possible dormant security issues. The needless installation of such a roundup
reflects the laziness of Microsoft's developers: if you just install
everything but the kitchen sink, you can just assume it's there at a later
time and not bother with the proper verifications. Of course this practice
doesn't improve quality control at all; it merely adds to the bloat that
has plagued Windows from day one.
The future promises only more of the same. Since Microsoft is already
working on the next versions of Windows, maybe to be released in 2003, it
seems a safe assumption that we're stuck with the current flawed Windows
architecture and that no structural improvements are to be expected. So far
Microsoft has never seemed interested in cleaning up their architectures.
Instead they concentrate on finding workarounds.
A good example is Microsoft's recommendation that PCs "designed for
Windows XP" no longer accept expansion cards but only work with USB
peripherals. This clearly indicates that XP still suffers from the
architecture-related driver problems that have caused so many Windows crashes
in the past. In an attempt to get rid of the problem, Microsoft tried to
persuade PC manufacturers to abandon the PCI expansion bus. The fact that this
recommendation was immediately scrapped by the hardware industry is
irrelevant; the point is that Microsoft tried to get rid of expansion bus
support rather than improve XP's architecture to make it robust.
This doesn't bode well for the future.
A good look at Windows and Windows applications shows that it's virtually
impossible for Microsoft to remedy the basic problems with their software.
They would have to redevelop their entire product range from scratch. Since
robust design and Windows compatibility are mutually exclusive for the
foreseeable future, this would mean giving up the limitations that bind the
end user to the Microsoft platform. This is commercially unacceptable for
Microsoft. In order to maintain revenues, they must hang on to their current
software architecture, flaws and all. Lately Microsoft has been making a lot
of noise about how they're now going to take security very seriously, but the
bad overall quality of their product code makes it impossible to live up to
that promise. Their Baseline Security Analyzer (which they released as part of
their attempts to improve their image) is a good indication: it doesn't scan
for vulnerabilities but merely for missing patches, and even at that it does
a sloppy job, producing a lot of false positives.
Another largely empty promise is the so-called ".Net Framework".
This is the development environment for the upcoming new .Net product lines.
Its most touted 'innovation' is "Zero Impact Install" which proposes
to do away with the tight integration between application and operating system.
Instead of the current mess of DLLs being inserted into the OS and settings
spread throughout an insecure registry database, applications will live in
their own subdirectories and be self-contained. Code will be delivered in a
cross-platform format and be JIT-compiled (Just In Time) for the platform it
runs on. While these things would mean a dramatic improvement over the current
situation, their innovation factor is of course close to zero: the need for
an adequate separation between OS and application code makes sophisticated
ICT professionals long for Unix, mainframe environments or even DOS, and
JIT-compilation is nothing new either (it wasn't even a new idea when Sun
Microsystems proposed Java in the mid-90's). But more importantly: it
remains entirely unclear how Microsoft will realize all this. In order to
come up with a really good and robust platform, they'd have to replace
Windows (and therefore their whole product range) with something that has an
essentially different architecture. But .Net is not going to be a ground-up
rewrite of Microsoft's OS product line. Such a rewrite would detach Microsoft
from their installed base, so compatibility with (and therefore perpetuation
of the shortcomings of) existing Windows versions seems unavoidable.
Let's face it: Microsoft's promises about dramatic quality improvement are
unrealistic at best, not to say misleading. They're impossible to fulfill in
the foreseeable future, and everyone at Microsoft knows it. To illustrate,
in January 2002 Bill Gates wrote in his "Trustworthy computing"
memo to all Microsoft employees:
"Today, in the developed world, we do not worry about electricity and water services being available. With telephony, we rely both on its availability and its security for conducting highly confidential business transactions without worrying that information about who we call or what we say will be compromised. Computing falls well short of this, ranging from the individual user who isn't willing to add a new application because it might destabilize their system, to a corporation that moves slowly to embrace e-business because today's platforms don't make the grade."
Now, for "today's platforms" read "a decade of Windows, in
most cases" and keep in mind that Microsoft won't use their own security
products but relies on third party products instead. Add to that the presence
of spyware features in Windows Media Player, the fact that Windows XP Home
Edition connects to a server called wustat.windows.com, hooks for the Alexa
spyware in IE's 'Tools/Show Related Links' feature, and the fact that XP's
Search Assistant calls sa.windows.com as soon as you search for information...
and the picture is complete. Maybe Big Brother isn't watching you and nothing
is being done with the knowledge about what you search for and which websites
you visit... but don't count on it. And for those who still don't get it: in
November 2002 Microsoft made customer details, along with numerous confidential
internal documents, freely available from a very insecure FTP server. This
FTP server sported many known vulnerabilities, which made gaining access
a trivial exercise. Clearly, Microsoft's recent privacy-concerned and
quality-concerned noises sound rather hollow at best. They don't even have
any respect for their customers' privacy and security themselves.
Nor is this surprising. Stuart Okin, MS Security Officer for the UK, described
security as "a recent issue". During an interview at Microsoft's
Tech Ed event in 2002, Okin explained that recent press coverage on viruses
and related issues had put security high on Microsoft's agenda. Read: it was
never much of an issue, but now it's time to pay lip service to security
concerns in order to save public relations. And indeed Microsoft's only real
'improvement' so far has been an advertising campaign that touts Windows XP
as the secure platform that protects the corporate user from virus
attacks. No, really - that's what they say. They also make a lot of noise
about having received "Government-based security certification". In
fact this means that Windows 2000 SP3 has met the CCITSE Common Criteria, so
that it can be part of government systems without buyers having to get special
waivers from the National Security Agency or perform additional testing every
time. CC-compliance does not mean the software is now secure; it merely means
that testing has confirmed the code works as specified. That's all -- the
discovery of new security holes at least once a week has nothing to do with
it. But even so, Windows 2000 SP3 was the
first Microsoft product ever that worked well enough to be CC-certified. Go
figure.
As if to make a point, a few weeks after Gates' memo on Trustworthy Computing,
Microsoft managed to send the Nimda worm to their own Korean developers, along
with the Korean language version of Visual Studio .Net, thus spreading an
infection that had originated with the third-party Korean translators. How
'trustworthy' can we expect a company to be, if they aren't even capable of
security basics such as adequate virus protection?
And of course since Gates wrote the above memo nothing has changed. Security
holes and vulnerabilities in all MS products, many of which allow blackhat
hackers to execute arbitrary code on any PC connected to the Internet, continue
to be discovered and exploited with a depressing regularity. Microsoft claims
to have put 11,000 engineers through security training to solve the problem,
but all users of Microsoft products continue to be plagued by security flaws.
It's obvious that real improvement won't come around anytime soon. Windows
Server 2003 is marketed as "secure by design" but apart from
a couple of improved default settings and the Software Restriction Policies
not much has changed. Shortly after Windows Server 2003 was released, the
first security patch (to plug a vulnerability exposed through Internet
Explorer 6) had to be applied, to nobody's surprise.
Since Gates' initial launch of the Trustworthy Computing idea, Microsoft's
plans on security have been long on rhetoric and short on action. Nothing
really happened for about 18 months, and then Ballmer made the stunning
announcement that the big security initiative would consist of... a lot of
minor product fixes (yes, again), training users, and rolling up
several minor patches into bigger ones. Microsoft's press release actually
used the words, quote, "improving the patch experience", unquote.
So far this "improvement" has mainly consisted of monthly patch
packages, which had to be re-released and re-installed several times a month
in a 'revised' monthly version. Right...
Another sad aspect of Microsoft's actual stance on security is neatly summed up by internet.com editor Rebecca Lieb, who investigated Microsoft's commitment to fighting the epidemic flood of spam. She concludes:
"[Microsoft] executives are certainly committed to saying they are [committed to helping end the spam epidemic]. These days, Bill Gates is front and center: testifying before the Senate; penning a Wall Street Journal editorial; putting millions up in bounty for spammer arrests; building a Web page for consumers; and forming an Anti-Spam Technology & Strategy Group, "fighting spam from all angles-- technology, enforcement, education, legislation and industry self-regulation."
When I meet members of that group, I always ask the same question. Every version of the Windows OS that shipped prior to XP's release last year is configured --by default-- as an open relay. Millions have been upgraded to broadband. Ergo, most PCs on planet Earth emit a siren call to spammers: "Use me! Abuse me!" Why won't Microsoft tell its millions of registered customers how to close the open relay?"
Meanwhile Microsoft is branching into new markets and so far runs true to form. They have already shipped their first cellphone products. Orange, the first cellnet operator to run Microsoft Smartphone software on its SPV phones, has already had to investigate several security leaks. On top of that the phones are reported to crash and require three successive power-ups to get going again, to call random numbers from the address book, and to have flaky power management. I shudder to think what will happen when Microsoft's plans for the automotive software market begin to materialize.
In spite of what Microsoft's sales droids would have us believe, the facts
speak for themselves: developments at Microsoft are solely driven by business
targets and not by quality targets. As long as they manage to keep up their
$30 billion plus yearly turnover, nobody's rear is on the line no matter how
bad their software is. Microsoft products are immature and of inferior
quality. New versions of these products provide no structural remedy, but
are in fact point releases with bugfixes, minor updates and little else but
cosmetic improvement. If it weren't for additional security products such as
hardware-based or Unix-based filters and firewalls, it would be impossible to
run an even remotely secure environment with Windows.
MS products are bloated with an almost baroque excess of features, but that's
not the point. The point is that they are to be considered harmful, lacking
robustness and security as a direct result of basic design flaws that are in
many cases over a decade old. They promise to do a lot, but in practice they
don't do any of it very well. If you need something robust, designed for
mission-critical applications, you might want to look elsewhere. Microsoft's
need for compatibility with previous mistakes makes structural improvements
impossible. The day Microsoft makes something that doesn't suck, they'll be
making vacuum-cleaners.
Besides, 63,000 known defects in Windows should be enough for anyone.
Comments? E-mail me!