19. ETRE


Many people in the computer business have had a difficult time grappling with Be, Incorporated, for the simple reason that nothing about it seems to make any sense whatsoever. It was launched in late 1990, which makes it roughly contemporary with Linux. From the beginning it has been devoted to creating a new operating system that is, by design, incompatible with all the others (though, as we shall see, it is compatible with Unix in some very important ways). If a definition of "celebrity" is someone who is famous for being famous, then Be is an anti-celebrity. It is famous for not being famous; it is famous for being doomed. But it has been doomed for an awfully long time.

Be's mission might make more sense to hackers than to other people. In order to explain why, I need to explain the concept of cruft, which, to people who write code, is nearly as abhorrent as unnecessary repetition.

If you've been to San Francisco you may have seen older buildings that have undergone "seismic upgrades," which frequently means that grotesque superstructures of modern steelwork are erected around buildings made in, say, a Classical style. When new threats arrive--if we have an Ice Age, for example--additional layers of even more high-tech stuff may be constructed, in turn, around these, until the original building is like a holy relic in a cathedral--a shard of yellowed bone enshrined in half a ton of fancy protective junk.

Analogous measures can be taken to keep creaky old operating systems working. It happens all the time. Ditching a worn-out old OS ought to be simplified by the fact that, unlike old buildings, OSes have no aesthetic or cultural merit that makes them intrinsically worth saving. But it doesn't work that way in practice. If you work with a computer, you have probably customized your "desktop," the environment in which you sit down to work every day, and spent a lot of money on software that works in that environment, and devoted much time to familiarizing yourself with how it all works. This takes a lot of time, and time is money. As already mentioned, the desire to have one's interactions with complex technologies simplified through the interface, and to surround yourself with virtual tchotchkes and lawn ornaments, is natural and pervasive--presumably a reaction against the complexity and formidable abstraction of the computer world. Computers give us more choices than we really want. We prefer to make those choices once, or accept the defaults handed to us by software companies, and let sleeping dogs lie. But when an OS gets changed, all the dogs jump up and start barking.

The average computer user is a technological antiquarian who doesn't really like things to change. He or she is like an urban professional who has just bought a charming fixer-upper and is now moving the furniture and knickknacks around, and reorganizing the kitchen cupboards, so that everything's just right. If it is necessary for a bunch of engineers to scurry around in the basement shoring up the foundation so that it can support the new cast-iron claw-foot bathtub, and snaking new wires and pipes through the walls to supply modern appliances, why, so be it--engineers are cheap, at least when millions of OS users split the cost of their services.

Likewise, computer users want to have the latest Pentium in their machines, and to be able to surf the web, without messing up all the stuff that makes them feel as if they know what the hell is going on. Sometimes this is actually possible. Adding more RAM to your system is a good example of an upgrade that is not likely to screw anything up.

Alas, very few upgrades are this clean and simple. Lawrence Lessig, the whilom Special Master in the Justice Department's antitrust suit against Microsoft, complained that he had installed Internet Explorer on his computer, and in so doing, lost all of his bookmarks--his personal list of signposts that he used to navigate through the maze of the Internet. It was as if he'd bought a new set of tires for his car, and then, when pulling away from the garage, discovered that, owing to some inscrutable side-effect, every signpost and road map in the world had been destroyed. If he's like most of us, he had put a lot of work into compiling that list of bookmarks. This is only a small taste of the sort of trouble that upgrades can cause. Crappy old OSes have value in the basically negative sense that changing to new ones makes us wish we'd never been born.

All of the fixing and patching that engineers must do in order to give us the benefits of new technology without forcing us to think about it, or to change our ways, produces a lot of code that, over time, turns into a giant clot of bubble gum, spackle, baling wire and duct tape surrounding every operating system. In the jargon of hackers, it is called "cruft." An operating system that has many, many layers of it is described as "crufty." Hackers hate to do things twice, but when they see something crufty, their first impulse is to rip it out, throw it away, and start anew.

If Mark Twain were brought back to San Francisco today and dropped into one of these old seismically upgraded buildings, it would look just the same to him, with all the doors and windows in the same places--but if he stepped outside, he wouldn't recognize it. And--if he'd been brought back with his wits intact--he might question whether the building had been worth going to so much trouble to save. At some point, one must ask the question: is this really worth it, or should we maybe just tear it down and put up a good one? Should we throw another human wave of structural engineers at stabilizing the Leaning Tower of Pisa, or should we just let the damn thing fall over and build a tower that doesn't suck?

Like an upgrade to an old building, cruft always seems like a good idea when the first layers of it go on--just routine maintenance, sound, prudent management. This is especially true if (as it were) you never look into the cellar, or behind the drywall. But if you are a hacker who spends all his time looking into the cellar and behind the drywall, cruft is fundamentally disgusting, and you can't avoid wanting to go after it with a crowbar. Or, better yet, simply walk out of the building--let the Leaning Tower of Pisa fall over--and go make a new one THAT DOESN'T LEAN.

For a long time it was obvious to Apple, Microsoft, and their customers that the first generation of GUI operating systems was doomed, and that they would eventually need to be ditched and replaced with completely fresh ones. During the late Eighties and early Nineties, Apple launched a few abortive efforts to make fundamentally new post-Mac OSes such as Pink and Taligent. When those efforts failed they launched a new project called Copland which also failed. In 1997 they flirted with the idea of acquiring Be, but instead they acquired Next, which has an OS called NextStep that is, in effect, a variant of Unix. As these efforts went on, and on, and on, and failed and failed and failed, Apple's engineers, who were among the best in the business, kept layering on the cruft. They were gamely trying to turn the little toaster into a multi-tasking, Internet-savvy machine, and did an amazingly good job of it for a while--sort of like a movie hero running across a jungle river by hopping across crocodiles' backs. But in the real world you eventually run out of crocodiles, or step on a really smart one.

Speaking of which, Microsoft tackled the same problem in a considerably more orderly way by creating a new OS called Windows NT, which is explicitly intended to be a direct competitor of Unix. NT stands for "New Technology" which might be read as an explicit rejection of cruft. And indeed, NT is reputed to be a lot less crufty than what MacOS eventually turned into; at one point the documentation needed to write code on the Mac filled something like 24 binders. Windows 95 was, and Windows 98 is, crufty because they have to be backward-compatible with older Microsoft OSes. Linux deals with the cruft problem in the same way that Eskimos supposedly dealt with senior citizens: if you insist on using old versions of Linux software, you will sooner or later find yourself drifting through the Bering Straits on a dwindling ice floe. They can get away with this because most of the software is free, so it costs nothing to download up-to-date versions, and because most Linux users are Morlocks.

The great idea behind BeOS was to start from a clean sheet of paper and design an OS the right way. And that is exactly what they did. This was obviously a good idea from an aesthetic standpoint, but it does not, by itself, make for a sound business plan. Some people I know in the GNU/Linux world are annoyed with Be for going off on this quixotic adventure when their formidable skills could have been put to work helping to promulgate Linux.

Indeed, none of it makes sense until you remember that the founder of the company, Jean-Louis Gassee, is from France--a country that for many years maintained its own separate and independent version of the English monarchy at a court in Saint-Germain, complete with courtiers, coronation ceremonies, a state religion and a foreign policy. Now, the same annoying yet admirable stiff-neckedness that gave us the Jacobites, the force de frappe, Airbus, and ARRET signs in Quebec, has brought us a really cool operating system. I fart in your general direction, Anglo-Saxon pig-dogs!

To create an entirely new OS from scratch, just because none of the existing ones was exactly right, struck me as an act of such colossal nerve that I felt compelled to support it. I bought a BeBox as soon as I could. The BeBox was a dual-processor machine, powered by Motorola chips, made specifically to run the BeOS; it could not run any other operating system. That's why I bought it. I felt it was a way to burn my bridges. Its most distinctive feature is two columns of LEDs on the front panel that zip up and down like tachometers to convey a sense of how hard each processor is working. I thought it looked cool, and besides, I reckoned that when the company went out of business in a few months, my BeBox would be a valuable collector's item.

Now it is about two years later and I am typing this on my BeBox. The LEDs (Das Blinkenlights, as they are called in the Be community) flash merrily next to my right elbow as I hit the keys. Be, Inc. is still in business, though they stopped making BeBoxes almost immediately after I bought mine. They made the sad, but probably quite wise decision that hardware was a sucker's game, and ported the BeOS to Macintoshes and Mac clones. Since these used the same sort of Motorola chips that powered the BeBox, this wasn't especially hard.

Very soon afterwards, Apple strangled the Mac-clone makers and restored its hardware monopoly. So, for a while, the only new machines that could run BeOS were made by Apple.

By this point Be, like Spiderman with his Spider-sense, had developed a keen sense of when they were about to get crushed like a bug. Even if they hadn't, the notion of being dependent on Apple--so frail and yet so vicious--for their continued existence should have put a fright into anyone. Now engaged in their own crocodile-hopping adventure, they ported the BeOS to Intel chips--the same chips used in Windows machines. And not a moment too soon, for when Apple came out with its new top-of-the-line hardware, based on the Motorola G3 chip, they withheld the technical data that Be's engineers would need to make the BeOS run on those machines. This would have killed Be, just like a slug between the eyes, if they hadn't made the jump to Intel.

So now BeOS runs on an assortment of hardware that is almost incredibly motley: BeBoxes, aging Macs and Mac orphan-clones, and Intel machines that are intended to be used for Windows. Of course the latter type are ubiquitous and shockingly cheap nowadays, so it would appear that Be's hardware troubles are finally over. Some German hackers have even come up with a Das Blinkenlights replacement: it's a circuit board kit that you can plug into PC-compatible machines running BeOS. It gives you the zooming LED tachometers that were such a popular feature of the BeBox.

My BeBox is already showing its age, as all computers do after a couple of years, and sooner or later I'll probably have to replace it with an Intel machine. Even after that, though, I will still be able to use it. Because, inevitably, someone has now ported Linux to the BeBox.

At any rate, BeOS has an extremely well-thought-out GUI built on a technological framework that is solid. It is based from the ground up on modern object-oriented software principles. BeOS software consists of quasi-independent software entities called objects, which communicate by sending messages to each other. The OS itself is made up of such objects, and serves as a kind of post office or Internet that routes messages to and fro, from object to object. The OS is multi-threaded, which means that, like all other modern OSes, it can walk and chew gum at the same time; but it gives programmers a lot of power over spawning and terminating threads, or independent sub-processes. It is also a multi-processing OS, which means that it is inherently good at running on computers that have more than one CPU (Linux and Windows NT can also do this proficiently).
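
To give a flavor of what that message-passing style looks like in practice, here is a minimal sketch in C++, written against the Be API roughly as I remember it. The class names (BApplication, BLooper, BMessage) and methods are the real ones, but the message code, the "key" field, and the application signature are made up for the occasion, and details may vary between releases of the OS:

    // A minimal sketch of the BeOS style: an object with its own thread
    // and message queue, driven entirely by the messages posted to it.
    #include <Application.h>
    #include <Looper.h>
    #include <Message.h>
    #include <OS.h>
    #include <stdio.h>

    const uint32 KEY_PRESSED = 'keyp';   // hypothetical message code

    class Worker : public BLooper {
    public:
        virtual void MessageReceived(BMessage *msg) {
            switch (msg->what) {
            case KEY_PRESSED: {
                const char *key;
                if (msg->FindString("key", &key) == B_OK)
                    printf("worker got key: %s\n", key);
                break;
            }
            default:
                BLooper::MessageReceived(msg);
            }
        }
    };

    int main() {
        BApplication app("application/x-vnd.hypothetical-demo"); // made-up signature
        Worker *worker = new Worker();
        worker->Run();                  // the looper spawns its own thread

        BMessage msg(KEY_PRESSED);
        msg.AddString("key", "q");
        worker->PostMessage(&msg);      // asynchronous; the OS routes it to the Worker

        snooze(100000);                 // crude pause so the message gets handled;
                                        // real code would synchronize properly
        worker->Lock();
        worker->Quit();                 // a locked looper may be told to quit
        return 0;
    }

The specific calls matter less than the shape of the thing: windows, applications, and the OS's own services are all objects of this kind, and programming for BeOS consists largely of deciding what messages they send each other.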

For this user, a big selling point of BeOS is the built-in Terminal application, which enables you to open up windows that are equivalent to the xterm windows in Linux. In other words, the command line interface is available if you want it. And because BeOS hews to a certain standard called POSIX, it is capable of running most of the GNU software. That is to say that the vast array of command-line software developed by the GNU crowd will work in BeOS terminal windows without complaint. This includes the GNU development tools--the compiler and linker. And it includes all of the handy little utility programs. I'm writing this using a modern sort of user-friendly text editor called Pe, written by a Dutchman named Maarten Hekkelman, but when I want to find out how long it is, I jump to a terminal window and run "wc."
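
To make that compatibility concrete, here is the sort of tiny, completely ordinary program you can build with the GNU compiler in a Terminal window. It is a toy stand-in for "wc" that I am inventing purely for illustration, not anything that ships with BeOS, and it uses nothing beyond the standard C++ library:

    // toywc.cpp -- counts lines, words, and characters on standard input
    #include <cstdio>
    #include <cctype>

    int main() {
        long lines = 0, words = 0, chars = 0;
        int c, in_word = 0;
        while ((c = std::getchar()) != EOF) {
            chars++;
            if (c == '\n') lines++;
            if (std::isspace(c)) in_word = 0;
            else if (!in_word) { in_word = 1; words++; }
        }
        std::printf("%ld %ld %ld\n", lines, words, chars);
        return 0;
    }

Something like "g++ toywc.cpp -o toywc" followed by "./toywc < chapter.txt" behaves in a BeOS terminal window just as it would in an xterm under Linux (the exact compiler invocation depends on which release's tools you have installed).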

As is suggested by the sample bug report I quoted earlier, people who work for Be, and developers who write code for BeOS, seem to be enjoying themselves more than their counterparts in other OSes. They also seem to be a more diverse lot in general. A couple of years ago I went to an auditorium at a local university to see some representatives of Be put on a dog-and-pony show. I went because I assumed that the place would be empty and echoing, and I felt that they deserved an audience of at least one. In fact, I ended up standing in an aisle, for hundreds of students had packed the place. It was like a rock concert. One of the two Be engineers on the stage was a black man, which unfortunately is a very odd thing in the high-tech world. The other made a ringing denunciation of cruft, and extolled BeOS for its cruft-free qualities, and actually came out and said that in ten or fifteen years, when BeOS had become all crufty like MacOS and Windows 95, it would be time to simply throw it away and create a new OS from scratch. I doubt that this is an official Be, Inc. policy, but it sure made a big impression on everyone in the room! During the late Eighties, the MacOS was, for a time, the OS of cool people--artists and creative-minded hackers--and BeOS seems to have the potential to attract the same crowd now. Be mailing lists are crowded with hackers with names like Vladimir and Olaf and Pierre, sending flames to each other in fractured techno-English.

The only real question about BeOS is whether or not it is doomed.

Of late, Be has responded to the tiresome accusation that they are doomed with the assertion that BeOS is "a media operating system" made for media content creators, and hence is not really in competition with Windows at all. This is a little bit disingenuous. To go back to the car dealership analogy, it is like the Batmobile dealer claiming that he is not really in competition with the others because his car can go three times as fast as theirs and is also capable of flying.

Be has an office in Paris, and, as mentioned, the conversation on Be mailing lists has a strongly European flavor. At the same time they have made strenuous efforts to find a niche in Japan, and Hitachi has recently begun bundling BeOS with their PCs. So if I had to make a wild guess I'd say that they are playing Go while Microsoft is playing chess. They are staying clear, for now, of Microsoft's overwhelmingly strong position in North America. They are trying to get themselves established around the edges of the board, as it were, in Europe and Japan, where people may be more open to alternative OSes, or at least more hostile to Microsoft, than they are in the United States.

What holds Be back in this country is that the smart people are afraid to look like suckers. You run the risk of looking naive when you say "I've tried the BeOS and here's what I think of it." It seems much more sophisticated to say "Be's chances of carving out a new niche in the highly competitive OS market are close to nil."

It is, in techno-speak, a problem of mindshare. And in the OS business, mindshare is more than just a PR issue; it has direct effects on the technology itself. All of the peripheral gizmos that can be hung off of a personal computer--the printers, scanners, PalmPilot interfaces, and Lego Mindstorms--require pieces of software called drivers. Likewise, video cards and (to a lesser extent) monitors need drivers. Even the different types of motherboards on the market relate to the OS in different ways, and separate code is required for each one. All of this hardware-specific code must not only be written but also tested, debugged, upgraded, maintained, and supported. Because the hardware market has become so vast and complicated, what really determines an OS's fate is not how good the OS is technically, or how much it costs, but rather the availability of hardware-specific code. Linux hackers have to write that code themselves, and they have done an amazingly good job of keeping up to speed. Be, Inc. has to write all of its own drivers, though as BeOS has begun gathering momentum, third-party developers have begun to contribute drivers, which are available on Be's web site.

But Microsoft owns the high ground at the moment, because it doesn't have to write its own drivers. Any hardware maker bringing a new video card or peripheral device to market today knows that it will be unsalable unless it comes with the hardware-specific code that will make it work under Windows, and so each hardware maker has accepted the burden of creating and maintaining its own library of drivers.
