No, I don’t think I did anything productive that entire summer. Classes for my second year at the university wouldn’t start until fall. My computer was not quite up to snuff. So I sort of hung around in my ratty bathrobe or played with Randi or, occasionally, got together with friends so they could chuckle about my attempts at bowling or snooker. Okay, I did do a little fantasizing about my next computer.
I faced a geek’s dilemma. Like any good computer purist raised on a 68008 chip, I despised PCs. But when the 386 chip came out in 1986, PCs started to look, well, attractive. They were able to do everything the 68020 did, and by 1990, mass-market production and the introduction of inexpensive clones would make them a great deal cheaper. I was very money-conscious because I didn’t have any. So it was, like, this is the machine I want to get. And because PCs were flourishing, upgrades and add-ons would be easy to obtain. Especially when it came to hardware, I wanted to have something that was standard.
I decided to jump over and cross the divide. And it would be fun getting a new CPU. That’s when I started selling off pieces of my Sinclair QL.
Now everybody has a book that has changed his or her life. The Holy Bible.
Das Kapital. Tuesdays With Morrie. Everything I Needed to Know I Learned in Kindergarten.
Whatever. (I sincerely hope that, having read the preface and my theory on The Meaning of Life, you will decide that this book does the trick for you.) The book that launched me to new heights was Operating Systems: Design and Implementation, by Andrew S. Tanenbaum.
I had already signed up for my fall courses, and the one I was most looking forward to was in the C programming language and the Unix operating system. In anticipation of the course, I bought the aforementioned textbook during the summer in the hope of getting a head start. In the book, Andrew Tanenbaum, a university professor in Amsterdam, discusses Minix, a small Unix clone he wrote as a teaching aid. Soon after reading the introduction, and learning the philosophy behind Unix and what the powerful, clean, beautiful operating system would be capable of doing, I decided to get a machine to run Unix on. I would run Minix, which was the only fairly useful version of Unix I could find.
As I read and started to understand Unix, I got a big enthusiastic jolt. Frankly, it’s never subsided. (I hope you can say the same about something.)
III
The academic year that began in the fall of 1990 was to be the first time that the University of Helsinki would have Unix, the powerful operating system that had been bred in AT&T’s Bell Labs in the late 1960s but had grown up elsewhere. In my first year of studies, we had a VAX running VMS. It was a horrible operating system, certainly not an environment that made you say, “Gee, I’d like to have this at home, too.” Instead it made you say, “Hmmm. How do you do that?”
It was hard to use. It didn’t have many tools. It wasn’t suited to easily accessing the Internet, which was running on Unix. You couldn’t even easily figure out how large a file was. Admittedly, VMS was very well suited for certain operations, like databases. But it’s not the kind of operating system that you get excited about.
The university had realized it was time to move away from all that. Much of the academic world was then growing enamored of Unix, so the university acquired a MicroVAX running Ultrix, which was Digital Equipment Corporation’s version of Unix. It was a way of testing the waters of Unix.
I was eager to work with Unix by experimenting with what I was learning in Andrew Tanenbaum’s book, excited about all the things I could explore if I had a 386 PC. There was no way I could get together the 18,000 FIM to buy one. I knew that once the fall semester began, I would be able to use my Sinclair QL to access the university’s new Unix computer until I could afford to buy a PC on which I could run Unix on my own.
So there were two things I did that summer. Nothing. And read the 719 pages of Operating Systems: Design and Implementation.
The red soft-cover textbook sort of lived on my bed.
The University of Helsinki sprang for a sixteen-user license for the MicroVAX. That meant admittance to the “C and Unix” course was limited to thirty-two students—I guess the thinking was that sixteen people would be using it by day, sixteen by night. Like the rest of us, the teacher was new to Unix. He admitted this up front, so it wasn’t really a problem. But he would read the text only one chapter ahead of the students, whereas the students were sometimes skipping ahead by three chapters. So it became something of a game in which people tried to trip up the teacher by asking questions that related to things we would be learning three chapters later, just to see if he had read that far.
We were all babes in the Unix woods, with a course that was being made up as we went along. But what was obvious from this course was that there was a unique philosophy behind Unix. You grasped this in the first hour of the course. The rest was explaining the details.
What is special about Unix is the set of fundamental ideals it strives for. It is a clean and beautiful operating system. It avoids special cases. Unix has the notion of processes—a process is anything that does anything. Here’s a simple example. In Unix the shell, the program you type commands into to talk to the operating system, is not built into the operating system the way it is in DOS. It’s just a task. Like any other task. It just happens that this task reads from your keyboard and writes back to your monitor. Everything that does something in Unix is a process. You also have files.
This simple design is what intrigued me, and most people, about Unix (well, at least us geeks). Pretty much everything you do in Unix is done with only six basic operations (called “system calls,” because they are the calls you make to the operating system to do things for you). And you can build up pretty much everything from those six basic system calls.
There’s the notion of “fork,” which is one of the fundamental Unix operations. When a process does a fork, it creates a complete copy of itself. That way, you have two copies that are the same. The child copy most often ends up executing another process—replacing itself with a new program. And that’s the second basic operation. Then you have four other basic system calls: open, close, read, and write—all designed to access files. Those six system calls make up the simple operations that comprise Unix.
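Just to make that concrete, here is a rough sketch in C of those six calls doing their thing. Nothing here is special or taken from anywhere in particular; the program and file paths (/bin/date, /etc/hostname) are only examples I picked for illustration:

    #include <fcntl.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        /* 1. fork: make a complete copy of this process */
        pid_t pid = fork();
        if (pid == 0) {
            /* 2. exec: the child replaces itself with a new program */
            execl("/bin/date", "date", (char *)0);
            _exit(1);              /* only reached if the exec failed */
        }
        wait(NULL);                /* let the child finish first */

        /* 3-6. open, read, write, close: the four file calls */
        int fd = open("/etc/hostname", O_RDONLY);
        if (fd >= 0) {
            char buf[256];
            ssize_t n;
            while ((n = read(fd, buf, sizeof buf)) > 0)
                write(STDOUT_FILENO, buf, (size_t)n);
            close(fd);
        }
        return 0;
    }

That is more or less what a shell does all day long: fork, exec, and move bytes between files.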
Sure, there are tons of other system calls to fill in all the details. But once you understand the six basic ones, you understand Unix. Because one of the beauties of Unix is realizing that you don’t need to have complex interfaces to build up something complex. You can build up any amount of complexity from the interactions of simple things. What you do is create channels of communication (called “pipes” in Unix-speak) between simple processes to create complex problem-solving.
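Here is a rough sketch of that idea, again in C: roughly what a shell might do behind the scenes when you type ls | wc -l. It needs a couple of calls beyond the basic six (pipe and dup2), but those are exactly the kind of detail-filling calls I mean; the shape is still just simple processes wired together:

    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        int fd[2];
        pipe(fd);                      /* fd[0] is the read end, fd[1] the write end */

        if (fork() == 0) {             /* first child: "ls" writes into the pipe */
            dup2(fd[1], STDOUT_FILENO);
            close(fd[0]);
            close(fd[1]);
            execlp("ls", "ls", (char *)0);
            _exit(1);
        }
        if (fork() == 0) {             /* second child: "wc -l" reads from the pipe */
            dup2(fd[0], STDIN_FILENO);
            close(fd[0]);
            close(fd[1]);
            execlp("wc", "wc", "-l", (char *)0);
            _exit(1);
        }
        close(fd[0]);                  /* the parent keeps neither end open */
        close(fd[1]);
        wait(NULL);                    /* and simply waits for both children */
        wait(NULL);
        return 0;
    }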
An ugly system is one in which there are special interfaces for everything you want to do. Unix is the opposite. It gives you the building blocks that are sufficient for doing everything. That’s what having a clean design is all about.
It’s the same thing with languages. The English language has twenty-six letters and you can build up everything from those letters. Or you have the Chinese language, in which you have one letter for every single thing you can think of. In Chinese, you start off with complexity, and you can combine complexity in limited ways. That’s more of the VMS approach, to have complex things that have interesting meanings but can’t be used in any other way. It’s also the Windows approach.
Unix, on the other hand, comes with a small-is-beautiful philosophy. It has a small set of simple basic building blocks that can be combined into something that allows for infinite complexity of expression.
This, by the way, is also how physics works. You try and find the fundamental rules that are supposed to be fairly simple. The complexity comes from the many incredible interactions you get from those simple rules, not from any inherent complexity of the rules themselves.
The simplicity of Unix did not just happen on its own. Unix, with its notion of simple building blocks, was painstakingly designed and written by Dennis Ritchie and Ken Thompson at AT&T’s Bell Labs. And you should absolutely not dismiss simplicity as something easy. It takes design and good taste to be simple.
To go back to the example of human languages: Pictorial writing like Chinese characters and hieroglyphics tends to come first, and to be “simpler,” whereas the building-block approach requires far more abstract thinking. In the same way, you should not confuse the simplicity of Unix with a lack of sophistication—quite the reverse.
Which is not to say that the original reasons for Unix were all that sophisticated. Like so many other things in computers, it was all about games. It took somebody who wanted to play computer games on a PDP-7. Because that was what Unix started out being developed for—Dennis and Ken’s personal project for playing Space Travel. And because the operating system wasn’t considered a serious project, AT&T didn’t think of it as a commercial venture. In fact, AT&T was a regulated monopoly, and one of the things it couldn’t do was sell computers anyway. So the people who created Unix made it available quite freely along with source licenses, especially to universities. It wasn’t a big deal.
This all led to Unix becoming a big project in academic circles. By the time of the 1984 breakup, when AT&T was finally allowed to get into the computer business, computer scientists at universities—particularly the University of California-Berkeley—had been working on and improving Unix for years under the direction of people like Bill Joy and Marshall Kirk McKusick. People hadn’t always put a lot of effort into documenting what they did.
But by the early 1990s, Unix had become the number-one operating system for all supercomputers and servers. It was huge business. One of the problems was that there were, by now, a host of competing versions of the operating system. Some were derived from the more controlled confines of the AT&T code base (the so-called “System V” flavors), while others were derived from the University of California-Berkeley code-base BSD (Berkeley Software Distribution). Yet others were a mixture of the two.
One BSD derivation in particular is worth mentioning. It was the 386BSD project done by Bill Jolitz based on the BSD code-base, distributed over the Internet. It was later to fragment and become the freely available BSD flavors—NetBSD, FreeBSD, and OpenBSD—and it was getting a lot of attention in the Unix community.
That’s why AT&T woke up and sued the University of California-Berkeley. The original code had been AT&T’s, but most of the subsequent work had been done at Berkeley. The University of California regents contended that they had the right to distribute, or sell for a nominal fee, their version of Unix. And they demonstrated that they had done so much work that they had essentially rewritten what AT&T had made available. The suit ended up being settled after Novell, Inc., bought Unix from AT&T. Essentially, the few remaining parts that derived from AT&T’s code had to be excised from what Berkeley distributed.
Meanwhile, all the legal haggling had been instrumental in giving a new kid on the block some time to mature and spread itself. Basically, it gave Linux time to take over the market. But I’m getting ahead of myself.
Since I’m digressing anyway, I’d like to explain something. Unix has this reputation for being a magnet for the eccentric fringe of computing. It’s a reputation not worth arguing against. It’s true.
Frankly, there are a lot of fairly crazy people in Unix. Not postal-rage crazy. Not poison-the-neighbor’s-dog crazy. Just very alternative-lifestyle people.
Remember, much of the initial Unix activity took place in the late 1960s and early 1970s, while I was sleeping in a laundry basket in my grandparents’ apartment. These were flower power people—but technical flower power people. A lot of the Unix-must-be-free philosophy has more to do with the circumstances of the time than with the operating system itself. It was a time of rampant idealism. Revolution. Freedom from authority. Free love (which I missed out on, and probably wouldn’t have known what to do with, anyway). And the relative openness of Unix, even if it was mainly due to the lack of commercial interests of the time, made it attractive to this kind of person.
The first time I was introduced to this side of Unix was probably in 1991 or so, when Lars Wirzenius dragged me along to an event at the Polytechnic University of Helsinki (which, as everybody knows, is not actually in Helsinki but right across the border in Espoo; they just want to be associated with glamorous Helsinki, even if only by name). The speaker was Richard Stallman.
Richard Stallman is the God of Free Software. He started to work on an alternative to Unix in 1984, calling it the GNU system. GNU stands for “GNU’s Not Unix,” making it one of many recursive acronyms, in which one of the letters stands for the acronym itself—a kind of computer science in-joke that nobody else ever gets. Geeks—we’re just tons of fun to be around.
More importantly, RMS, as he prefers to be called, also wrote the Free Software Manifesto and the Free Software copyright license—the GPL (General Public License). Basically, he pioneered the notion of free source-code availability as something intentional, not just an accident, the way it had happened with the open development of the original Unix.
I have to admit that I wasn’t much aware of the sociopolitical issues that were—and are—so dear to RMS. I was not really all that aware of the Free Software Foundation, which he founded, and all that it stood for. Judging from the fact that I don’t remember much about the talk back in 1991, it probably didn’t make a huge impact on my life at that point. I was interested in the technology, not the politics—I had had enough politics at home. But Lars was an ideologist, and I tagged along and listened.
In Richard I saw, for the first time in my life, the stereotypical longhaired, bearded hacker type. We don’t much have them in Helsinki.