
1.7. Open Source and the Philosophy of Linux

When new users encounter Linux, they often have a few misconceptions and false expectations of the system. Linux is a unique operating system, and it's important to understand its philosophy and design in order to use it effectively. At the center of the Linux philosophy is a concept that we now call Open Source Software.

Open Source is a term that applies to software for which the source code--the inner workings of the program--is freely available for anyone to download, modify, and redistribute. Software covered under the GNU GPL, described in the previous section, fits into the category of Open Source. Not surprisingly, though, so does a lot of other software that uses copyright licenses similar, but not identical, to the GPL. For example, software that can be freely modified but that does not have the same strict requirements for redistribution as the GPL is also considered Open Source.

The so-called "Open Source development model" is a phenomenon that started with the Free Software Foundation and was popularized with Linux. It's a totally different way of producing software that opens up every aspect of development, debugging, testing, and study to anyone with enough interest in doing so. Rather than relying upon a single corporation to develop and maintain a piece of software, Open Source allows the code to evolve, openly, in a community of developers and users who are motivated by the desire to create good software, rather than simply to make a profit.

O'Reilly & Associates, Inc., has published a book, Open Sources, which serves as a good introduction to the Open Source development model. It's a collection of essays about the Open Source process by leading developers (including Linus Torvalds and Richard Stallman) and was edited by Chris DiBona, Sam Ockman, and Mark Stone.

Open Source has received a lot of media attention, and some are calling the phenomenon the "next wave" in software development, which will sweep the old way of doing things under the carpet. It still remains to be seen whether that will happen, but there have been some encouraging events that make this outcome seem not so unlikely. For example, Netscape Corporation has released the code for their web browser as an Open Source project called Mozilla, and companies such as Sun Microsystems, IBM, and Apple have announced plans to release certain products as Open Source in the hopes that they will flourish in a community-driven software development effort.

Linux is at the center of that attention. In order to understand where the Linux development mentality comes from, however, it makes sense to take a look at how commercial Unix systems have traditionally been built.

In commercial Unix development houses, the entire system is developed with a rigorous policy of quality assurance, source and revision control, documentation, and bug reporting and resolution. Developers are not allowed to add features or to change key sections of code on a whim: they must validate the change as a response to a bug report and consequently "check in" all changes to the source control system, so that the changes can be backed out if necessary. Each developer is assigned one or more parts of the system code, and only that developer may alter those sections of the code while they are "checked out."
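
To make the check-out/check-in cycle concrete, here is a minimal sketch using the classic RCS tools, which are freely available on Linux. Commercial houses use their own, generally far more elaborate, source control systems; the filename here is hypothetical.

    $ co -l driver.c    # check out the file with a lock, so only we may edit it
    $ vi driver.c       # make the change, in response to a bug report
    $ ci -u driver.c    # check the change back in; RCS prompts for a log message
                        #   and keeps the old revision, so the change can be backed out

The point of the lock-edit-check-in discipline is that every change is recorded and reversible, and no two developers can silently clobber each other's work.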

Internally, the quality assurance department runs rigorous test suites (so-called "regression tests") on each new pass of the operating system and reports any bugs. It's the responsibility of the developers to fix these bugs as reported. A complicated system of statistical analysis is employed to ensure that a certain percentage of bugs are fixed before the next release, and that the operating system as a whole passes certain release criteria.

In all, the process used by commercial Unix developers to maintain and support their code is very complicated, and quite reasonably so. The company must have quantitative proof that the next revision of the operating system is ready to be shipped--hence the gathering and analyzing of statistics about the operating system's performance. It's a big job to develop a commercial Unix system, often large enough to employ hundreds (if not thousands) of programmers, testers, documenters, and administrative personnel. Of course, no two commercial Unix vendors are alike, but you get the general picture.

With Linux, you can throw out the entire concept of organized development, source control systems, structured bug reporting, or statistical analysis. Linux is, and more than likely always will be, a hacker's operating system.[11]

[11]What we mean by "hacker" is a feverishly dedicated programmer--a person who enjoys exploring computers and generally doing interesting things with them. This is in contrast to the common connotation of "hacker" as a computer wrongdoer or outlaw.

Linux is primarily developed as a group effort by volunteers on the Internet from all over the world. There is no single organization responsible for developing the system. For the most part, the Linux community communicates via various mailing lists and web sites. A number of conventions have sprung up around the development effort: for example, programmers wanting to have their code included in the "official" kernel should mail it to Linus Torvalds. He will test the code and, as long as it doesn't break things or go against the overall design of the system, more than likely include it in the kernel.
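
By convention, a change to the kernel is submitted not as a whole modified source tree but as a patch--a unified diff containing only the changes themselves. A sketch, with hypothetical directory and file names:

    $ diff -urN linux.orig linux.new > newdriver.patch   # record only the changes
    $ less newdriver.patch                               # review it before mailing

The recipient can apply the patch to a pristine source tree with the patch utility (for example, patch -p1 < newdriver.patch from the top of the tree).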

The system itself is designed with a very open-ended, feature-rich approach. While the number of new features and critical changes to the system has diminished recently, the general rule is that a new version of the kernel is released every few weeks (sometimes even more frequently). Of course, this is a very rough figure; it depends on several factors, including the number of bugs to be fixed, the amount of feedback from users testing prerelease versions of the code, and the amount of sleep that Linus has had that week.

Suffice it to say that not every single bug has been fixed and not every problem ironed out between releases. As long as the system appears to be free of critical or oft-manifesting bugs, it's considered "stable" and new revisions are released. The thrust behind Linux development is not an effort to release perfect, bug-free code; it's to develop a free implementation of Unix. Linux is for the developers, more than anyone else.

Anyone who has a new feature or software application to add to the system generally makes it available in an "alpha" stage--that is, a stage for testing by those brave users who want to bash out problems with the initial code. Because the Linux community is largely based on the Internet, alpha software is usually uploaded to one or more of the various Linux web sites (see Appendix A, "Sources of Linux Information"), and a message is posted to one of the Linux mailing lists about how to get and test the code. Users who download and test alpha software can then mail results, bug fixes, or questions to the author.
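
Testing an alpha release usually means building it from source. Assuming the package follows the common tar-and-Makefile convention (the package name here is entirely hypothetical), the cycle might look like:

    $ tar xzf frobnicator-0.1alpha.tar.gz   # unpack the source
    $ cd frobnicator-0.1alpha
    $ make                                  # build it (read the README first)
    $ ./frobnicator                         # try it out, then mail results to the author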

After the initial problems in the alpha code have been fixed, the code enters a "beta" stage, in which it's usually considered stable but not complete (that is, it works, but not all of the features may be present). Otherwise, it may go directly to a "final" stage in which the software is considered complete and usable. For kernel code, once it's complete, the developer may ask Linus to include it in the standard kernel, or as an optional add-on feature to the kernel.

Keep in mind these are only conventions--not rules. Some people feel so confident with their software that they don't need to release an alpha or test version. It's always up to the developer to make these decisions.

What happened to regression testing and the rigorous quality process? It has been replaced by the philosophy of "release early and release often." The users are the best testers, because they try out the software in a variety of environments and a host of demanding real-life applications that can't be easily duplicated by any software Quality Assurance group. One of the best features of this development and release model is that bugs (and security flaws) are often found, reported, and fixed within hours--not days or weeks.

You might be amazed that such a nonstructured system of volunteers programming and debugging a complete Unix system could get anything done at all. As it turns out, it's one of the most efficient and motivated development efforts ever employed. The entire Linux kernel was written from scratch, without employing any code from proprietary sources. A great deal of work was put forth by volunteers to port all the free software under the sun to the Linux system. Libraries were written and ported, filesystems developed, and hardware drivers written for many popular devices.

The Linux software is generally released as a distribution, which is a set of prepackaged software making up an entire system. It would be quite difficult for most users to build a complete system from the ground up, starting with the kernel, then adding utilities, and installing all necessary software by hand. Instead, a number of software distributions include everything you need to install and run a complete system. Again, there is no standard distribution; there are many, each with its own advantages and disadvantages. In this book, we describe how to install the Red Hat, SuSE, and Debian distributions, but the book can help you with any distribution you choose.

Despite the completeness of the Linux software, you still need a bit of Unix know-how to install and run a complete system. No distribution of Linux is completely bug-free, so you may be required to fix small problems by hand after installation.

1.7.1. Hints for Unix Novices

Installing and using your own Linux system doesn't require a great deal of background in Unix. In fact, many Unix novices successfully install Linux on their systems. This is a worthwhile learning experience, but keep in mind that it can be very frustrating to some. If you're lucky, you will be able to install and start using your Linux system without any Unix background. However, once you are ready to delve into the more complex tasks of running Linux--installing new software, recompiling the kernel, and so forth--having background knowledge in Unix is going to be a necessity. (Note, however, that many distributions of Linux are as easy to install and configure as Windows 98 and certainly easier than Windows NT.)

Fortunately, by running your own Linux system, you will be able to learn the essentials of Unix necessary to perform these tasks. This book contains a good deal of information to help you get started. Chapter 4, "Basic Unix Commands and Concepts" is a tutorial covering Unix basics, and Chapter 5, "Essential System Management" contains information on Linux system administration. You may wish to read these chapters before you attempt to install Linux at all; the information contained therein will prove to be invaluable should you run into problems.

Nobody can expect to go from being a Unix novice to a Unix system administrator overnight. No implementation of Unix runs trouble-free or maintenance-free, so you must be well prepared for the journey that lies ahead. Otherwise, if you're new to Unix, you may become overly frustrated with the system.

1.7.2. Hints for Unix Gurus

Even people with years of Unix programming and system-administration experience may need assistance before they are able to pick up and install Linux. There are still aspects of the system that Unix wizards need to be familiar with before diving in. For one thing, Linux is not a commercial Unix system. It doesn't attempt to uphold the same standards as other Unix systems you may have come across; in some sense, Linux is redefining the Unix world by giving all other systems a run for their money. To be more specific, while stability is an important factor in the development of Linux, it's not the only factor.

More important, perhaps, is functionality. In many cases, new code will make it into the standard kernel even though it's still buggy and not functionally complete. The assumption is that it's more important to release code that users can test and use than delay a release until it's "complete." Nearly all Open Source software projects have an alpha release before they are completely tested. In this way, the Open Source community at large has a chance to work with the code, test it, and develop it further, while those who find the alpha code "good enough" for their needs can use it. Commercial Unix vendors rarely, if ever, release software in this manner.

If you have been a Unix system administrator for more than a decade and have used every commercial Unix system under the sun (no pun intended), Linux may take some getting used to. The system is very modern and dynamic, with new kernel releases every few months and new utilities constantly appearing. One day your system may be completely up to date with the current trend, and the next day the same system is considered to be in the Stone Age.

With all of this dynamic activity, how can you expect to keep up with the ever-changing Linux world? For the most part, it's best to upgrade incrementally; that is, upgrade only those parts of the system that need upgrading, and then only when you think an upgrade is necessary. For example, if you never use Emacs, there is little reason to install every new release of Emacs on your system. Furthermore, even if you are an avid Emacs user, there is usually no reason to upgrade unless the next release contains a feature you need. There is little or no reason to always be on top of the newest version of every piece of software.
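
For instance, on an RPM-based distribution such as Red Hat (one of the distributions covered in this book), upgrading a single package in place might look like the following; the package filename is hypothetical:

    $ rpm -q emacs                     # see which version is currently installed
    $ rpm -Uvh emacs-20.4-1.i386.rpm   # upgrade just this package, nothing else

Debian users would use dpkg or apt-get for the same job; the principle--upgrading one piece at a time--is the same.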

We hope that Linux meets or exceeds your expectations of a home-brew Unix system. At the very core of Linux is the spirit of free software, of constant development, and of growth. The Linux community favors expansion over stability, and that is a difficult idea to accept for many people, especially those steeped in the world of commercial Unix. You can't expect Linux to be perfect--nothing ever is in the free software world (or any other world, for that matter). However, we believe that Linux really is as complete and useful as any other implementation of Unix.


