
History of Free Software

Authors: Lloyd Hardy, Richard Stallman

Revision: 1.8 (2014/12/22)

3.0 – History of Free Software WORKING DRAFT

The history of software and freedom is so little known that many IT professionals regard proprietary software as software "in its natural state". However, the situation is rather the opposite: the seeds of the change that first became visible globally in the first decade of the 21st century had already been sown in the early 1980s. Since the first computer program was written, software has always been either free or non-free; however, Free Software as a formal definition did not appear until the beginning of the 1980s. The story of how Free Software became formally defined and articulated is an interesting one that helps us to understand why it is important to protect our freedom.

The roles of Richard Stallman, the GNU Project and the Free Software Foundation in the Free Software movement are pivotal; through understanding the person, the project and the foundation, we will understand more about the history of Free Software.

 

3.1 – The Story of the Hacker and the Xerox Printer

When someone tries to stop a hacker from hacking [2], it usually evokes a different response from the one the oppressor intends. Hacking is a way of having fun, experimenting or exploring new ideas, proposing new methods and overcoming new challenges. Hacking is about answering the question:

“What if I... [insert activity to which to apply ingenuity and make smiles]?”

So, when a hacker thinks of a hack, such as “the printer keeps jamming, I'll hack the code and let it alert us if it's jammed”, and the oppressor responds by denying the hacker access to the code, it is unlikely that the hacker will give up on the hack.

This makes the Free Software movement the biggest hack ever. “What if I... challenged the entire proprietary software development sector and educated all of the users in the world on their rights and consumer power to demand free software?” Now, that'd be a cool hack. This is what the Free Software movement could be considered: the ultimate hack.

Richard Stallman was denied access to the source code of a printer that had been given as a gift (a trojan horse, in effect) and ran proprietary software. His intention was not to cause any harm to the copyright holder; it was the opposite: to help solve a problem, and if anything to assist the copyright holder by providing an added feature, attempting to patch (or at least alleviate the effects of) an error in the copyright holder's work. However, the copyright holder had chosen to do something very destructive to the software development community.

The copyright-holding vendor had secured an agreement with the programmers it employed that legally obliged them not to disclose details of the software's development, including the source code (an NDA, or Non-Disclosure Agreement). This was shocking and hurtful to Richard Stallman, as he sadly found out when he tried to obtain the source code to fix the problem.

As time progressed, it was not just the printer that came under attack from proprietary software. A thriving community of innovators, sharing source code and ideas, was turned into a divided community, physically and logically.

 

3.1.2 – Free Software Movement

Richard was born in New York City in the USA in 1953. An exceptional student, he graduated magna cum laude (with great honour) with a BA in physics from Harvard University in 1974. He enrolled as a graduate student at MIT (the Massachusetts Institute of Technology), but ended his pursuit of a doctorate in physics to focus on programming at MIT's AI (Artificial Intelligence) Laboratory [1].

During the sixties, the IT landscape was dominated by large computers, deployed in companies and governmental institutions. IBM was the leading manufacturer, way ahead of its competition. During this period, software was included when buying computer hardware: as long as the maintenance contract was paid for, access was given to the manufacturer's software catalogue. Moreover, the idea of programs being something "separate" from a commercial point of view was uncommon.

In this period, software was normally distributed together with its source code (in many cases just as source code), and in general, with no practical restrictions. However, by the mid-1970s it was already common to find proprietary software.

This meant an enormous cultural change among professionals who worked with software, and it marked the beginning of a proliferation of companies dedicated to this new business model. It would still be almost a decade before what we now know as Free Software was born as a definition in response to this new proprietary threat to freedom.

(Section abridged from FTA 'Introduction to Free Software', p16-p17)

Many of the MIT programmers, or 'hackers' as they were affectionately known, became involved in a proprietary venture. This work unfortunately came with a non-sharing agreement, and the venture eventually became the main vendor for mainframe resources.

However, a counter-initiative evolved that did not aim to limit users' rights, but instead followed a philosophy of sharing ideas and code. This group found it difficult to keep up with the feature set developed by the proprietary group, as that group had greater resources.

Richard Stallman worked night and day to code the same features into the non-proprietary version. He did not read the code from the proprietary vendor (his former friends and peers); this was to ensure that his code was entirely original. Because it was an original work, the copyright belonged to Richard, which allowed him to license it as he saw fit.

 

3.2 – The GNU Project

At the beginning of 1984, Richard Stallman left his job at the MIT AI Lab to start working on the GNU Project. The goal of the GNU Project was to create an entirely Free operating system similar to Unix. The GNU Project was so named as a funny, recursive acronym: 'GNU's Not Unix'.

UNIX was the operating system you used back in the 1970s to get anything done. This was years before Microsoft developed Windows and a long time before GNU/Linux was designed. One of the first portable operating systems, it was originally created by Ken Thompson and Dennis Ritchie (among others) at AT&T's Bell Labs. Towards the end of the 1970s, and especially during the 1980s, AT&T changed its policy, and access to new versions of Unix became difficult and expensive. The philosophy of the early years that had made Unix so popular among developers changed radically, to such an extent that in 1991 AT&T even tried to sue the University of California at Berkeley for publishing the Unix BSD code that Berkeley's CSRG had created.

Richard didn't like the way that his refusal to sign exclusivity or non-sharing agreements made him an outcast in his own world, nor how the use of proprietary software in his environment left him impotent in the face of situations that could previously have been resolved easily. The GNU Project included in its system software that was already available, but there was still a lot to be built. Richard started by writing a C compiler (GCC) and an editor (Emacs), both of which are still in popular use today.

To ensure that people who received programs from the GNU Project, and from subsequent distributions, had the same rights (modification, redistribution, etc.), the GNU General Public License (GPL) was created. Richard Stallman called the generic mechanism that these GPL-type licences use to achieve these guarantees 'copyleft'.

(Section 3.2 abridged from FTA 'Introduction to Free Software', p19-p21)
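In practice, a developer applies copyleft by releasing a program under the GPL and placing the licence notice that the FSF recommends at the top of each source file. The short C sketch below is purely illustrative and is not taken from the FTA text; the file name, the trivial program and the angle-bracket placeholders are hypothetical, and the notice shown is the one recommended for the current version of the licence.

    /* frob.c - <one line to give the program's name and a brief idea of what it does.>
       Copyright (C) <year>  <name of author>

       This program is free software: you can redistribute it and/or modify
       it under the terms of the GNU General Public License as published by
       the Free Software Foundation, either version 3 of the License, or
       (at your option) any later version.

       This program is distributed in the hope that it will be useful,
       but WITHOUT ANY WARRANTY; without even the implied warranty of
       MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
       GNU General Public License for more details.

       You should have received a copy of the GNU General Public License
       along with this program.  If not, see <http://www.gnu.org/licenses/>.  */

    #include <stdio.h>

    int main(void)
    {
        /* It is the notice above, not this code, that guarantees every
           recipient the freedom to use, study, share and modify the work. */
        printf("This program is free software.\n");
        return 0;
    }

It is this notice, combined with access to the source code, that turns an ordinary copyrighted work into a copylefted one: anyone who redistributes the program, modified or not, must pass the same freedoms along.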

 

3.3 – The Free Software Foundation

The Free Software Foundation was founded by Richard Stallman in 1985 to provide financial support for the production, and later the protection, of Free Software. The foundation was initially used to employ programmers to work on GNU projects. Today the FSF is central to Free Software advocacy and project organisation, and it is affiliated with the Software Freedom Law Center, which provides guidance and assistance in legal matters associated with Free Software licensing. The FSF describes its mission as follows:

The Free Software Foundation (FSF) is a non-profit that states it has "a worldwide mission to promote computer user freedom and to defend the rights of all free software users" [5].

The philosophy of free software is one of control over our own computing and of freedom to share among a community of ethical programmers: “The free software movement is one of the most successful social movements to emerge in the past 25 years, driven by a worldwide community of ethical programmers dedicated to the cause of freedom and sharing. But the ultimate success of the free software movement depends upon teaching our friends, neighbours and work colleagues about the danger of not having software freedom, about the danger of a society losing control over its computing.” [5]

 

3.4 – GNU/Linux

Six years after the GNU Project was started, it had a nearly complete operating system similar to Unix. There was just one important part of the system missing: the kernel, which shares the machine's resources among the other programs. In 1991, Linus Torvalds began a project to create a kernel and subsequently licensed it under the GPLv2. This then became the chosen kernel for the GNU Project, creating an operating system that became known as GNU+Linux (GNU plus Linux) or GNU/Linux (GNU slash Linux).

(Section abridged from FTA 'Introduction to Free Software', p21)

GNU+Linux is often incorrectly referred to as 'Linux', but this does not recognise the true history and the majority contribution of the programmers of the GNU Project. Linux is just the kernel of the operating system and cannot do anything by itself. Linus Torvalds himself wrote, in the notes for the first release of the Linux kernel: “Sadly, a kernel by itself gets you nowhere. To get a working system you need a shell, compilers, a library etc. These are separate parts and may be under a stricter (or even looser) copyright. Most of the tools used with linux are GNU software and are under the GNU copyleft. These tools aren't in the distribution - ask me (or GNU) for more info.” [6]

The complete operating system we use today in popular distributions - such as gNewSense, Ubuntu, Debian, Fedora and Slackware - is actually GNU + Linux + other software.

Free Software professionals understand that GNU software makes up more of the total lines of source code in a current GNU/Linux distribution than any other single source, including the Linux kernel: “One CD-ROM vendor found that in their “Linux distribution”, GNU software was the largest single contingent, around 28% of the total source code, and this included some of the essential major components without which there could be no system. Linux itself was about 3%. (The proportions in 2008 are similar: in the “main” repository of gNewSense, Linux is 1.5% and GNU packages are 15%.)” [7]

The GNU Project has been working on its tools since 1984, and it wasn't until 1991 that the Linux kernel was written. We can learn a little more about the project from Richard Stallman:

“The GNU Project was not, is not, a project to develop specific software packages. It was not a project to develop a C compiler, although we did that. It was not a project to develop a text editor, although we developed one. The GNU Project set out to develop a complete free Unix-like system: GNU.

Many people have made major contributions to the free software in the system, and they all deserve credit for their software. But the reason it is an integrated system—and not just a collection of useful programs—is because the GNU Project set out to make it one. We made a list of the programs needed to make a complete free system, and we systematically found, wrote, or found people to write everything on the list. We wrote essential but unexciting (1) components because you can't have a system without them. Some of our system components, the programming tools, became popular on their own among programmers, but we wrote many components that are not tools (2). We even developed a chess game, GNU Chess, because a complete system needs games too.

By the early 90s we had put together the whole system aside from the kernel. We had also started a kernel, the GNU Hurd, which runs on top of Mach. Developing this kernel has been a lot harder than we expected; the GNU Hurd started working reliably in 2001, but it is a long way from being ready for people to use in general.

Fortunately, we didn't have to wait for the Hurd, because of Linux. Once Torvalds wrote Linux, it fit into the last major gap in the GNU system. People could then combine Linux with the GNU system to make a complete free system: a Linux-based version of the GNU system; the GNU/Linux system, for short.

Making them work well together was not a trivial job. Some GNU components(3) needed substantial change to work with Linux. Integrating a complete system as a distribution that would work “out of the box” was a big job, too. It required addressing the issue of how to install and boot the system—a problem we had not tackled, because we hadn't yet reached that point. Thus, the people who developed the various system distributions did a lot of essential work. But it was work that, in the nature of things, was surely going to be done by someone.

The GNU Project supports GNU/Linux systems as well as the GNU system. The FSF funded the rewriting of the Linux-related extensions to the GNU C library, so that now they are well integrated, and the newest GNU/Linux systems use the current library release with no changes. The FSF also funded an early stage of the development of Debian GNU/Linux.

Today there are many different variants of the GNU/Linux system (often called “distros”). Most of them include non-free software—their developers follow the philosophy associated with Linux rather than that of GNU. But there are also completely free GNU/Linux distros. The FSF supports computer facilities for two of these distributions, Ututo and gNewSense.

Making a free GNU/Linux distribution is not just a matter of eliminating various non-free programs. Nowadays, the usual version of Linux contains non-free programs too. These programs are intended to be loaded into I/O devices when the system starts, and they are included, as long series of numbers, in the "source code" of Linux. Thus, maintaining free GNU/Linux distributions now entails maintaining a free version of Linux too.

Whether you use GNU/Linux or not, please don't confuse the public by using the name “Linux” ambiguously. Linux is the kernel, one of the essential major components of the system. The system as a whole is basically the GNU system, with Linux added. When you're talking about this combination, please call it “GNU+Linux” or “GNU/Linux” or “GNU with Linux Added”.” [7]

 

3.5 – Free Software Today

By the 2000s, Free Software was used by millions of people all over the world. GNU+Linux alone was used by around 30 million people; however, Free Software reaches much further, both directly and indirectly. Free Software is about freedom, not technical development or stability. These things are important for useful computer software, but Free Software doesn't even have to be useful to be Free. One might argue that simply by being Free, it is useful in some way.

However, by the 2000s, public adopters of Free Software through GNU+Linux included the NSA (National Security Agency), US Navy nuclear submarines, the NYSE (New York Stock Exchange) and various government agencies across the world. Computer hobbyists, charitable organisations and not-for-profits used Free Software, along with millions of others.

In the late 2000s, Android was developed as a mobile operating system built around the Linux kernel and released largely as free software. By the start of 2010, Android already had 23.8 million users. Today it is hard to say how many people are using free or proprietary software; there are no official figures and no realistic way of measuring this accurately. However, what we can observe is that more and more developers are contributing to Free Software and more users are enjoying freedom. Academic institutions can benefit from the use of Free Software, as it allows students to study and share the source code.

By understanding more about the history of the Free Software movement, we can see how the choices we face today reflect those in the past. In many of our cellphones, video game consoles and even vehicles, proprietary software is embedded. Even as free software developers, we face further challenges. Rejecting proprietary software may not be the 'easy' thing to do and accepting the status quo may at first glance appear attractive, but it is a trojan horse that comes with more conditions than we would wish for.

When we acquire a new electronic device, we can ask the vendor if it contains proprietary software and reject it if it does. If we accept those restrictions, then we are making an agreement with the vendor not to share with our community. This is morally wrong. We are the consumers, and if we demand free software, companies will have no option but to supply it (or fail in business).

Imagine our society looking back in 20 or 30 years, the same way we look back at MIT in the 1970s and wonder why academic institutions invested in the restriction of knowledge. Won't future society be just as confused as to why we, as consumers, believed in the need to accept proprietary software?

 

References

 

[1] http://en.wikipedia.org/wiki/Richard_Stallman

[2] http://stallman.org/articles/on-hacking.html

[3] http://www.gnu.org/

[4] http://ftacademy.org/materials/fsm/1#1 (Accessed: 2011/02/05)

[5] http://www.fsf.org/about (Accessed: 2010/07/09)

[6] http://www.kernel.org/pub/linux/kernel/Historic/old-versions/RELNOTES-0.01

[7] http://www.gnu.org/gnu/linux-and-gnu.html (Accessed: 2011/04/16)

 

 

Further Reading