In 1984 Apple Computer introduced to the world an operating system designed for real people who want to get real work done, and changed the world of computing -- hopefully forever. But by the time memory and disk drive space were big enough to do useful work on it, only its users fully understood the power and value of the Macintosh operating system.
In 2002 Apple officially killed the Macintosh operating system and went back to a system designed by geeks for geeks. If you must use unix, Apple's OSX is arguably the best one available today, but it's not as easy to use as Windows, nor even as easy as Apple's own failed A/UX. The Macintosh vision lives on, crippled somewhat by vendors and developers who really don't get it but are smart enough to build something that the users want. It's called WindowsXP and it's a far cry from what the Macintosh was, but it's better than anything else, especially including Apple's own OSX.
What is it that made the Mac different? Direct manipulation, plain and simple. What you see is what you get; there ain't nothing else under the hood to make it hard to use. Everything that needs to be done can be done by clicks and drags; typing is for data and data alone. Five or ten years ago the difference between a Mac user and a PC user was stark: Mac users had a job to do and their computer was an unobtrusive tool to get the job done, while PC users liked to fiddle with their computer. That's less true today, probably because more and more you have to fiddle with the Mac also. For the last two years, all the traditionally Mac-oriented magazines have been filled with articles on how to open a terminal window to type in arcane commands that do what a couple of clicks and a drag used to do in the original Mac OS -- or else didn't need doing at all. And the creative people who had been hard-core Mac addicts held out, using OS 9 to the bitter end and refusing to "upgrade" to OSX.
The Mac is gone and Apple is running on fumes in the gas tank, plus some innovative but easily copied industrial design. However, millions of computer users and potential users are still out there, resenting the fact that computers are too complicated to use. What they see is Windows and Linux/unix, and they are right to resent it. But it should not be so. Microsoft, while not really understanding the driving force, is still incrementally moving in the right direction.
We can do better than that.
The automobile was invented near the end of the 19th century. It was noisy, expensive, and hard to use. Only rich geeks could afford one -- and wanted to. Henry Ford changed that, by making the car affordable and a little (but not much) less geeky. You still had to crank it to start it, and double-clutch to change gears. All kinds of things could go wrong and often did. Then somebody invented the electric starter, and somebody else invented the automatic transmission. Now every person can drive. The geekiest of drivers today get a car with a stick shift, but nobody hand-cranks it to start. They don't even have a hole to put a crank in.
Forty years ago, when I started using computers, they were all command-line batch systems, the equivalent of hand-crank stick-shift automobiles from the 19th century. Then Apple gave us an electric starter and automatic transmission in one fell swoop. Just like the auto manufacturers 70 years before them, Microsoft copied the look and feel -- they added an electric starter and took out the clutch, but you still have to shift gears... also they left in the hole for the crank, perhaps to make the geeks feel good. Apple has abandoned their innovations and gone back to a Tin Lizzie: it has a metal body like the modern cars, but still runs like the 19th-century hand-crank double-clutch shifters.
Where are unix and Linux in all this? They are still stuck in the 19th century. But oh how much fun they have tinkering with their system! What is it that makes Linux and Unix adherents so loyal? That's another story.
Wake up, it's now the 21st century! Direct manipulation is where it's at. Command lines are dead (or should be).
In 38 years of making a living off of computers, I have never once met an application that was inherently easier and/or faster to drive from a command line than by a direct manipulation (graphical) user interface. I have seen some very badly done GUIs -- some even this year -- which are harder to use than a well-built command line interface, but a mediocre GUI will beat out the best of CLIs in performance and usability. By performance I mean the total time to get a job done, or else the total time it takes the user to think about getting the job done, and only after that the total number of CPU cycles it takes.
This challenge is addressed to the OpenSource community, primarily Linux geeks. That's not a put-down; as the joke goes, "I are one" too. Although I have not yet personally experienced it, I'm willing to concede that Linux is probably the most stable and robust of systems available today. And for the 90% of all geeks still stuck in the 19th century, Enjoy!
To the remaining 10% or 5% or 2% who would like to see Linux eat Microsoft's lunch, here's how we/you can make it happen. Linux is already eating Windows for breakfast in the server market. That's because the Windows server is complicated to set up and administer, requiring a high-priced guru with 6 months of specialized training. To run any kind of Linux system at all requires an equally high-priced guru with a lot of training, so the manpower requirements are the same; the remaining cost of server software is entirely on the Microsoft side. The cost there easily overcomes the fear of buying Linux (one copy, at $25) from small and brittle vendors.
The desktop is something else. Most desktop computer users are not computer experts, and don't want to be. ALL Linux users must be computer experts. Unless the system is usable by your mother without a manual, it's not going to make it on the office desktop. Windows is mostly usable; it still requires some guru consultant help -- perhaps 5-20 hours per year (the median Mac guru requirement was closer to 1 hour/year, but no longer). If Linux or a Linux derivative could get its guru requirement below that of Windows, Microsoft would be locked out of the lunchroom.
1. NO COMMAND LINE AT ALL. There was no command line in the Mac OS, and there's only a fake command line in WindowsXP: you can bring up a DOS prompt in a separate window, which I think runs in emulation on top of the system, and there are command-line options in some shortcuts. All this amounts to a hole in the front bumper to stick a hand-crank into, but it's not connected to the motor. There are some profound, but not insurmountable, implications of this requirement:
1a. No shell scripts. That does not mean no scripts. AppleScript works fine without a command line for it to drive. You need a scripting tool with a "watch me" mode, and a text representation for power users to edit and add loops and conditionals. It could even look somewhat like one of the existing shell scripts.
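To make the idea concrete, here is a minimal sketch (in Python, with made-up event names -- this is not any real product's API) of what a "watch me" recorder might do underneath: user actions get captured as structured events, and the same recording can be rendered as editable text for the power user to add loops and conditionals to.

```python
# A hypothetical "watch me" recorder: direct-manipulation actions are
# logged as structured events, then rendered as an editable text script.
# Event names (click, drag, type) are illustrative assumptions.

class Recorder:
    """Records direct-manipulation actions as structured events."""
    def __init__(self):
        self.events = []

    def click(self, target):
        self.events.append(("click", target))

    def drag(self, source, dest):
        self.events.append(("drag", source, dest))

    def type_text(self, text):
        self.events.append(("type", text))

    def to_script(self):
        """Render the recording as editable text -- the power-user view."""
        lines = []
        for ev in self.events:
            if ev[0] == "click":
                lines.append(f"click {ev[1]}")
            elif ev[0] == "drag":
                lines.append(f"drag {ev[1]} -> {ev[2]}")
            elif ev[0] == "type":
                lines.append(f'type "{ev[1]}"')
        return "\n".join(lines)

# "Watch me": the user clicks and drags; the tool writes the script.
r = Recorder()
r.click("Mailbox")
r.drag("Report.doc", "Backups")
r.type_text("quarterly totals")
print(r.to_script())
```

The point is that the text form is derived from the recording, not the other way around; the user never has to start from a blank command line.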
1b. No command-line options on programs. You can still support them the way Apple supported them in A/UX, with a wrapper dialog box where the options are represented by checkboxes and text fields, or maybe the way Win32 does it with a mini-CLI in the shortcut, but this 19th century model of computing should be deprecated and given a decent burial. It's not that hard to put options into menus and dialogs, and to remember their state across runs. Files are best selected by DragonDrop on the application icon (or its shortcut).
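The wrapper-dialog idea is simple enough to sketch. Here, in Python, is an illustrative mapping layer (the gzip-style flags and the JSON state file are my assumptions, not anything Apple or Microsoft shipped): the dialog shows checkboxes and fields, only this layer ever sees the 19th-century flags, and the state is saved so the user sets the options once.

```python
# A hypothetical wrapper-dialog backend: dialog state in, argv out.
# The flags (-v, -k, -9) are gzip-style examples, chosen for illustration.
import json

def dialog_to_argv(program, state):
    """Translate dialog state (checkboxes/text fields) into an argv list."""
    argv = [program]
    if state.get("verbose"):          # a checkbox in the dialog
        argv.append("-v")
    if state.get("keep_original"):    # another checkbox
        argv.append("-k")
    level = state.get("level")        # a numeric field or slider
    if level:
        argv.append(f"-{level}")
    argv.extend(state.get("files", []))  # chosen by drag-and-drop, not typed
    return argv

def save_state(state, path):
    """Remember the options across runs, so they are set once and kept."""
    with open(path, "w") as f:
        json.dump(state, f)

def load_state(path):
    with open(path) as f:
        return json.load(f)

state = {"verbose": True, "level": 9, "files": ["report.txt"]}
print(dialog_to_argv("gzip", state))  # -> ['gzip', '-v', '-9', 'report.txt']
```

The user never sees the argv list; it exists only at the boundary where the old-style program is invoked.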
1c. Real people and real programs use structured data not text, so piping data from program to program is not very useful. However, if somebody really wants to do that, it's not so hard to build a graphical utility to lay out the plumbing, command line not needed.
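As a sketch of that graphical plumbing (the stage names here are invented for illustration): the user wires boxes together on screen, and underneath, the tool just composes stages over structured records -- no shell, no text streams.

```python
# A hypothetical "plumbing" layer: each on-screen box is a stage that
# transforms structured records; the tool composes them left to right.

def select(field, value):
    """Keep only records whose field matches value (grep, but structured)."""
    def stage(records):
        return [r for r in records if r.get(field) == value]
    return stage

def project(*fields):
    """Keep only the named fields (cut, but structured)."""
    def stage(records):
        return [{f: r[f] for f in fields} for r in records]
    return stage

def run_pipeline(stages, records):
    """Execute the wired-up boxes in order."""
    for stage in stages:
        records = stage(records)
    return records

people = [
    {"name": "Ada", "dept": "eng", "ext": 101},
    {"name": "Bob", "dept": "sales", "ext": 102},
]
pipeline = [select("dept", "eng"), project("name", "ext")]
print(run_pipeline(pipeline, people))  # -> [{'name': 'Ada', 'ext': 101}]
```

Because the data stays structured end to end, there is nothing for the user to quote, escape, or parse -- which is most of what makes shell pipelines hostile in the first place.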
1d. The windowing environment must be embedded in the kernel. There is no console logon, the system boots up into graphical user mode from the get-go. 19th century programs grandfathered in can dump their text/console output to a file that nobody will look at anyway.
2. Only one desktop environment, with fewer (and more understandable) options. Users do not need (nor want) two very different desktops, and they really don't need 25 different ways to set infrequent preferences, just one that they can find. Geeks like lots of choices; real users just want one that works.
3. A simpler and more robust file structure. Linux and unix are both very brittle, with zillions of hard-coded file pathnames: if the user moves something to a different folder, the whole system comes to an irrecoverable "panic" halt. The early Mac systems had two files that had to be in the same folder, that's all. If other system capabilities weren't there, the system muddled along without them. Very robust. Later Mac systems, as more unix-trained people got into the act, became more brittle, with more ways to break them. Maybe the Linux use of the file system cannot be fixed. I don't know yet, I'm still learning. At least Apple has hidden the dangerous folders.
4. Automatic single-user login. Unix was originally designed for multiple users in a research environment, logged into a single computer. It's still primarily used in university labs where multiple users makes sense. Desktop computers are primarily used by a single person all day. There are still a lot of home computers that are shared by several family members, but that percentage is going down rapidly with the decreasing cost of computers. I do not personally know of any (multiple user) single-computer homes any more; everybody has their own. Microsoft understands that, and it's part of making the computer easy to use.
5. Simpler install. To install Windows, you push the CD into the drive, click OK a few times, then go get lunch. Linux is not even close. Microsoft does not own the hardware, it's made by independent vendors from an open specification; there's no reason Linux can't use the same spec and work as easily. It's not necessary to support every obscure hardware type, supporting the major volume vendors will be adequate for capturing the market -- especially if it's a no-hassle install. Also,
5a. Self-configuration at boot time. This is related to all those hard-coded file pathnames (#3 above), which make the system fail if you picked up the wrong boot floppy or changed out a piece of hardware. Apart from obsolescence, any Mac boot disk will boot up on any Mac it will physically fit in. If I change out a hard drive, whatever is there mounts and is immediately usable; if I change out or add some other peripheral, I may have to drag a driver into the SystemFolder and restart (should not be necessary, but it is), or just select it in the chooser. In the new USB Mac systems, I can hot-plug in a peripheral and there it is, ready to use. Why can't Linux do that?
5b. Better/faster bootup. WindowsXP takes forever to boot up, but at least they show a nice peaceful sky scene; Linux takes even longer and puts up these ugly (and sometimes disturbing "Failed") lines of text. If the boot process is not successful, the average user cannot do anything about it anyway. Even better: build a ready-to-go boot image at installation, then just swap it in on power-up.
That's for starters.
Let me know.
Tom Pittman (TPittman aatt IttyBittyComputers.com)
First Draft, 2003 June 23