As far back as I can remember -- or at least since high school -- I wanted my own computer. This was before the invention of the microprocessor, and mainframe computers cost millions of dollars. IBM figured there was a market for ten of their vacuum-tube 704 scientific model and priced it accordingly. I think they eventually sold over 150 of them, with windfall profits on the other 140, but that's another story. In high school I got involved with an electronics-specialty Boy Scout Explorer group sponsored by Beckman Instruments near where I lived, and I collected a lot of discarded electronic equipment so I could build my own computer. I didn't have a clue what was involved.
My first job was civil service, working as a computer programmer in a government research lab. We actually had one of those IBM 704 computers (by then out of date), and I spent a lot of time in the machine room talking to the IBM maintenance engineer and learning how the hardware worked. But minicomputers still cost more than two years' salary. The government closed the lab, and I took a variety of contract jobs.
Then I saw the double-page ad for a computer-on-a-chip, the Intel 4004. They offered a half-day seminar on programming it, so I drove to (I think it was) Cupertino to attend. The processor had a maximum of 1.2K of 4-bit data memory and 4K of 8-bit read-only program memory (ROM). You programmed it in assembly language on a mainframe. So I raised my hand and asked, "Do you have anything for software development that runs resident?" (on the computer itself). No, they did not. Ever the entrepreneur, I followed up with, "If somebody would write one, would you buy it?" and he replied, "See me afterward." The deal was, they gave me a development board and a ROM-burner board, and I would give them an assembler and simulator that ran on it. They gave me an account on their mainframe for development. I had to buy my own Teletype (keyboard and printer and mass storage = paper tape), my own power supply, my own ROM chips, and an ultraviolet eraser. It was a win-win: it cost them minimal effort and no cash for something that made their development tools more accessible to low-budget customers, and I got my very own computer.
I guess they liked the assembler, because I started getting calls from Intel customers who needed somebody to write their software. I made a decent living at it for several years, while the 4004 chip set was the only game in town. The available memory was too small for the simulator to be useful, so most of us writing code debugged it on an oscilloscope (I just got rid of my scope last year, although it hadn't been turned on in almost 20 years). It was labor-intensive work for very tiny programs.
Plugging in ROM-chip programs was not the same as typing code on a keypunch (what I had done to program the 704), and the long programming cycles especially limited what I could do, so I taught myself what I needed to build a dynamic 4K memory that pretended to be ROM. The circuits were finicky and I asked for a lot of help from the engineers building the hardware I was paid to program, but we worked well together, so they were eager to help me. Maybe my prices were too low, so they still thought they got the better deal. Whatever. Anyway, it was an ugly mess perched on top of the Intel board, but it worked.
Loading this new program memory from paper tape was still too slow for my tastes, and about then IBM created a floppy disk for loading the control memory of their new 370-series computers, and third-party vendors started offering the same hardware for other purposes. I bought one of the first available and cobbled together a control circuit to convert the data from the speed the disk hardware wanted to see down to the much slower speed my 4004 processor could handle. I had a "disk operating system" before anybody else had a computer at all.
The hardware market wanted 8-bit data, so Intel came out with an 8-bit processor, and eventually a hobby supplier in New Mexico put together a complete computer kit for less than Intel was selling their processor chips for. Those of us in the industry figured that Intel was selling out-of-spec chips at a reduced price to MITS, and everybody benefited. Popular Electronics magazine ran a cover story on the kit, and MITS was back-ordered for something like a year, the demand was so high. A bunch of guys met in Gordon French's garage in Menlo Park to talk about their computers (on order, but not yet arrived). I heard about it at a computer conference and showed up at the second or third meeting. The garage was already crowded, so one of the guys who worked at the Stanford Linear Accelerator Center made their 500-seat auditorium available for what became known as the Homebrew Computer Club (HBCC).
I was the only one who had a disk-based system, so I maintained the club mailing list. I wrote a sort program so they could present their mailings to the Post Office in zip-sorted order for reduced mailing rates. They never charged dues that I recall, so I guess it was a non-profit running on donations. Some of the leadership came out of a Menlo Park or Palo Alto storefront computer center (also a non-profit) with the Marxist-sounding name "People's Computer Company" (PCC). The HBCC moderator Lee Felsenstein (I later learned he was Marxist in political orientation) stood up every week, swung the pointer stick back and forth menacingly, and asked, "How many people are here for the very first time?" The auditorium was typically half full, and half of the people there raised their hands. I was astonished when I did the math: over a two-year period of roughly 100 weekly meetings, with 100 first-timers each week, some 10,000 people within driving distance of Palo Alto were technically literate and motivated enough to visit an HBCC meeting exactly once. Maybe that doesn't seem like much now, but back then nobody had computers.
Another thing Felsenstein liked to do was invite people to raise their hands in a sort of roll-call of which computer systems people had: how many 8080s, 6800s, and eventually 6502s. That was the processor in the original Apple. The chip house designed it to fit the 6800 chip slot, but to cost only 5% as much. Steve Wozniak, a hardware wizard, bought one at the computer show where they were introduced, then showed up at HBCC with the Apple. The two Steves always sat in back, showing off their latest hardware, while I sat in front. Everybody knew me, because at the end of the roll-call Lee would dramatically pause and ask, "And how many 4-bit computers?" of which there was exactly one, mine. Some people would kid me about having "half a computer," but I reminded them that *I* had a disk operating system at a time when most hobbyists had only cassette tape for mass storage. In fact I had it before most of them had a computer at all. Woz eventually designed a much simpler and more clever controller for the Apple disk system, but he had a much faster processor to do the work than I did.
One of the products of the HBCC was Tiny Basic. Dennis Allison, a computer instructor at Stanford, and Bob Albrecht, the guy who ran the PCC, got together to design something resembling Basic (the programming language of choice for the time-share computer at PCC), but which would run in the minimal memory of the new micros. The original article ran in the PCC tabloid, but it was so popular that Dennis and Bob launched "Dr. Dobb's Journal of Tiny Basic Calisthenics and Orthodontia -- Running Light Without Overbyte" (the "D" and "B" in "Dobb's" were from Dennis's and Bob's initials). It was eventually shortened to "Dr. Dobb's Journal" and became the technical journal of choice for small computers.
My 4004 system was much too small to run anything as big as Tiny Basic (although I thought about it a lot), so I traded an assembler and Tiny Basic implementation to RCA for one of their 1802 development systems (I did a lot of software-for-hardware trades in those days), and thus came to own a computer big enough to run Tiny Basic on. In fact, I even wrote a compiler for Tiny Basic. I had 12K of memory, and the compiler was 11K. I went through maybe a dozen or more iterations, thinking of ways to improve the code generation; each improvement added a half-K of code, but then the compiler, recompiling itself, came back down to 11K because of those same improvements. There is more on my Tiny Basic here.
I was living in a duplex at the time, and the landlord -- I guess he had marital problems and needed a place to live -- gave me notice. The condo I moved to was a block away from a cell tower, which generated so much electronic interference that my 4004 system (which hung out in the open) became non-functional, and I gave it away in a barrel of electronic junk to the local community college (which likely discarded it). If I'd hung onto it, I could have donated it to one of the museums that sprang up to memorialize the early hacker days. But I didn't know that then.
Then the Mac came out, and computing never was the same. Compiled Tiny Basic was so much faster than assembly (machine) language that I didn't miss the 4004. Strongly-typed Modula-2 (for which I wrote a compiler on the Mac) was so much faster than (untyped) Tiny Basic, and the WYSIWYG MacOS was so much faster than the command lines of earlier systems, that I never wanted to look back. But that too is another story.
2016 February 1