The personal computer, foundation of today’s digital environment, was born in a rented building in the 100 block of West Rhapsody, just south of the airport — at least, design work began there in late 1969. The designers quickly moved to a larger facility, at the corner of Wurzbach and Datapoint.

Yes, it happened here in the city of San Antonio. Key circuitry of the machine that you are using to read these words was probably derived directly from a machine that first saw the light of day right here in the Alamo City. Others may have tinkered along the same lines, but the first practical, fully featured, mass-produced, single-person computer was a San Antonio native.

And having read that, there are people in the computer industry who will frantically pick at and dismiss the main facts of the story, until they are satisfied that the creation myth of the personal computer is again safely contained within Silicon Valley.

Product literature for the original Datapoint 2200. Courtesy of Jack Frassanito.

Or so I found after I wrote a book on the subject. The basic story, derived from interviews and documents, is that in about 1968, NASA engineers Austin Oliver “Gus” Roche (1929-1975) and Jon Philip “Phil” Ray (1935-1987) decided that progress in electronics would soon allow intelligent machines that fit on a desktop, and they wanted to be the ones to do it. They found backers in San Antonio, carefully avoiding the word “computer” lest anyone think they planned to compete with IBM. They set up Computer Terminal Corp. (CTC) and their first product, the Datapoint 3300, was an electronic replacement for the electro-mechanical Model 33 Teletype, released in the fall of 1969. Having designed, built, and marketed an electronic desktop machine, they began working toward their larger goal, a general-purpose desktop computer, which they named the Datapoint 2200.

Since they could barely fit the necessary circuitry into the unit’s cabinet, they approached fellow startup Intel about reducing the processor, which took up a whole circuit board, to a single chip. By the time Intel had managed to do so in late 1971, CTC – later renamed Datapoint Corp. – had lost interest and told Intel to keep the chip — one of the most colossal business blunders of the century. Intel marketed it as the 8008 eight-bit processor chip, starting in April 1972. It’s a myth, incidentally, that the 8008 was derived from the earlier Intel 4004 calculator chip, despite the similar names.

An enhanced version, the 8080, sparked the wave of hobby computers that began appearing in 1975. A major upgrade in 1978, called the 8086, used 16-bit architecture and allowed access to more memory. A simplified version, the 8088, was the heart of the original IBM PC and its many clones. The continual succession of updates and clones for the 8086 is now called the x86 dynasty, and a vastly enhanced descendant likely resides inside the machine you are using now.

Datapoint would later invent the local area network (LAN), but it filed for bankruptcy in 2000 after the success of the PC – its progeny, ironically – dried up Datapoint’s traditional business market. An accounting scandal in 1982 and a hostile takeover by a corporate raider in 1985 didn’t help.

The “Datapoint deniers,” as I call them, accept the Intel/x86/LAN part of the story, but dismiss the Datapoint 2200 itself, saying it really wasn’t a personal computer. No, it was a “smart terminal.” Having dismissed it with a wave, they restore pride of invention to California.

The Datapoint 2200, released by the San Antonio-based Computer Terminal Corp. in late 1970, was the first practical, fully functional, mass-produced, single-person computer to reach the market, to the dismay of many modern commentators, who prefer to mislabel it a “smart terminal” rather than accept the fact that it was actually a computer. Not only was it really a computer, but the machine you are reading this with probably uses a derivation of its circuitry. This is the Datapoint 2200 Computer owned by Jack Rubin. Original photo edited by Michael Holley.

Their position is that the Datapoint 2200 was intended for terminal emulation. The programs it was capable of executing would have allowed it to pretend to be various brands of computer terminals. Any number of online articles parrot this belief. If the deniers point to any evidence other than each other, it is to the fact that Roche and Ray did not use the term “personal computer” in their business plans or marketing. The story is that they were surprised when the buyers began using the machine as a general-purpose computer.

There are two issues. First, Roche and Ray indeed did not use modern terminology like “personal computer” in their business plans or early marketing efforts — nor could they have in 1970. Meanwhile, as mentioned, they were initially leery about calling their creation a computer. But taking that into account, all the evidence points to an intention, from the start, to create a general-purpose computer that fit on a desk. While it is possible to quibble with the wording of this or that business plan or advertisement, it’s hard to quibble with the results — they did produce a general-purpose computer that sat on a desk. The Datapoint 2200 integrated in one enclosure a keyboard, screen, processor, memory up to 8K, mass storage (on tape), a programming language, and various I/O ports. That was an accident?

The other issue is marketing. There was no established market in 1970 for what we call personal computers, leaving no obvious way to sell them. Unsurprisingly, CTC’s early marketing plans were a moving target. Since memory was expensive, simple, canned, appliance-like software such as terminal emulation, which could fit inside 2,000 bytes – the base amount in the Datapoint 2200 – seemed like a good idea at one point. There is evidence that such software was being planned, giving the deniers their one germ of truth. But the price of memory continued falling, and the first customer, Pillsbury, installed four Datapoint 2200s in chicken farms to run payroll software that the firm’s programmers had written.

In other words, the first customer launched the first documented use of a personal computer.

Finally, in clinging to their one germ of truth, the deniers get it wrong. The Datapoint 2200 had a half-height screen to give it the same aspect ratio as an IBM punched card used for programming mainframes, since the designers anticipated a big demand for electronic replacements for the IBM 029 Card Punch machine. With an intelligent machine with tape storage, you could dispense with cards entirely. But in the end, they found no demand for 029 replacements. Meanwhile, the half-height screen made the Datapoint 2200 a poor choice for terminal emulation, since most terminals had screens with twice as many lines.

So it’s no surprise that I have found no evidence that the Datapoint 2200 was ever seriously used as a “smart terminal.” On the contrary, it was used with great success with a system called Datashare, where multiple terminals – with full-height screens – were connected to a Datapoint 2200. The terminals were used for office work, typically data entry or lookup, using software running on the Datapoint 2200, while the half-height screen of the Datapoint 2200 was used for system control.

Admittedly, had there been a serious demand for “smart terminals,” Roche and Ray likely would have followed the money. Instead, they uncovered, and tapped into, a vast appetite for computational power at the personal level.

Today you can get personal computers with tens of thousands of times more computational power than the Datapoint 2200 possessed. But after the sale of billions of personal computers (and now smartphones, etc.) and the unfolding of the Internet, the public’s appetite for such power remains unquenched. The result is a digital environment beyond the conception of anyone in 1970, yet it is clear that we have only scratched the surface of what can be done with these devices.

Perhaps, someday, digital romantics will decide to erect a monument to the birth of the personal computer and the people behind it — and discover, painfully, that it does not belong in California. In the meantime that monument is the world that has resulted — look around you.

*Featured/top image: An original Datapoint 2200 beside the original product design sketch. Photo by Jack Frassanito.


Lamont Wood is a San Antonio-based writer who does the family shopping.

19 replies on “Was the First PC Created in San Antonio?”

  1. This looks a lot like the mechanical IBM Composer System I was using in 1972. It was a glorified IBM Selectric typewriter (with a little interchangeable type ball) that memorized keystrokes on magnetic tape cassette and could spit back beautiful error-free form letters or rudimentary graphically designed publications. The company, Neurosciences Research Program of MIT, published proceedings of their conferences on this office machine. In a couple of years that system was supplanted by phototypesetting on Compugraphic systems, which in turn yielded to the less chemically dependent, purely electronic desktop publishing we use today.

  2. Another piece of evidence is the listings of the CTOS (Cassette Tape Operating System) from the original 2200 Programmer’s Manual… complete with descriptions of the instruction set, how to use the I/O devices provided, how to assemble programs, and the six different versions of Cassette Databus, very much a business-oriented programming language. There was also a cassette general editor, a document formatting system (SCRIBE), and a lot more.

    The 2200 was the first self-contained general-purpose computer system designed to sit on a desktop, for use principally by one user (despite it later supporting Datashare, which allowed one 2200 to support up to eight separate user terminals… and later 5500 and 6600 versions which supported up to 24 user terminals).

    And it also can’t really be seriously disputed that Datapoint’s ARC System LAN (which, ironically, in Sept 1977 was being called “Internet” (!!) ) was the world’s first commercially available local area network system, based on a high-speed (2.5 megabit/second) packet-switched coaxial-cable based network with an “interconnected stars” cabling topology. That system supported mapping shared server volumes into imaginary local disk “drives”, and a fully distributed file locking system allowing coordinated transactions across multiple volumes, servers, and even multiple different cable systems.

    BTW, numerous Datapoint marketing videos from the ARC System period are online at my Youtube channel… the URL is above. The “Datapoint ARC System” video is a good place to start.

  3. I worked at Datapoint from 1978 to 1984. It was a good-natured, dynamic place to work.
    I began in Corporate Tax [and strangely enough, my daughter the M.A. Accountant works at a large corporation [one of two oil-related ones], in Corporate Tax.] Anyway, I worked for Datapoint’s customer service. In our department we tracked and retrieved computers that needed to be found and returned to the company. Often they’d been gutted by tech pirates. I ended my work for Datapoint as an auditor between their marketing and customer service databases, an attempt to reconcile the two activities. I began my career there as a mom to my then-newborn son, who’d been born early, at 7.5 months. Early on, though, I didn’t realize the import of working for a computer pioneer company. So, there.

  4. Yes, the personal computer did not begin in Colorado–or Texas.

    There were a number of early desktop computers and computer-like devices as early as the late 1960s, including the HP desktop programmable machine that was so successful and the Datapoint machine. But if we plot the evolution of the PC, the spark that ignited the era was the Altair 8800, designed by Ed Roberts of MITS, Inc. in 1974 and published as the cover story of the January 1975 issue of Popular Electronics. A kit Altair could be purchased for several hundred dollars during a time when minicomputers sold for a minimum of several thousand dollars. Bill Gates and Paul Allen didn’t choose Datapoint to launch what became Microsoft. They chose MITS – and moved to Albuquerque to do so. The string of MITS competitors opened the door to Microsoft until IBM came along and took over the market with the help of Microsoft software. When the Smithsonian wanted to display the computer that began all this, they chose the Altair, not the Datapoint. In fact, they chose mine, which was one of the first five built. (It was my payment for writing the manual.) The Datapoint machine was a very impressive development, but it arrived before the environment was ready for it. That environment was fueled not by big businesses but by the 5,000 or so hobbyists who bought Altairs.

    1. First off, Bill Gates and Paul Allen were latecomers to the party when they decided to create a primitive version of BASIC to run on the Altair. The reason they decided to write it for the Altair and not for Datapoint was that the 2200 already had several very capable (for the time!) programming languages, and that Datapoint (like IBM at the time) gave away its operating systems and other software instead of selling the software à la carte.

      As for the Smithsonian, it also displayed a Datapoint 2200 with a Diablo hard disk drive. (I was particularly intrigued because the system they displayed there was in fact the same configuration I had in my master bedroom back at my apartment at the time! It’s a weird feeling to see something at the Smithsonian that you have running at home.)

      The environment is NEVER ready for anything that is going to revolutionize things. When The ARC System was announced, one of the posters called it “The most important development in data processing since the minicomputer.” At the time, nobody was asking for a LAN. They were asking for bigger/faster/more powerful computers, to support more programs, more applications, and more users. In short, the traditional mainframe approach to upgrading. The idea of incremental expandability and modular architecture was totally alien. But of course, once people understood what it meant, the days of the mainframe as the primary workhorse of business data processing were numbered.

      I’m still not convinced that IBM would have even produced the PC if they had realized that it (combined with the LAN) would ultimately help spell the end of their primary mainframe business. But by the time the first IBM PC came out in 1981, Datapoint had over 10,000 LANs installed worldwide, and the shift in market direction had become irreversible.

  5. Technically, the definition of a computer is an electronic device which can change programs. Most of the “push” to develop a computer at all came from NASA and the rush to get actual computers functional for the Apollo program.

    To me, the debate doesn’t center around who was more popular, but what IS the first device which sits on a desk and can run multiple different programs. Although the Altair undeniably popularized the concept, the Datapoint 2200 pre-dates it. It does not matter what the popularization was… what matters is who first came up with the innovation.

    1. The debate here really isn’t about what is the first computer. It’s about who was the FIRST in making a desktop PERSONAL computer. We’re not talking about a calculator (not even a ‘programmable’ calculator), we’re not talking about a bigger timesharing system (typically rack-mounted), or a traditional minicomputer. We’re talking about a desktop, user-programmable computer, with a direct, human-oriented interface (a keyboard and display, rather than toggle switches and flashing lights), and designed to be predominantly used (directly) by a human user sitting in front of it. The suggestion that the Altair somehow predated the Datapoint 2200 (and note here that the Altair was really inherently a cheap “mini”computer in style, not really having any inherent keyboard and display) is particularly absurd, since the Altair’s processor architecture and instruction set was basically EXACTLY the one that the Datapoint 2200 had introduced several years before. ;-))))

  6. Missing from the article and the discussion are (at least) two major points. One I expected to see and one few know about.
    1: The names of the people who actually came up with the architecture of the 2200.
    Roche and Ray may have wanted it but they didn’t design it. Victor Poor and Harry Pyle, over a Thanksgiving weekend in Fredericksburg, MD, developed the architecture and instruction set for the Datapoint 2200. Victor became the Senior VP of R&D and Harry a director in his department.
    Intel, as one of the (attempted) builders of the chip, had a few design change requests to lower the transistor count: first, to switch the number layout to little-endian form, and second, to make subroutine call and return conditional, like branch instructions.
    So, we should remember Vic and Harry as the originators of the architecture of the first (and current) personal computer. You can still see their design in the modern AMD 64-bit architecture.

    2: Why did Vic Poor and Harry Pyle design that instruction set?
    Told to me by Harry Pyle himself. The original intent of the Datapoint 2200 was for a terminal emulator. This was because the Datapoint 3300 was hard-wired for each of the terminal protocols it needed: one version for IBM mainframes, one for DEC VT 100, one for Burroughs, one for Univac, etc., each emulating that vendor’s unique terminal protocol. And so, its protocol was personalized in the factory for one variant. Because the Datapoint 2200 could be programmed, it could emulate these communication protocols and more with one piece of hardware.
    It was, as Mr. Wood noted, when Datapoint saw Pillsbury writing software for their programmable terminal that they realized they had hit on something more important. They changed their direction and started selling the unit as a programmable personal computer for business use.
    It was that “for business” use, with customers who were Fortune 1000 companies, that kept Datapoint from calling it a Personal Computer. Let alone that the term did not exist at the time.

    2b: Now, the amazing other side of the coin.
    Harry had a little binder of the design notes defining the architecture of the 2200. BTW: they were never returned after lawyers subpoenaed them for a trial unconnected with Datapoint. Those design notes were for the “terminal emulator,” but he had another set.
    This other set of notes defined the architecture for a programmable computer. It was based on the HP 3000 architecture, a stack machine quite unlike the architecture of the Datapoint 2200, or an IBM mainframe for that matter. It is like the difference between how you use an HP scientific calculator and most others.
    If Datapoint had realized that people would be programming their machine and not simply using it, Harry would have suggested they use the stack architecture design. This is quite unlike the architectures commonly used today.
    Just try to imagine how our computer and programming world would be different if that first Datapoint 2200 and its Intel 8008 version had used a stack-based architecture (see the sketch at the end of this comment).

    BTW: The designer of the Datapoint 3300 terminal circuitry and the Datapoint 2200 computer circuitry was Gary Asbell (their Steve Wozniak, if you will), another name that hopefully will not be lost.
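
    To make the contrast concrete, here is a minimal sketch – illustrative Python, not any actual Datapoint or HP 3000 code – of how a stack machine evaluates an expression the way an RPN calculator does, versus keeping intermediate results in a register, the way the 2200/8008 lineage does.

    ```python
    # Illustrative sketch only -- not Datapoint or HP 3000 code.
    # A toy stack machine evaluating RPN: operands are pushed, operators pop two.

    def run_stack_machine(program):
        """Execute a list of RPN tokens: integers are pushed, "+" and "*" pop two operands."""
        stack = []
        for token in program:
            if isinstance(token, int):
                stack.append(token)              # operand goes on the stack
            elif token == "+":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)              # result goes back on the stack
            elif token == "*":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            else:
                raise ValueError(f"unknown token {token!r}")
        return stack.pop()

    # (2 + 3) * 4 written RPN-style, as on an HP calculator: 2 3 + 4 *
    print(run_stack_machine([2, 3, "+", 4, "*"]))    # prints 20

    # A register-style equivalent keeps the intermediate result in a named
    # register instead of an implicit stack, closer to the 2200/8008 model:
    a = 2          # load A with 2
    a = a + 3      # add 3
    a = a * 4      # multiply by 4
    print(a)       # prints 20
    ```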

    1. Harry’s major contribution to the 2200 processor architecture was the use of the stack for subroutine calling.

      The story about Pillsbury is apocryphal, but I don’t really believe that Computer Terminal Corporation expected the 2200 to be used only as a software-configurable intelligent terminal and never for business applications. Among the first programming tools offered with the 2200 were the various Cassette Databus versions (at least five or six variants), and these compilers definitely were business-oriented (decimal string math, for example).

      The 8008 didn’t do ANY 16-bit arithmetic directly, so the “little endian” form was only used for memory addresses (at the processor level). The major advantage of storing numeric values in “little endian” (LSB, MSB) format was that when adding or doing arithmetic, carries could propagate (or not) to the next byte location (see the sketch at the end of this comment). That was primarily of value because the Version 1 2200 (which used shift-register memory for the processor memory, like the display controller memory) didn’t have to wait for the more significant byte to come around again (entailing a complete cycle through all of memory). I don’t believe this was anything about reducing the transistor count in the 8008 (likewise for the conditional subroutine calls and returns).

      I was recently at the Computer History Museum (out in the Bay Area) and they also seem not to understand the distinction between a “Personal Computer” and a “Hobbyist Computer”. The 2200 is certainly a “personal computer”, in that it is in a single case, with a keyboard and display screen, intended for direct personal use by a single user (at a time, at least). There were other devices out and about around the same time, but to suggest that the Apple II was among the first “personal computers” (as one might be tempted to believe, from the exhibits at the CHM) is ridiculous.

      By the same token, it’s ridiculous that the CHM doesn’t recognize that local area networking didn’t begin with Ethernet out in the Bay Area, either.

      Before the first “personal computer” (IBM PC in 1981), Datapoint had more than ten THOUSAND ARC System LANs in use worldwide, with electronic messaging, word processing, business processing, communications, incremental architecture and modular expandability, and all the rest that we who worked at Datapoint knew totally well. Go to Youtube and look for “Datapoint Integrated Electronic Office” to see one of Datapoint’s marketing videos from those years. You’ll see how far Datapoint was advanced compared to its competition at the time.
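
      To picture the byte-order point above, here is a minimal sketch – illustrative Python, not actual 2200 or 8008 code – of adding two multi-byte numbers stored least-significant-byte first, with the carry rippling forward byte by byte in storage order.

      ```python
      # Illustrative only: multi-byte addition of little-endian values
      # (least significant byte first). Processing bytes in storage order
      # lets the carry propagate forward without waiting for higher-order
      # bytes -- the advantage described above for shift-register memory.

      def add_little_endian(a_bytes, b_bytes):
          """Add two equal-length little-endian byte lists; return (sum bytes, final carry)."""
          result, carry = [], 0
          for a, b in zip(a_bytes, b_bytes):   # LSB is encountered first
              total = a + b + carry
              result.append(total & 0xFF)      # low 8 bits stay in this byte
              carry = total >> 8               # carry moves to the next, higher byte
          return result, carry

      # 0x01FF + 0x0001 = 0x0200, stored LSB-first as [0xFF, 0x01] and [0x01, 0x00]
      print(add_little_endian([0xFF, 0x01], [0x01, 0x00]))   # ([0, 2], 0), i.e. 0x0200
      ```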

  7. I took a computer programming course on a Datapoint “Trash 80.” Believe the language was called Databus.

    1. The term “Trash-80” normally was used to refer to the Tandy/Radio Shack TRS-80 hobbyist computer… for which specific machine, at least, I don’t think there ever was a Databus compiler/interpreter. Although it is true that the microprocessor used in the TRS-80 (the Zilog Z80) implemented a descendant of the Datapoint 2200 CPU architecture and instruction set! Databus/Datashare was a VERY nice business-oriented programming language at the time, anyhow… an extremely versatile and useful programming tool.

  8. You have made my day. I was first introduced to the CTC Datapoint 2200 version 1 in Odessa, TX, where I was working on RCA and Univac mainframe computers for Univac. I attended and supported a customer of Univac who had a booth at the annual oil show in Odessa. I was very interested in this product because I had supported and provided training on component repair of the 3300 for a small company in Rockville, MD (TST Communications). I had previously attended a training class held by Computer Terminal Corporation in the old Kodak building in San Antonio, TX (my first visit to San Antonio).
    I argued with the salesperson who was representing CTC about this so-called computer, which I called an Intelligent Terminal. He corrected me and said it was a computer. At that time I had a hard time accepting that this desktop terminal with a keyboard could be a real computer. Of course, after I got over the shock of being introduced to a device which could be picked up by one person, I came to the realization that it was what the salesperson had announced – a minicomputer, as he called it (micro had not yet been introduced into the technical description for a computer).
    Little did I know that within a year of that introduction I would be hired and moved to San Antonio to work for Datapoint Corporation as a Technical Support Engineer. I remained with Datapoint for 13 years and was promoted to be the first and only Support Engineering Specialist for Customer Service, and I traveled to most of the states in the US and as far away as Sydney, Australia. I had a great and rewarding time with Datapoint and remember my time there with a lot of pleasure! In all I have worked as an IT specialist for almost 50 years.

    Philip Schwirian

  9. I remember the 2200, 5500, 6600, etc. – even the “Godzilla”, which was a Datashare-capable machine supporting (as I recall) 32 users in 64K of RAM. Gordon, weren’t you the one responsible for the Databus compiler? Amazing technology for the time.

    As for the 2200 being a desktop terminal emulator, I wrote an inventory system on a 2200 back in 1983 for a local commercial radio shop in San Antonio. Databus was pretty sophisticated and versatile for its time.

    1. I was responsible for DOS-dot and The ARC System (among other stuff there). We still use Databus here in Dallas for processing ballot access petitions… there’s still NOTHING better than AIM for looking up voters (in a 1.2 million record voter database) based on fragments of handwritten information. The “Godzilla” you’re referring to would have been the 5500 or 6600… but they supported a max of 24 users (per machine, obviously you can have as many Datashare users as you want if you spread them across multiple Datashare applications processors).
