Technology

From UNIVAC to Google

A computer in every kitchen?


The 1969 Neiman Marcus catalog included a futuristic product called the Honeywell Kitchen Computer. The red and white trapezoidal machine came equipped with an H316 minicomputer, a pedestal, a cutting board, and a handful of preprogrammed recipes. This tarted-up recipe box—which did not slice, dice, or make julienne fries—could be yours for a mere $10,600. None was sold, probably because in 1969 that sum could purchase at least three automobiles. It didn't help that the only interface was a collection of blinking lights and switches, or that you needed two weeks of classes to learn to use the bulky machine. In the end, there wasn't a single '60s housewife willing to learn binary to access her own recipes.

The kitchen computer, like the much more user-friendly Mac PowerBook on which this article was written, was descended from UNIVAC, the first commercial computer in the United States. Brought to market by the now defunct company Remington Rand in 1951, UNIVAC (an acronym for Universal Automatic Computer) did not incorporate a single recipe for Jell-O salad. It did, however, accurately predict Eisenhower's surprise landslide victory over Adlai Stevenson in the 1952 presidential election using a small sample of voters in key states. In Core Memory: A Visual History of Vintage Computers (Chronicle Books), former Wired culture editor John Alderman writes that this demonstration of predictive prowess "helped further solidify the hopes and fears that the general public had about these wondrous but scary machines."

Core Memory's glowing, vibrant photographs of ancient computers capture the complicated, worshipful, suspicious relationship we have always had with our machines. To modern Americans, the constant physical tune-ups required by the room-sized UNIVAC—its 5,000 vacuum tubes burned out one at a time and had to be replaced constantly—and the weeks of binary lessons for the Kitchen Computer seem ridiculously laborious, especially for such a small amount of computing power. But even as our computers become mightier, they still demand our constant attention and loving vigilance with their software updates, virus protection, chiming alerts for incoming mail, and occasional crashes. Lest we look down our noses at the devoted operators of those early elephantine calculators, consider how much time you spend worshiping at the altar of your computer—awaiting its pronouncements, diagnosing its illnesses, asking it to spit back your own words and thoughts.

UNIVAC, with its mercury-filled main memory tank (pictured at right), was the first to store programs and data on tapes instead of punch cards, inaugurating decades of fear that our computers would somehow lose our data—something that (irrationally) caused less anxiety when information was stored on perishable paper punch cards. Salesmen from competitor IBM amped up the anxiety by suggesting that UNIVAC's spinning metal tape posed a safety hazard.

This fearsome machine also helped inspire the 1955 Isaac Asimov short story "Franchise," set in a future America governed by an "electronic democracy." The machine that bears the burden of most decision making is MULTIVAC, an immense and nearly omniscient underground machine. In real life, the U.S. Census Bureau bought the first UNIVAC; in Asimov's delicately menacing story, the MULTIVAC contains such detailed census data about every citizen that when voting time comes the only information the computer needs to decide an election is an interview with one person, selected as the "most representative" American.

These days the presence of relatively dumb computers in our voting booths makes us jittery. We fear they will be co-opted, reprogrammed, or biased by their makers to favor one party or another. Instead of putting our trust in these tiny modern equivalents of MULTIVAC, we fret about the machines' smallest malfunctions and their potential for corruption. Asimov was wrong in his prediction that computers would get bigger and bigger, sprawling underground. But he was right that we would come to depend on these machines. Many of us tote our laptops and PDAs around as if they were small children.

The mechanistic, industrial look of the megacomputers of the '50s gave way to the Kitchen Computer's sleek lines and blinking lights, but it wasn't until Apple started making computers that looked like compact, nonthreatening office machines that they snuck into our daily lives. In the electronic Eden of 1976, the Apple I sold as a kit for $666.66. The kits allowed users to assemble the working parts and then bolt their own computers onto wooden boxes. Apple founders Steve Jobs and Steve Wozniak preassembled 50 of them, to be sold at the Byte Shop in Mountain View, California. The Apple I wasn't a huge seller, but the prefabricated Apple II, with its own snazzy beige box, was the breakout product that established Apple and helped turn personal computers from a home-brewed curiosity into a household commodity.

As our computers shrank and their power grew, our dreams for them got big again; utopian Internet visions began to take hold. William Gibson's 1984 novel Neuromancer popularized the concept of "cyberspace," described as "a consensual hallucination…abstracted from banks of every computer in the human system." In 1990 the science fiction writer Douglas Adams wrote and starred in a "fantasy documentary" called Hyperland, which predicted an Ask Jeeves–like electronic butler, indulging our every informational desire, a couple of years before graphical Web browsers had even been invented. The computer with all the answers is once again at hand.

And the Kitchen Computer? Four decades after Honeywell's disaster, the idea of a computer as a cookbook has finally been realized in the form of Google, which for a new generation is fast becoming the cookbook of first resort. Decades of devotion to our machines are paying dividends, and using a search engine to figure out what's for dinner certainly beats learning binary.

Katherine Mangu-Ward is an associate editor of Reason.




  1. I know I’ll probably sound like a kook, but society’s reliance on computers, especially networks, is kind of spooky. I went to Blockbuster one day and found out that I could not rent a movie because some satellite somewhere was down. Same thing happens occasionally at gas stations…even if you pay with cash! I know these things are minor inconveniences, but there are so many more things that depend on networks and satellites which could have a severe impact on our lives if they should go down.

    [/tinfoil hat]

  2. jimmydageek:

    Stuff like…computers at an airport like LAX that handle the no-fly list and cause 20,000 people to be stuck for hours and hours?

  3. If you look at early home computer advertisements, they all mention storing recipes as one of the great applications. Of course, nobody actually did that, but they had to try to think of some reason why a housewife would care about (or at least tolerate) a computer.

  4. jimmydageek:

    Never mind computers, what about electricity? Or fresh water?

  5. I sold home computers back when their actual utility was less than clear (c. 1983). Word processing was an important selling point, but the biggie was spreadsheets like VisiCalc and, not too much later, Lotus 1-2-3.

    I was in high school at the time, so the utility to me was computer games, primitive as they were back then.

  6. games like Where in the World is Carmen San Diego?

  7. Carmen San Diego was an awesome game! So was The Oregon Trail! I kicked ass at both 🙂

  8. I think it’s kind of sad how much I use google to convert measurements for cooking. I subscribe to Cook’s Illustrated online and print out recipes (that way if they get splattered, I just reprint them!) I’m on cooking groups on livejournal. Am I doomed to start my own cooking show on youtube?

  9. Randolph Carter,

    Nah, that was too advanced for “PCs” in 1983. At that point, some text-based games were still as fun as the graphical ones. I can’t remember anything specific on the machines I was selling (Kaypro), but there were some okay games on the new IBM PC, the Apple whatever, and the various other platforms (even Atari sold PCs back then).

    megs,

    Like this man roasting a chicken?

  10. Kaypro! Wow! Of course, my first computer was an Osborne (Osbourne?). What a difference a coupla’ decades makes…

    CB

  11. Being a computer geek, I always wonder how all the people who would have been computer geeks, but were born too early, got along. If you were born in the twentieth century, I guess you could play with ham radios and toy train layouts, but what about the poor would-be hacker who was born in feudal Europe.

  12. Cracker’s Boy,

    I had a friend who owned an Osborne when I was selling the Kaypro. I thought the latter was better–it had a bigger monitor, and the company kept adding features. The Kaypro II had the power of dual floppy drives, and the Kaypro 10 had–wait for it–a ten megabyte hard drive. All in an entirely portable and self-contained unit. Portable, that is, if lugging twenty pounds around doesn’t bother you. Oh, and it used a CP/M operating system with a monstrous 64K of RAM. Zooom!

  13. Mike Laursen,

    They probably would have died in childhood. Those who would have made it to adulthood would have been more concerned with growing enough food to avoid starving in the winter.

    During the industrial age, they may have been interested in factory machines or railroads. The automatic looms used something close to programs to operate.

  14. Nah, they’d’ve been into alchemy.

  15. Wow, ProLib, I had a friend in the eighties who swore that Kaypro was the greatest thing going.

    We did both laugh about the “portability” though.

    He lives in Tampa too. Maybe you sold him his. 🙂

  16. Isaac,

    What’s funny is that it WAS a good computer in its heyday. But it took the company too long to accept the MS/IBM standard. Or to crush the standard before it came to life.

    My commission was something like $250 per computer sold, which was insane money for a 17-year old. I didn’t sell all that many, but I was making loads more than most of my friends.

  17. Yeah, I suppose someone in the future will be asking what did nerds do before home genetic engineering labs: oh, they had to settle for playing with computers.

  18. One of the things that I find annoying about the increasing prevalence of advance technology in our everyday lives is that, for good or ill, machines very often become de facto add-ons to our minds and bodies.

    Like, I used to have a bunch of my friends' phone numbers memorized flat. Since I broke down and got a cell phone, I don't know anyone's number, except a few from before. If I lose the phone, I'm screwed.

  19. Randolph Carter,

    Your name really makes me wish there was a video game called “The Dream Quest of Unknown Kadath.” There could even be a secret weapon called Pickman’s Axe!

  20. While the article was mildly amusing, and the comments take me way back (Sinclair or TRS-80, anyone?), it would be nice if reason fact-checked articles, particularly with regard to unfolding history. It’s dismaying to see dates and events treated so haphazardly when the fabric of the Internet’s history is being unraveled before our eyes due to sloppy reporting.

    “In 1990 … “Hyperland” … a couple of years before Web browsers had even been invented.”

    The World Wide Web was *born* in 1990 with the release of the HTML language. However, prior to that, there were several "browsers" that were quite capable of performing gopher/archie/WAIS searches and rendering documents composed in any number of markup languages. Personally, I was using both Erwise and Viola prior to 1990.

    It may seem small, but these types of small mistakes gather steam in our immediate-gratification culture, and tend to pollute future articles. Perhaps Ms. Mangu-Ward is the victim of a previous error?

    It would have been a better article if she had focused on the now-bizarre marketing strategy targeting housewives (did/does anyone really have such a large recipe collection that it couldn’t be simply managed via index cards?), rather than attempting to lay out a timeline.

  21. Ah, a TRS-80 using a tape cassette for memory. Oh, yeah.

  22. I was into computers when you had to know your way around a soldering iron and an oscilloscope AND be able to program in binary.

    I sold about 500 EPROM eraser kits. It was the fastest cheap eraser on the market for a while.

    I also got published in Kilobaud.

    That was great fun.

    However, I like what we have now better.

    I knew Ward Christensen and Randy Suess from the CACHE Club in Chicago. Exchanging programs at 2400 baud was state of the art. When that got to 9600 we were really zooming. My cable is up to 10 Mbaud and I could get faster if I wanted to pay more.

    Ted Nelson came to a few meetings and gave talks and pushed his books.

    The transition from 8″ floppies to 5 1/4 was traumatic. The 5 1/4 is now the standard for size even though it is not used any more in new computers. I still have a few of the old 8″ discs lying around.

