Whatever happened to ... this modern marvel?

URBANA — The next time you pick up that supercomputer in your pocket, take a moment to consider its famous ancestor at the University of Illinois.

The Illinois Automatic Computer, dubbed ILLIAC, went online in 1952, one of the first computers built with a design that is now ubiquitous in the computing world.

It was a 10-by-10-by-2-foot monster, weighing 5 tons. It ran on 2,718 vacuum tubes, and its 5 kilobytes of memory were held in 40 cathode ray tubes (supplemented later by a giant rotating magnetic drum).

And its speed?

It's hard to describe in today's terms, with the average smartphone faster than the supercomputers used in 1980, when computer science Professor William Gropp was a graduate student.

"Machines are literally over a million times faster than when I started in my career," said Gropp, now acting director of the National Center for Supercomputing Applications at the UI.

The ILLIAC could add two numbers in 180 microseconds — about two ten-thousandths of a second. Not bad for the 1950s.

By comparison, a single processing element in a computer chip today can do it in a third of a nanosecond — half a million times faster for each operation, Gropp said. Some chips can perform 1,000 of those at the same time, and computers have many chips. A processor in a typical PC can do a few billion "flops" (floating-point operations per second).
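
For readers who want to check those numbers, here is a minimal back-of-the-envelope sketch in Python, using only the round figures quoted in the story (illustrative magnitudes, not benchmarks):

```python
# ILLIAC I addition time vs. one modern processing element,
# using the figures quoted above.
illiac_add = 180e-6      # 180 microseconds, in seconds
modern_add = 1e-9 / 3    # roughly a third of a nanosecond

speedup = illiac_add / modern_add
print(f"Per-element speedup: {speedup:,.0f}x")       # -> 540,000x, Gropp's "half a million"

# If a chip runs ~1,000 such elements at once, as the story notes:
print(f"Per-chip speedup: {speedup * 1_000:,.0f}x")  # -> 540,000,000x
```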

But ILLIAC holds a prominent place in computer history. It was the first automatic electronic digital computer built and owned by an American university. With others of its generation, it helped set the direction of computer technology for decades to come.

And the ILLIAC line launched computer science at Illinois on a path of innovation in architecture and high-performance computing that continues today, Gropp said.

"These were literally some of the world's first supercomputers," said Professor Rob Rutenbar, head of the UI Department of Computer Science. "It's why we became the world power in supercomputing that we are."

Exhibit A: Blue Waters, the latest supercomputer to be built on campus, which operates at petaflop speed (1 quadrillion floating-point operations per second).

Exhibit B: the HAL 9000. True, it was fictional, but the mellow-voiced and sinister computer in Arthur C. Clarke's novel and film, "2001: A Space Odyssey," officially "became operational" on the UI campus on Jan. 12, 1992.

"The reason Arthur C. Clarke picked Illinois for this amazing machine in his fiction was that it was the place he thought would produce such a thing," Gropp said.

It's fitting, then, that remnants of ILLIAC are held both on the UI campus and at the Computer History Museum in Mountain View, Calif., near San Francisco.

The lobby of the UI Siebel Center for Computer Science has several display cases with components from ILLIAC I and three of its descendants: ILLIAC II (1956-68), ILLIAC III (1960-68) and CEDAR, sometimes referred to as ILLIAC V. A memory drum from ILLIAC I is also part of an exhibit at the Spurlock Museum on campus.

The Siebel display includes a large chassis from ILLIAC I, with circuits that "flip flop" between 0 and 1 for the computer's binary coding system. It also features one of the cathode ray tubes and its memory chassis, a box lined with three layers of shielding.

The Computer History Museum's collection includes logic modules, chassis and programs from ILLIAC I, as well as circuit boards, hard disk drives, coding documents, designs, manuals and other components of ILLIACs II through IV — 88 objects in all. The most impressive is the ILLIAC IV integrated circuit memory system, the size of two refrigerators on wheels, said senior curator Dag Spicer.

The museum was founded in 1979 by legendary computer designer Gordon Bell, who knew Daniel Slotnick, the UI professor who built ILLIAC IV. That connection, Spicer said, is how the museum acquired so many ILLIAC parts.

The ILLIAC line of computers was built at different locations, some at the UI, between 1951 and 1974.

Computers were extremely rare in the 1940s, and after trying and failing to buy one commercially, the UI decided to build its own. According to the Illinois Distributed Museum, an online history of UI innovation, a team led by Graduate College Dean Louis Ridenour sent a proposal in 1949 to UI President George Stoddard recommending that the university build a high-speed computing machine.

It was to be based on the designs of John von Neumann, a brilliant Hungarian-American mathematician and physicist at the Institute for Advanced Study in Princeton, N.J., which counted Albert Einstein among its faculty.

Ridenour, a physicist who'd been recruited to Illinois in 1947, worked on the ENIAC (Electronic Numerical Integrator And Computer) at the University of Pennsylvania in the early 1940s.

The ENIAC and other early computers of the 1930s and '40s took their instructions from punchcards, paper tape and plugboards strung with hundreds of wires that had to be hooked up for each specific job. It might take two weeks to set up a problem that the computer needed only 20 minutes to solve, Spicer said.

The IAS machine designed by von Neumann and two colleagues stored the program electronically in the machine, just as the data was, eliminating the need "for all these crazy plugboards," Spicer said. It could modify itself while it was running and do more complex calculations faster because it wasn't relying on a mechanical system, though punchcards were still used to enter data initially, he said.
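
To make the stored-program idea concrete, here is a toy sketch in Python. It is an illustration of the concept only (the opcodes and memory layout are invented, not ILLIAC's actual instruction set), but it shows the key point: instructions live in the same memory as the data, so a running program can read, and even rewrite, its own instructions.

```python
# A toy stored-program machine (invented opcodes, not ILLIAC's real
# instruction set). Program and data share one memory.
memory = [
    ("LOAD", 7),    # address 0: load the value at address 7
    ("ADD", 8),     # address 1: add the value at address 8
    ("STORE", 9),   # address 2: store the result at address 9
    ("HALT", 0),    # address 3: stop
    0, 0, 0,        # addresses 4-6: unused
    2, 3,           # addresses 7-8: the data
    0,              # address 9: the result lands here
]

accumulator, pc = 0, 0            # one register and a program counter
while True:
    op, addr = memory[pc]         # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":             # a STORE could just as easily rewrite
        memory[addr] = accumulator  # an instruction: self-modifying code
    elif op == "HALT":
        break

print(memory[9])                  # -> 5
```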

The new design was also much smaller than the ENIAC, which was "about the size of a basketball court," he said.

There were 17 machines around the world based on von Neumann's design, all with the suffix "-AC," he said. Most notable, perhaps, was MANIAC, used in the production of the hydrogen bomb at Los Alamos National Laboratory.

The ILLIAC's design was identical to ORDVAC, a computer created for the Ballistic Research Laboratory in Aberdeen, Md., according to historical accounts by the UI Archives.

The lab had proposed that the UI build two machines, one for the lab and one for Illinois to keep. ORDVAC — Ordnance Discrete Variable Automatic Computer — was completed first and was used to solve ballistics and weapons systems problems.

ILLIAC became operational on Sept. 22, 1952. It featured an arithmetic unit, memory storage (the tubes), devices for input and output of information, and the ability to transfer information. A separate teletypewriter provided output in printed form, according to the UI Archives.

It came to be called the "electronic brain" because of its circuits for storing, transmitting, adding, subtracting, multiplying and dividing numbers in digital form.

In the '50s, computers were mostly involved in numerical computations, said Bill Gear, former computer science department head and now an emeritus professor at Princeton. As a graduate student, he was offered a research assistantship to write code for the ILLIAC I. He also worked on the design of the ILLIAC II, which came online during his first year on the UI faculty, and he was responsible for all of its software.

More abstract concepts, such as artificial intelligence, came later, Gear said, although some people were already involved in preliminary attempts.

"I recall attending a lecture at my undergraduate university in Cambridge, UK, about machine translation. The speaker was quite optimistic about its prospects, predicting that the problem would be solved 'fairly soon' — even though the computer they had at Cambridge was much smaller and slower than the ILLIAC I, which itself had less than the power of the first PC and much less memory," he said. "The speaker was only out by about half a century!"

But ILLIAC also performed advanced computational analyses and administrative tasks — evaluations of radar and antenna patterns, atomic blast effects, stresses in materials used for the construction of highway bridges, and even the composition of music.

By 1956 it had more computing power than all computers in Bell Labs combined. After the 1957 launch of Sputnik, ILLIAC was used to calculate the satellite's orbit.

It was also used in the revolutionary computer educational system known as PLATO (Programmed Logic for Automatic Teaching Operations) developed by former UI Professor Donald Bitzer in the 1960s. He rigged ILLIAC to work with a television output and started to program the very first online courses. His work led to innovations like the first keyboard input system, plasma display and touch screens, interactivity with other users and even instant messaging.

ILLIAC I was decommissioned in 1963 when ILLIAC II came online.

Each iteration of the ILLIAC featured novel computer hardware or architecture, Rutenbar said.

ILLIAC II, a hundred times faster than the original, was one of the first computers built out of transistors and semiconductors — not vacuum tubes, which were hot, weighed a ton and didn't last very long, he said.

Gropp remembers reading about students on roller skates moving around to replace vacuum tubes in the ILLIAC.

"It was a technology that was just about at the edge of what made sense, because it was essentially so unreliable," he said. "The parts had to be replaced so often."

ILLIAC III grew out of a pattern recognition computer developed by the Department of Physics, enabling researchers to analyze visual data.

ILLIAC IV was one of the first attempts to build a massively parallel computing machine. But it didn't stay at the UI. In 1970, when the public learned that a majority of the computer's time would be used by the Department of Defense, campus protests led to its removal to NASA's Ames Research Center in California.

CEDAR, a shared-memory supercomputer completed in 1988 by a team led by Professor David Kuck, is sometimes referred to as ILLIAC V.

The ILLIAC name continued to be used through 2006, when the Trusted ILLIAC was completed at the UI Coordinated Science Laboratory and Information Trust Institute.

Gropp said ILLIAC and other von Neumann machines set the standard for the way all computers are broadly organized today: a central processing unit performs actions on data and instructions that are stored elsewhere in the machine, surrounded by layers of memory — small, fast cache memory, then main memory, files, tapes and so on.

"It's an elegant model," he said. "Now all of our programming languages, the way we instruct the computer to do something for us, reflects this model, for good or bad." Its simplicity made it possible to make computers faster and faster and design ways to program them in more sophisticated ways, he said.

The UI's foresight to start investing in the ILLIACs a half-century ago created a "technology culture" on campus that remains today, Rutenbar said.

"Of course we'll have one of the fastest computers on the planet. It's who we are. It's what we do. And if we have to invent a whole bunch of hardware or invent a whole bunch of software to make this stuff happen, OK, we can do that," he said.

"I think you can trace a fairly straight line from the sort of pioneering innovations of the early ILLIACs to things like the NCSA, which is literally the apex of supercomputer centers. We're right up in the world list of biggest, baddest machines," Rutenbar said.

 

Editor's note: This story has been updated to correct information about the MANIAC computer at Los Alamos, which was used for work on the hydrogen bomb, not the first atomic bomb.

Comments

airrecon wrote on February 19, 2017 at 10:02 am

I believe Mr. Bitzer's first name was Don, not Dan.

Dan Corkery wrote on February 20, 2017 at 10:02 am

Thank you. Donald Bitzer is correct.

Dan Corkery

managing editor for administration

sifiglich wrote on February 20, 2017 at 11:02 am

Also, this is a small nit, but 180 microseconds is about two TEN-thousandths of a second, not two thousandths (0.000180).  So you've cheated poor ILLIAC out of an order of magnitude of processing speed! :)

Doesn't change the fact that a modern smartphone is so, so much faster than those pioneering machines that it's actually hard to intuitively understand the difference in capability.

Sherwin Gooch wrote on February 24, 2017 at 2:02 am

According to David Kuck in his lecture to our class circa 1974, although ILLIAC I was designed slightly later than ENIAC, the ILLIAC I was debugged and became operational before ENIAC. The ENIAC team and Penn perpetually claim to have built the first electronic digital computer, but as an experienced UIUC-educated computer engineer, as a child who played tic-tac-toe against ILLIAC I at Engineering Open House, as a local hobbyist who built projects from the bushel baskets of used ILLIAC 12AT7's, and as an 11-year-old would-be hacker who crashed ILLIAC I in 1963 (while it hosted PLATO II), I find this insulting: How can an engineering team reasonably justify the claim that they were first to build an electronic digital computer when, to do so, they must claim that their implementation of the computer hardware was completed before that hardware became operational? What a sham! For shame!!! Nonsense!!! We Illini computer engineers must claim our just place in history!!! As engineers we were best at actually getting the rubber to meet the road in 1952, and we still are.

I feel strongly that someone at UIUC should set the record straight. I publicly call upon the President of The University of Illinois to form a committee to see that UIUC's record of having constructed the first electronic digital computer is properly attributed. Perhaps we UIUC alumni should consider withholding contributions until this miscarriage of history has been rectified.

Who agrees with me?

Sherwin Gooch

Silicon Valley, CA