
Inside the Box: The End of WIMPy Computers

The Apple Lisa, a GUI-based personal computer with a WIMP interface, circa 1983 (photo: Wikipedia)

The first time I used a computer, the only thing I could get it to do was generate syntax errors. A “syntax error” is what you get when the code you’ve written breaks the rules of the programming language, leaving the computer unable to interpret it.

The year was 1984 and I was a high school sophomore taking “Computer Programming.” We were learning to program in BASIC on Apple IIe computers, and I was sort of a prodigy when it came to generating syntax errors. My teacher, who was also my basketball coach, told me I had a better chance of becoming an NBA basketball player than I did a computer programmer.

I didn’t play in the NBA nor did I become a computer programmer. I did, however, take a course entitled “Introduction to the Apple Macintosh Computer” while I was in college that set in motion my career in tech.

Released in 1984, the Apple Macintosh was Apple’s second iteration of its mass-market GUI-based personal computer. “GUI” stands for “Graphical User Interface”: today, that’s the familiar collection of windows, dialog boxes, menus, and icons that you click on to execute tasks on a computer, tablet, or smartphone.

I loved the Macintosh. No typing of commands to execute code. No computer programming required. No syntax errors. Just move the mouse pointer with your hand and click on icons to launch applications and select functions from menus. It was magical, and I was hooked.

Both the Macintosh and Microsoft Windows operating systems can trace their roots back to the Alto personal computer developed at Xerox’s Palo Alto Research Center (PARC) in 1973. The Alto became well known, and its GUI was touted as the future of computing by early tech entrepreneurs in Silicon Valley.

In fact, Steve Jobs and a team of Apple engineers visited Xerox PARC in 1979 for a demonstration of the Alto. It is no coincidence that those same engineers built Apple’s first GUI personal computer called “Lisa” in 1983. It’s also no coincidence that Microsoft, which dominated the personal computer market with its MS-DOS operating system, released Windows, a GUI-based operating system, in 1985.

Fast-forward some 40 years, and the GUI-based operating system remains the dominant computing paradigm. Microsoft Windows still commands about 72 percent of the global desktop operating system market, with Apple a distant second at 16 percent. Meanwhile, there are more than 7 billion smartphones worldwide, all running GUI-based operating systems; Android currently captures about 72 percent of the global smartphone operating system market, with Apple’s iOS a distant second at 28 percent.

All of this is about to change.

“I think user interfaces are largely gonna go away,” said Eric Schmidt, former CEO of Google, in a recent interview.

“You can talk to them [AI agents]. You can tell them what you want and the UI can be generated for you,” Schmidt said. “Why do I have to be stuck in what is called the ‘WIMP’ interface—Windows, Icons, Menus, and Pulldowns—that was invented at Xerox PARC 50 years ago? Why am I still stuck in that paradigm? I just want it to work.”

We’re not so much “stuck” in this WIMPy computing paradigm as settled into it: it has worked quite well, and we haven’t invented a better, widely adopted alternative.

But that’s mostly because, until quite recently, computers weren’t very good at understanding human language and had to be given specific instructions in a language they could understand. Modern computers are binary systems that perform computation using electronic switches that are either on or off. All data processed and stored by a computer is converted into a binary format of 1s and 0s. This is referred to as “machine language,” and it is the only language computers understand.

Machine language looks like this: 01001000 01100101 01101100 01101100 01101111.

Humans are very bad at reading machine language. When we greet someone, we just say, “Hello,” not the string of bits above, which is how a computer represents “Hello” in machine language.
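If you’re skeptical, you can verify that string of bits yourself. Here’s a small sketch in Python (one convenient language among many) that encodes “Hello” into the same 8-bit binary codes:

    # Encode "Hello" as binary: each character becomes its 8-bit ASCII code.
    text = "Hello"
    bits = " ".join(format(byte, "08b") for byte in text.encode("ascii"))
    print(bits)  # 01001000 01100101 01101100 01101100 01101111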

Computer programming languages are highly structured formal languages that humans (well, some humans) can use to tell a computer what to do. Those instructions then pass through a “compiler” that translates the code into machine language the computer can understand and execute.
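You can even watch a version of that translation happen. Python, to take one example, compiles source code into bytecode rather than raw machine language (an intermediate step, but the same idea), and it will show you the low-level instructions it produces:

    import dis

    # Compile one line of human-readable source code...
    source = 'print("Hello")'
    code = compile(source, "<example>", "exec")

    # ...and disassemble it into the low-level instructions
    # that actually get executed (bytecode, in Python's case).
    dis.dis(code)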

Today’s AI systems run on this same computer architecture and process information using binary data. But they’re not programmed the same way as traditional applications like, say, Microsoft Excel. Traditional software operates in an environment where every instruction and condition has to be explicitly programmed.

While they still use programming languages and algorithms, AI systems rely on “machine learning”: they are architected to be trained on large datasets, learning patterns and making decisions rather than following explicitly coded rules. And while traditional programming languages remain under the hood, these systems can be given instructions in human language. In other words, human language is becoming the “programming language” for AI.
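As a concrete illustration, here’s a minimal sketch of “programming” an AI with a plain-English instruction, assuming the OpenAI Python SDK and an API key already set in your environment (the model name is illustrative):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # The "program" here is just an English sentence.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user",
                   "content": "Explain what a GUI is in one sentence."}],
    )
    print(response.choices[0].message.content)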

Many of today’s most popular generative AI systems, like OpenAI’s ChatGPT, Google’s Gemini, and Anthropic’s Claude, still rely on the WIMP model for interaction, with users typing prompts into a GUI. But that’s rapidly being replaced by humans interacting with AI systems using spoken language.

And if Eric Schmidt is right (and I believe he is), these generative AI systems will generate customized interfaces for us on the fly, letting us visualize and manipulate data in collaboration with AI.

Eventually, our WIMPy computers will be beaten out of existence by strong AI capable of creating personalized immersive simulations populated with interactive avatars, entire virtual worlds where we won’t just create, process, and consume information—we’ll experience it.

Scott Dewing is a technologist, teacher, and writer. He writes the technology-focused column “Inside the Box” for the Jefferson Journal. Scott lives on a low-tech farm in the State of Jefferson. He was born in the same year the Internet was invented and three days before men first landed on the moon. Scott says this doesn’t make him special, just old.