A look back at the pre-GPU era

For most people, the story of graphics cards begins with Nvidia and AMD. The reality is completely different.

Long before either of these companies started dominating the market, and long before GPUs powered modern gaming, we were witnesses to the early days of computer graphics. To help understand how we ended up with modern GPUs, we’ll have to go back to a time before the term “graphics card” existed.

What happened before the advent of graphics cards?

For most people, computer graphics wasn’t a thing yet.

Televideo ASCII character mode terminal. Credit: Wikimedia Commons

In order for us to now be able to discuss all the new GPUs coming out in 2026, someone had to invent computer graphics in the first place — and that’s not the same thing as a GPU, not even close.

Early computers did not display images, windows, or even pixel-based graphics of any kind. Most of the output was text-based. Electromechanical teletypewriters, such as the 1963 Teletype Model 33, which adopted the then-new ASCII standard, were essentially glorified typewriters, spewing out results on paper line by line.

They were painfully slow, loud, and very literal. There were no visuals to speak of other than the text you told the computer to print.

Next came video terminals, often referred to as “dumb terminals.” They were essentially keyboards connected to monitors, but they were not computers themselves: they connected to a host computer, and the terminal simply displayed whatever the host sent. The screen was divided into a fixed grid of character cells (usually 80 columns wide), and each cell could display one character from a predefined set. This allowed people to get creative and produce some basic ASCII art, but it was all done using characters that were pre-programmed and assigned to specific keys.

The 1960s saw the emergence of computer graphics in some form

Even while most people were still stuck with text, some researchers were already experimenting with interactive graphics. In 1963, Ivan Sutherland created Sketchpad, a system that allowed users to draw and manipulate line drawings directly on the screen. Sounds very similar to today’s touchscreens, doesn’t it? Sketchpad achieved this with a light pen rather than a finger.

Interactive computer graphics were also taking shape in major corporate systems. IBM began shipping graphics terminals such as the 2250 in 1965.

The introduction of the personal computer accelerated the development of graphics

But this still had little to do with graphics cards.

IBM PC Model 5150 with a monochrome display. Credit: Wikimedia Commons

The late 1970s brought us the advent of personal computers. Not as we know them today, but the concept in general. Before that, computers were huge machines that took up entire rooms and were used by businesses; the personal computer made computing widely available. That, in turn, made computer graphics far more accessible.

The early days of computer graphics gave us low-resolution, often monochrome displays. Personal computers had limited memory, which meant that programmers had to be clever to achieve something remotely attractive in terms of graphics output.

Devices like the Radio Shack TRS-80 provided raster graphics, but at a very low resolution (128 x 48).

Before graphics cards and accelerators, the CPU and system memory played a huge role in display output. Because early personal computers had kilobytes, not megabytes, of RAM, storing full-screen images was expensive and impractical. Graphics were minimal and aggressively reused wherever possible.

This was a time before any standard image formats: no JPEGs, BMPs, or PNGs. Software had to store images as raw bitmaps or in custom data structures, and compress them like there was no tomorrow.
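To put that in perspective, here is a quick back-of-the-envelope sketch (my own numbers, not from the original article) of how much memory an uncompressed full-screen bitmap needed at a few early display modes; the resolutions and bit depths are the well-documented historical values.

```python
# Rough back-of-the-envelope sketch (not from the article): how much RAM an
# uncompressed full-screen bitmap needs at a few early display modes.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Size of an uncompressed bitmap covering the whole screen."""
    return width * height * bits_per_pixel // 8

modes = {
    "TRS-80 block graphics (128 x 48, 1-bit)": framebuffer_bytes(128, 48, 1),
    "CGA color graphics (320 x 200, 2-bit / 4 colors)": framebuffer_bytes(320, 200, 2),
    "CGA high-res mono (640 x 200, 1-bit)": framebuffer_bytes(640, 200, 1),
}

for name, size in modes.items():
    print(f"{name}: {size:,} bytes")

# On a machine with 16-64 KB of RAM in total, even these modest modes eat a
# large slice of memory, which is why graphics were reused and compressed so
# aggressively.
```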

IBM’s early display standards shaped computer graphics

Apple was also at the forefront with the Macintosh.

Apple Macintosh 128K at the Apple Museum in Prague. Credit: Wikimedia Commons

IBM was a big player in the early days of personal computing. In 1981, IBM introduced the Personal Computer, followed by the PC-XT in 1983. However, the original IBM PC had no graphics power. Most workloads were still text-based, so text legibility and reliable software were the main concerns.

The IBM PC could be configured with the Monochrome Display Adapter (MDA), which was cutting edge at the time and provided high-contrast text with no bitmap graphics at all. It was used for word processing, databases, and spreadsheets.

The Color Graphics Adapter (CGA), also introduced with the IBM PC, finally added basic color graphics, but of course, the color palette was very small and everything was very low-resolution. But hey, at least we had some graphics.

Apple took a different approach with its early Macintoshes, or Macs, as we know them today. The Mac treated the screen as a bitmap rather than a pre-programmed character grid, giving the user far more freedom. Those choices helped Apple make headway in industries like graphic design and publishing, and Macs remain a popular option for those workloads today.

Although IBM wasn’t very generous with graphics in those early days, it did something far more important: it helped standardize PC display modes across the IBM-compatible ecosystem. As developers and manufacturers built their products to be compatible with IBM’s de facto standards, the door was wide open for computer graphics to finally flourish.

The emergence of 2D graphics was a pivotal moment in the world of computing

However, we are still a long way from today’s GPUs.

IBM PS/1 computer. Credit: Wikimedia Commons

By the late 1980s, bitmap graphics modes had become mainstream on IBM-compatible computers, and the advent of Windows meant more software was written for pixel-addressable displays. Text modes didn’t disappear overnight, but basic graphics became the norm. In 1987, IBM introduced VGA with the PS/2 line, and it became the baseline display standard for the PC; although the VGA connector is quite obsolete now, it was revolutionary at the time.

Finally, VGA technology made personal computers capable of displaying near-realistic images, games, and movies. It also greatly expanded the range of practical resolution and color options in computers, introducing the popular 256-color mode.

Even if most video was still small, heavily compressed, and generally unimpressive by today’s standards, it was groundbreaking at the time. More importantly, VGA finally became a baseline that developers could target, accelerating the graphics revolution. SVGA arrived in the early 1990s, and personal computers could finally run higher resolutions, up to 1024 x 768.
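As a rough illustration (my own arithmetic, not from the article), the numbers show why VGA’s 256-color mode was such a sweet spot for programmers, and why SVGA pushed video memory requirements up:

```python
# Rough sketch (assumptions mine, not from the article): VGA's popular
# 256-color mode stores one byte per pixel, so a full 320 x 200 frame fits
# inside a single 64 KB block of video memory.

def frame_bytes(width, height, bytes_per_pixel=1):
    return width * height * bytes_per_pixel

print("VGA 320 x 200, 256 colors:  ", frame_bytes(320, 200), "bytes")    # 64,000
print("SVGA 800 x 600, 256 colors: ", frame_bytes(800, 600), "bytes")    # 480,000
print("SVGA 1024 x 768, 256 colors:", frame_bytes(1024, 768), "bytes")   # 786,432

# Higher SVGA resolutions no longer fit in a 64 KB window, so cards needed
# more video memory and tricks like bank switching to expose it to software.
```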

However, the CPU was still largely handling graphics, and dedicated “graphics cards” as we call them today were not yet common, although display adapters like CGA existed. But by the late 1980s and into the 1990s, 2D acceleration became more widespread.

Early 2D acceleration was implemented as dedicated hardware on add-in cards (again, they weren’t called graphics cards at the time). Over the course of the 1990s, that acceleration logic was increasingly built into mainstream VGA/SVGA cards.

These 2D accelerators did a lot of the heavy lifting in making your PC feel more responsive, offloading common operations such as filling rectangles and copying blocks of pixels (blitting) from the CPU. More importantly, they demonstrated that computer graphics was a workload worthy of dedicated hardware, which eventually led us to the GPUs we know today.
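To give a feel for the kind of work that was offloaded, here is a hypothetical sketch (not actual hardware behavior or any real driver code) of a software blit, the per-pixel copy loop the CPU would otherwise run every time a window moved or a sprite was drawn:

```python
# Hypothetical illustration (not from the article): a naive software "blit",
# the pixel-copying loop the CPU had to run before 2D accelerators took over
# operations like moving windows or drawing sprites.

def software_blit(src, dst, src_x, src_y, dst_x, dst_y, width, height):
    """Copy a width x height rectangle of pixels from src to dst.

    src and dst are 2D lists (rows of pixel values) standing in for
    framebuffers; real hardware performed the same copy over video memory.
    """
    for row in range(height):
        for col in range(width):
            dst[dst_y + row][dst_x + col] = src[src_y + row][src_x + col]

# Moving a 640 x 480 window means copying roughly 307,000 pixels in a loop
# like this, every single frame; a 2D accelerator performs the same copy in
# dedicated hardware without tying up the CPU.
```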
