
CLIENT: JON PEDDIE RESEARCH

Dec. 8, 2015: Electronic Design

11 Myths About Computer Graphics


Dr. Jon Peddie, President,
Jon Peddie Research
Over the years the magic of computer graphics (CG) has sparked the imagination and curiosity of millions, specialists and consumers alike. The movies, games, advertisements, science, medicine, and designs we have today would not be possible without CG. And because so much of it is, as Arthur C. Clarke would say, indistinguishable from magic, and unknowable to those not in the priesthood of CG, we make up our own explanations (you older folks might remember the explanations for Les Paul and Mary Ford's original remixed songs).

So, since fools rush in where wise men don't, I've undertaken to look at some of the myths of CG. I'll also favor you with some of my favorite axioms. This no doubt is not going to win me many new friends, but hell, I still have my cat. I do, don't I?

1. Aren't integrated graphics catching up with discrete?

Yes, they've almost caught up to 2010 discrete, and in another five years, they will catch up to today's. But then discretes will be at 2020—Moore's law works for all semiconductors, not just CPUs.
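That constant-lag pattern can be sketched with a toy exponential model. The two-year doubling period and five-year lag below are illustrative assumptions for the sake of the arithmetic, not measured figures:

```python
# Toy model: graphics performance doubles every ~2 years (Moore's-law
# style) for both integrated and discrete GPUs, with integrated
# trailing discrete by a fixed lag. All numbers are illustrative.
def perf(year, base_year=2010, doubling_years=2.0):
    """Relative performance at a given year under steady doubling."""
    return 2.0 ** ((year - base_year) / doubling_years)

LAG = 5  # assumed years by which integrated trails discrete

for year in (2015, 2020):
    integrated = perf(year - LAG)  # integrated matches 5-year-old discrete
    discrete = perf(year)
    print(year, discrete / integrated)  # the ratio never shrinks
```

Under steady doubling the gap ratio is the same in 2015 and 2020: integrated keeps catching up to where discrete used to be, never to where it is.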

2. Enough already. Shouldn't 16 GB of graphics memory and 1-GHz GPU clocks be good enough?

Peddie's axiom #1: "In computer graphics, too much is never enough." The goal of CG is to create an image, or a series of images in the case of movies, that are so realistic the viewer suspends disbelief. We are zillions of gigabytes and gigahertz away from that.

3. All the colors in the rainbow can be generated in 8-bits—256 colors (pause, while I get control over my ROTFL).

The spectrum of colors in a simple rainbow is almost infinite. The acuity of your eye is enormous: it can perceive more than 10 million colors, plus mixed hues, and the upper limit isn't really known. The new 4K TVs with 10-bit color, referred to as high-dynamic color (HDC), bring us over a billion colors, and we can see them, all of them.
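The arithmetic behind those color counts is just powers of two; a quick sketch:

```python
# Number of distinct colors representable at each bit depth.
for label, bits in [
    ("8-bit indexed", 8),        # the classic 256-color palette
    ("24-bit true color", 24),   # 8 bits per RGB channel
    ("30-bit deep color", 30),   # 10 bits per channel, as on 10-bit 4K TVs
]:
    print(f"{label}: {2 ** bits:,} colors")
```

256 colors versus more than a billion: the 8-bit "all the colors of the rainbow" claim doesn't survive the exponent.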

4. Anti-aliasing eliminates the jaggies.

Nope, your brain eliminates the jaggies. All screens today are quantized: little square pixels, little discrete boxes. Think of really small graph paper. Anti-aliasing shades adjacent pixels from the color of the line or edge to the color of the background, and your brain interprets that as a smooth line. That's because your brain doesn't like discordant things, and jaggies drive your eyes nuts as your brain tries to figure out where to focus.
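One common way to do that shading is coverage-based blending: mix the edge color toward the background in proportion to how much of the pixel the edge actually covers. A minimal sketch, with an invented function name and a simple linear blend (real renderers use more elaborate filters):

```python
def antialias_pixel(edge_color, bg_color, coverage):
    """Blend an edge pixel toward the background based on how much
    of the pixel (0.0..1.0) the line or edge actually covers."""
    return tuple(
        round(e * coverage + b * (1.0 - coverage))
        for e, b in zip(edge_color, bg_color)
    )

# A black line crossing half of a white pixel yields mid-gray:
print(antialias_pixel((0, 0, 0), (255, 255, 255), 0.5))  # (128, 128, 128)
```

A row of such part-covered gray pixels along a slanted edge is exactly what the brain reads as a smooth line.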

5. Ray-tracing creates a perfect image.

Not really. Ray-tracing, materials libraries, and global illumination combined can create an (almost) perfect image. Light bouncing off surfaces, influencing and being influenced by everything it strikes, creates the most complex sea of colors and luminance possible, and we still can't fully replicate it in our primitive computers and limited 4K screens.
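At the core of a ray tracer is nothing more exotic than intersection math: fire a ray, find the nearest surface it hits, then trace the bounces. A minimal ray-sphere test as an illustration (the function name and the normalized-direction assumption are mine, not from the article):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest intersection of a ray with
    a sphere, or None on a miss. Assumes direction is normalized."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant (a = 1 for unit direction)
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

# Ray from the origin along +z toward a unit sphere centered at (0, 0, 5):
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

The hard part is not this test; it is running billions of such rays through physically plausible materials and light transport, which is where the "almost" in almost perfect lives.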

6. Stereo 3D gives real depth.

CG is all about tricks: images that trick the mind into thinking it's seeing something it really isn't. Stereovision is a prime example of such a trick. It's a very good trick, but it relies on fabricating imaginary data, objects in the case of movies, games, and TV (you do remember 3D TV, don't you?).

7. Overclocking your CPU will make your game better.

Overclocking your CPU will make you crazy (or crazier, because you'd have to be a tiny bit crazy to do it anyway). By overclocking your CPU and/or your GPU, you can get a game to deliver more frames per second (fps), and that makes for better gameplay. Except it's only good in benchmarks. You can get an fps counter to run while you play a game, but if you're looking at it instead of where you are going, you die.
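The fps number is just the reciprocal of frame time, which is why a modest overclock buys less than it seems. A quick conversion (the figures are plain arithmetic, not benchmark results):

```python
# Per-frame time budget at common frame rates. An optimistic ~10% bump
# from 60 fps to 66 fps saves only about 1.5 ms per frame.
for fps in (30, 60, 66, 144):
    print(f"{fps:3d} fps -> {1000.0 / fps:.2f} ms per frame")
```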

8. A 19-inch HD screen is all you need.

Remember the days when you had a desk, and that desk was covered with papers, usually several layers thick, a kind of vertical filing system in the open? And then you moved to the computer and tried to squeeze a 3D filing system with a surface area of at least 10 sq. ft. onto a 15-in. VGA monitor. So now you spend too much time shuffling apps and documents on screen and not working... which leads us to Peddie's axiom #2: "The more you can see, the more you can do." I use three 30-in. 4K screens, and I want more.

9. Cars flip, stuntmen crash, pigs can fly.

No, no, and, well, probably not. Today's movies are more special effects than they are real. Cars don't flip (see Newton's 2nd law), and people can't fall three stories, get up, brush themselves off, and then chase the bad guys. As for pigs, OK, that's a special case. It's all CG (except for the pig flying part).

10. Things break.

Maybe. In a game, you can have real physics, or what's known as baked-in (or game) physics. In game physics, if you break something, it always breaks the same way. In real physics, if you break something, it breaks depending on how hard you hit it, and at what angle you hit it. You'll never look at a game the same way again. You're welcome.
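The difference can be caricatured in a few lines: baked physics replays an authored fracture, while simulated physics derives the fragments from the impact itself. Both functions and the fracture formula below are invented purely for illustration, not taken from any real engine:

```python
import math

def baked_break(_impact_speed, _impact_angle_deg):
    # Baked ("game") physics: the fracture was authored once and is
    # replayed identically no matter how the object is hit.
    return ["shard_a", "shard_b", "shard_c"]

def simulated_break(impact_speed, impact_angle_deg):
    # Toy "real physics": fragment count scales with the normal
    # component of the impact. An invented formula, for illustration.
    normal = impact_speed * abs(math.cos(math.radians(impact_angle_deg)))
    n = max(2, int(normal))
    return [f"shard_{i}" for i in range(n)]

print(baked_break(5, 0) == baked_break(50, 60))                  # True
print(len(simulated_break(5, 0)), len(simulated_break(50, 60)))  # 5 25
```

Same input, same shatter, every time, versus a break that depends on how hard and at what angle you hit it: that's the tell.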

11. And, the not-a-myth, Blinn's law: "As technology advances, rendering time remains constant."

Most people think the goal in CG is to get computers and GPUs to run faster so that the development of a design or movie will take less time. Aha! But refer back to Peddie's axiom #1. We don't want faster; we want prettier, more believable. If you can give us more compute cycles, we'll spend them making a scene you will gasp at and wonder whether it's real or not. That's what we live for in the CG world.
