Ten years of Nvidia

I haven't been writing actively for a couple of years, and now that I have a moment to catch my breath, I'm working on remedying that absence. I just took a quick scan through drafts I'd started and tripped over one from 2013 (yes, ten years ago). The draft was really an email I'd sent, pasted into the editor to be picked up later. At the time it may have been somewhat interesting, but in hindsight I think it's more impactful. Here's the email - the original context was me trying to convince my ex-wife that a game development class my son was interested in wasn't a waste of time:

I sort of hinted at some reasons I thought the "intro to gaming" class [my son] was looking at next semester might be more useful than it sounds, but I probably wasn't very clear about it.

Here's an announcement from this morning from a graphics card mfg:

http://www.anandtech.com/show/7521/nvidia-launches-tesla-k40

http://www.engadget.com/2013/11/18/nvidia-unveils-tesla-k40-and-ibm-deal

http://www.anandtech.com/show/7522/ibm-and-nvidia-announce-data-analytics-supercomputer-partnership

If you actually look at the picture of the card, you'll notice there's nowhere to plug in a monitor, because this graphics card isn't really a graphics card in the same sense that we think of them.

As these cards have become crazy-specialized & powerful in the context of their original purpose, their insane number-crunching capabilities haven't been lost on people beyond game developers. In the same way that you see these cards supporting physics engines in games, they're now becoming more and more commonly used in scientific and engineering applications, because this same type of computing can be used for simulations and other scientific uses.

As I mentioned to [my son] when he was working in Matlab, the type of programming needed for this sort of processing is very different from the traditionally-linear do-one-thing-then-do-the-next-thing style of programming used for "normal" CPUs -- by its nature, it has to be asynchronous and massively parallel in order to take advantage of the type of computing resources offered by graphics-type processors.

Cutting to the chase, even though "game programming" probably sounds like a recreational activity, I think there's a decent chance that some of the skills touched on in the class might translate reasonably well into engineering applications -- even if he doesn't necessarily see that during the class.

Damn. Right on the money. Ten years on, Nvidia rules AI. Why? It's the chips and the programming model. All the reasons I cited to give game programming a chance have come to pass, and for what it's worth, the kid wound up using Nvidia GPUs to build neural networks in grad school.
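If "the programming model" sounds abstract, here's a minimal sketch of the massively-parallel style I was describing in that email. It's a generic CUDA vector-add (nothing from the class or from my son's grad work, just an illustration): instead of a CPU loop that walks the array one element at a time, you launch enough threads that each one handles a single element.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per element -- the "do everything at once" style,
// as opposed to a CPU loop that does one thing, then the next.
__global__ void add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;              // ~1M elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);       // memory visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    add<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();            // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);        // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

That shift in mindset, from sequential loops to thousands of threads running the same tiny function, is exactly the thing a game-programming class nudges you toward, whether or not anyone calls it that.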

Game programming, indeed.
