Friday, January 10, 2014

On the history of how to waste CPU cycles

If you've ever sat in front of a Windows machine thinking "stop animating the fucking dog and just tell me where the fucking file is", it turns out that operating system developers' penchant for wasting computing power on computationally spurious activities, at the expense of making useful functions instantaneous, has a longer history than you might think.

Wired Magazine has recently published a talk Steve Jobs gave way back in 1980 (courtesy of the Computer History Museum archives) in which, even at that time, he expressed the opinion that computers essentially had the raw processing power to meet the needs of many users' tasks, and envisaged a future where latent processing power would be spent on what we might term "user friendliness". Of course, by today's standards, he was surely setting quite a low bar in terms of the tasks that users expected to complete: on a spectrum running from "text editing on a green-screen monitor" to "editing HD video in real time", people's expectations were generally towards the former end of the scale... But three decades on, it's interesting to note that many of Jobs' projections have come true: a modern OS dedicates much of its code base and resources to what is essentially UI sugar.

What apparently wasn't envisaged by Jobs in the early 1980s (nor, indeed, by those designing many of the fundamental communications protocols that still underlie much of the Internet) was the proportion of resources that would need to be dedicated to security. How times have changed...

Thursday, January 2, 2014

Playing the game dev game

For those wishing to get a flavour of what it's like battling it out in the computer game industry without actually developing a game, check out Game Dev Story. This mobile game lets players try their hand at hiring, firing and training developers while being subject to the whims and volatility of the console market.