
When Does Choosing A Better Computer Become Wasteful?

No, I'm not referring to green computing devices. Though, apparently, computers account for 2% of the world's carbon emissions. I'm swapping my nearly four-year-old PC notebook for a new 15" MacBook Pro. Can you say upgrade? Like many buyers, I'm tempted to get the fastest possible machine with the most memory, given my budget.

But I keep thinking about over-processing. It's wasteful to get a tool that's more powerful than what's needed for the job. Here's some of my thinking.

  • How fast? I do minimal multimedia work. Mostly, I access databases and documents on a local network and remotely, create text-based documents, and work on the web. But time is money (my time ends up being my clients' money, to be precise). So, I decided to get the fastest available processor along with a solid state drive. I can always upgrade RAM, but I predict 4 GB will be plenty for 95% of my work.
  • How much storage? With a 500 GB hard drive I could save data for years to come without worrying about usable disk space. But I've only got 65 GB of data now. So I decided to get the 128 GB drive. I can always upgrade when I near capacity. And who knows what cloud storage options will look like then.

The hardest decision was whether to get a solid state drive. Ultimately, I chose one because they're more reliable (no moving parts) and run cooler (no motor). The result is a more efficient machine, with the related benefit of a longer battery life. I decided to go with the Apple OEM drive rather than with a third party upgrade. There may be better after-market drives out there, but I'd rather avoid any potential problems with warranties and service. If there's a problem, it's Apple's to fix. Period.

Now, it wasn't that hard to identify the right machine for today's work. The over-processing analysis would have been easy from that standpoint. But predicting the appropriate tool for two to four years from now? Given the extraordinary rate of change in consumer electronics and the web -- who knows what we'll all be doing then. That's what made this a challenge.

Has anyone else experienced this challenge when buying a computer? From an enterprise IT perspective, our firm certainly has, and larger organizations must have it even worse.

Imagining Personal Computing In The 1960s

Circa 1969, here's how folks foresaw personal computing and the internet:

Funny how they assume technology will change but gender roles won't.  Not to mention hairstyles and interior design.

But pretty well predicted, really -- folks certainly envisioned the basics of modern e-commerce and email. And the video even showcases one of my personal favorite innovations: three monitors.

(hat tip Kottke and Andrew Sullivan)

It's the Network, Stupid

It's official in my book.  At least for the near future, network performance is the limiting reagent for computing:

“Carrier networks aren’t set to handle five million tablets sucking down 5 gigabytes of data each month,” Philip Cusick, an analyst at Macquarie Securities, said.

Wireless carriers have drastically underestimated the network demand by consumers, which has been driven largely by the iPhone and its applications, he said. “It’s only going to get worse as streaming video gets more prevalent.”

An hour of browsing the Web on a mobile phone consumes roughly 40 megabytes of data. Streaming tunes on an Internet radio station like Pandora draws down 60 megabytes each hour. Watching a grainy YouTube video for the same period of time causes the data consumption to nearly triple. And watching a live concert or a sports event will consume close to 300 megabytes an hour.
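Those per-hour figures make the squeeze easy to quantify. A quick back-of-envelope sketch (assuming 1 GB = 1,000 MB, and taking "nearly triple" browsing to mean roughly 120 MB/hour for YouTube) shows how few hours of each activity fit inside a 5 GB monthly allowance:

```python
# Rough per-hour data rates quoted above, in megabytes.
MB_PER_HOUR = {
    "web browsing": 40,
    "internet radio": 60,
    "youtube video": 120,   # "nearly triple" the browsing rate (assumption)
    "live video": 300,
}

CAP_MB = 5 * 1000  # 5 GB monthly allowance, treating 1 GB as 1,000 MB

# Hours of use per month before hitting the cap, for each activity alone.
for activity, rate in MB_PER_HOUR.items():
    hours = CAP_MB / rate
    print(f"{activity}: about {hours:.0f} hours/month")
```

At the live-video rate, the whole monthly allowance is gone in well under an hour a day.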

Debates over the 16 nm node barrier and other theoretical limits of Moore's Law are certainly more interesting.  But in terms of what really constrains our ability to use technology, I think network issues will predominate.

Most people I know are reasonably happy with the speed of their computing devices, especially with newer devices.  But who doesn't wish for faster connectivity?

It's interesting how the size of operating systems is leveling off, or even shrinking, and virtualization is helping to maximize existing infrastructure.  Homegrown processing power is a sideshow.  The network is the main event.  Witness the rise of web apps, cloud computing, internet media, gaming, and an increasingly mobile or remotely-based workforce.  The trends don't bode well.

I'm not sure why industry was unprepared for this.  It's not as if these trends were unforeseeable. How we solve this emerging problem should be interesting.

D. Mark Jackson
