As a first blog post, I thought I'd keep this simple and to the point, but relevant to all of us in a way. I was recently shopping for a new laptop to use as my main development and general-purpose machine, replacing a dying older laptop I'd been using for over 3 years. One of the features of my old laptop was the extremely high-resolution display it came with. At the time I bought it, it was a built-to-order unit that I customized in a few ways, including the high-resolution display option. The upgraded display had a resolution of 1920x1200, which is technically a 16:10 aspect ratio. That extra resolution was great for me: I could work on high-resolution files, even 1080p video and graphics, and fit everything on my desktop screen with room left over for the Windows taskbar and my applications' toolbars. I never had to view my work scaled down just to make it fit on screen. There are many things this resolution is great for, especially if you're stuck on a laptop instead of a desktop and can't lug a 24"-32" monitor around with you.
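The pixel math behind that claim is simple: a 1920x1200 panel has exactly 120 more rows than a full-height 1080p frame, which is roughly enough for the taskbar plus a toolbar. Here's a quick sketch (the taskbar and toolbar heights are illustrative guesses, not measured values):

```python
# Vertical pixel budget: does a full 1080p frame plus window chrome
# fit on a 1920x1200 panel without scaling?
PANEL_H = 1200     # 16:10 laptop display height
VIDEO_H = 1080     # full-height 1080p video frame
TASKBAR_H = 40     # typical Windows taskbar height (illustrative)
TOOLBAR_H = 60     # typical app title bar + toolbar (illustrative)

leftover = PANEL_H - VIDEO_H
print(f"Rows left over above the video: {leftover}")                 # 120
print("Fits with chrome:", VIDEO_H + TASKBAR_H + TOOLBAR_H <= PANEL_H)  # True
```

On a 1080p panel, by contrast, the leftover is zero, so the same workflow forces either scaling or scrolling.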
Fast forward to just last month, and I was laptop shopping for a suitable replacement for this now-older laptop I've been referring to. Finding most of my desired specs wasn't difficult, as even the slowest processor chips nowadays are faster than the Core 2 Duo my old laptop had, and the same goes for graphics chips and the like. Again, since this is my main workstation, at home and at the jobsite, and it's a development machine, I like having a powerful machine. I opted for the newest i7 processor, maxed out the RAM, and got a great graphics card. The only really surprising thing I discovered was that at some point in the last 3 years or so, laptops have almost completely killed off any resolution option higher than 1920x1080 (referred to as 1080p). Surely this couldn't be, could it? Even the higher-end (and larger-sized) gaming laptops and desktop-replacement laptops seem to have ditched the 1920x1200 display option these days! I found one HP laptop boasting the higher resolution, but upon closer examination, the model with that spec appeared to have been recently discontinued, ran the older Core 2 Duo processor, and carried an astonishing $3,500 price tag for what was mostly obsolete hardware! The machine I was replacing had about the same specs as that HP, so I knew it was an older model; add in the price, and no thanks.
After looking high and low, I could not find anything suitable, even in the upper price ranges. I'm not sure if Apple had any pickings, but if so, the price would be high, and all my work is done in Windows, so for many reasons that wasn't an option for me. I ended up getting a great machine, considered a gaming machine in the end, with all the top specs, but somehow I still feel a little ripped off by having to downgrade my screen resolution from my previous laptop. This brings up another point: my new laptop came with an NVIDIA card with 3GB of dedicated video memory (running GDDR5!). Three freaking gigs of video memory! That's a lot, especially considering that my older XP machine only had 4GB of RAM installed, of which Windows could only use about 3GB. Video cards have come a long way, but then I ask myself: why? In the old days, screen resolutions got higher and higher with each generation of monitors, and color depth was the first graphics battle, starting out at monochrome and eventually working up to 32-bit color. Those leaps and bounds in monitor resolution and color depth were what started the graphics card race to begin with: you needed more memory to move up to higher screen resolutions and higher color depth. Now I realize much of a modern GPU's capability goes toward 2D and 3D graphics manipulation directly on the card, and other impressive improvements in game rendering and overall performance. But with screen resolutions seemingly stuck at 1080p, what is the point of bigger and better graphics cards? Sure, improving the GPU's processing power helps games and such, but even that demand might eventually slow down or top out if screen resolutions are going to stay capped at 1920x1080 for the foreseeable future.
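To put the memory-versus-resolution point in perspective, here's a back-of-the-envelope calculation (a simple sketch; the function name is my own): a raw, uncompressed 32-bit framebuffer at 1080p is under 8 MiB, so a 3GB card could hold hundreds of them. The extra memory exists for textures, geometry, and shader data, not for driving the display itself.

```python
def framebuffer_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Raw size of one uncompressed frame at 32-bit color (4 bytes per pixel)."""
    return width * height * bytes_per_pixel

for w, h in [(1920, 1080), (1920, 1200)]:
    mib = framebuffer_bytes(w, h) / (1024 ** 2)
    print(f"{w}x{h} @ 32-bit color: {mib:.1f} MiB per frame")

# How many raw 1080p frames fit in 3 GiB of video memory?
frames = (3 * 1024 ** 3) // framebuffer_bytes(1920, 1080)
print(f"Raw 1080p frames that fit in 3 GiB: {frames}")
```

So even if resolutions had kept climbing past 1920x1200, the framebuffer alone would never justify gigabytes of VRAM; the card makers are betting on ever-richer scene data instead.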
As an AV professional, I like HDTV and the like. Blu-ray movies are cool and all, and a great improvement over the old standard they replace, i.e., 480i analog. However, looking back as all the HDTV hoopla and rollouts started hitting the market several years ago, I never anticipated that this "television" format would come to dominate the laptop and computer monitor market as well. For better or worse it has, and maybe it's the manufacturers' price points on these panels that are driving it. Maybe PC makers are gambling that the average Joe thinks 1080p is good enough for TV and hoping it's acceptable for PC displays too. I'm not sure, but I'm a little disappointed that PC monitors stopped getting better once they hit the HDTV mark, and that in some cases the ones that were better are now being phased out. The same thing is happening with desktop monitors, though at least there some higher-resolution options remain; the entry price for them is MUCH higher than it was a few years ago, and the options are more limited as well. What's the deal? Will this change? I guess we'll have to wait and see.