
Intel warns of impending high-resolution explosion

Intel predicts that Ultrabooks, laptops and all-in-one systems with 'retina'-class displays will be available as early as 2013.

Intel has indicated that it expects laptops and desktops to go high-resolution as early as next year, with the PC market following Apple into the world of the 'retina-class' display.

Unlike Apple's efforts, which are limited to small-scale screens on its iPhone and iPad products, Intel predicts that high-resolution displays will be the order of the day across ultrabooks, laptops and all-in-one systems.

High-resolution computer monitors are nothing new, of course, but typical laptops top out at 1920x1080. Intel's vision of the future, outlined in slides obtained by Liliputing, sees 11in Ultrabooks getting displays capable of 2560x1440, or around 250 pixels per inch.

The larger 13in Ultrabooks go a step further: Intel predicts that by 2013, these displays will offer a 2800x1800 native resolution, while 15in laptops will sit somewhere in the region of 3840x2160. Large-format 21in all-in-one systems, meanwhile, will offer a display of around 220 pixels per inch at 3840x2160.
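Those pixel-density figures are easy to sanity-check, since PPI is just the diagonal pixel count divided by the diagonal size. A quick sketch in Python (the diagonals below are our assumptions of common panel sizes, not figures from Intel's slides, which is why the 21in number lands nearer 205 PPI than 220):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a resolution and a diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Diagonals are assumed common panel sizes, not Intel's figures.
for name, w, h, diag in [
    ("11in Ultrabook",  2560, 1440, 11.6),
    ("13in Ultrabook",  2800, 1800, 13.3),
    ("15in laptop",     3840, 2160, 15.6),
    ("21in all-in-one", 3840, 2160, 21.5),
]:
    print(f"{name}: {ppi(w, h, diag):.0f} PPI")
# -> roughly 253, 250, 282 and 205 PPI respectively
```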

The differing pixel densities, which see hand-held devices hit 300 pixels per inch, laptops hit 250 pixels per inch and all-in-one systems hit 220 pixels per inch, should all allow for a 'retina'-like experience, Intel claims. The reason for this is the difference in viewing distances: the further away from the display you are, the lower the pixel density required to make individual pixels disappear and a smooth image appear in their place.
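That reasoning can be made concrete with the common one-arcminute rule of thumb for 20/20 vision: a display counts as 'retina'-class once a single pixel subtends less than about 1/60 of a degree. A minimal sketch, with viewing distances that are our assumptions rather than Intel's:

```python
import math

ARCMINUTE = math.radians(1 / 60)  # one arcminute, the classic 20/20 acuity limit

def retina_threshold_ppi(viewing_distance_in):
    """PPI beyond which adjacent pixels can no longer be resolved at this distance."""
    return 1 / (viewing_distance_in * ARCMINUTE)

for device, distance_in in [("hand-held", 12), ("laptop", 16), ("all-in-one", 24)]:
    print(f"{device} at {distance_in}in: {retina_threshold_ppi(distance_in):.0f} PPI")
# -> ~286, ~215 and ~143 PPI: the further away, the lower the threshold
```

Under these assumed distances the thresholds come out below Intel's 300/250/220 targets, which is consistent with the claim: at those densities, individual pixels should vanish.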

While Intel isn't confirming that future products will definitely come equipped with high-resolution displays - it can't, given that it doesn't actually make any display panels itself - it is warning that the ecosystem should start preparing as if it were a given.

There's plenty to do: as owners of Apple's new iPad are finding, having a high-resolution display is nothing without the infrastructure behind it. Web developers will need to ensure graphics are of a high enough resolution that they won't appear blocky or blurred, or switch to a scalable format such as SVG; games developers will need to use higher-resolution textures, which in turn means that graphics card makers will need to equip their hardware with more memory and higher processing power; even streaming media may have to look beyond the usual 1080p High Definition format to keep viewers happy.
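To put rough numbers on the graphics-memory point, consider just the framebuffer: a 32-bit, triple-buffered swap chain scales linearly with pixel count, before a single texture is loaded. A back-of-the-envelope sketch (buffer count and bit depth are our assumptions, not figures from the article):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Rough triple-buffered 32-bit framebuffer footprint in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

for w, h in [(1920, 1080), (2800, 1800), (3840, 2160)]:
    print(f"{w}x{h}: {framebuffer_mib(w, h):.0f} MiB")
# -> ~24, ~58 and ~95 MiB before a single texture is loaded
```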

Although Intel is predicting the appearance of high-resolution devices as early as next year, it will likely be a while before the format reaches majority saturation: industry watcher StatCounter revealed that 1366x768 has overtaken 1024x768 as the dominant screen resolution for web use on non-mobile platforms for the first time this week, despite manufacturers having standardised on widescreen displays for many years.

51 Comments

3lusive 13th April 2012, 11:27 Quote
The sooner 2560x1600 displays become cheaper, the quicker I'll be buying two of them. It's a joke how they are so expensive when monitors (and thus screen resolution) are such an integral part of your computer experience.

And 4K displays just make me wet, but so will the price haha!
r3loaded 13th April 2012, 11:34 Quote
Hurray! I'm absolutely sick to death of seeing 1366x768 on laptops everywhere. It's about acceptable on an 11 inch laptop, but not on a 15 inch one.

While we're at it, let's also push for IPS displays all around too. £1000 laptops should have better displays than £500 iPads.
fdbh96 13th April 2012, 11:42 Quote
This is good, but until integrated graphics improve, I won't be buying an ultrabook with a resolution higher than HD.
r3loaded 13th April 2012, 11:44 Quote
The Intel HD 4000 coming with Ivy Bridge is supposed to be a massive improvement on the current generation, with even bigger improvements planned for Haswell. I think Intel are already well-prepared for the high DPI revolution. Besides, Nvidia and AMD also stand to benefit strongly from this.
3lusive 13th April 2012, 12:06 Quote
Quote:
Originally Posted by fdbh96
This is good, but until integrated graphics improve, I won't be buying an ultrabook with a resolution higher than HD.

True but could you not just window it and run it at a smaller res? I don't game on a laptop, and never will, so I couldn't care less about gpu grunt in them and just want them coming with larger resolutions. If I did want to game on them though, I wouldn't be using integrated graphics.
jb0 13th April 2012, 12:07 Quote
About frickin' time. I'm sick of all these "HD" displays crowding out anything with a resolution I couldn't match in NINETEEN-NINETY-EIGHT.
1920*1080 is only a little bigger than 1600*1200, and I was running that on a smaller screen.
Parge 13th April 2012, 12:19 Quote
Well we can forget about running games in native resolution on lappys then.
Madness_3d 13th April 2012, 13:08 Quote
Yay, just as we reach a sweet spot where it's practical to run modern games at decent settings on inexpensive notebook hardware. You can run anything, and run it well, on a GT 540M @ 1366x768. That's not gonna happen with that res quadrupled. Need to get a message out to ATI & Nvidia to get their act in gear and get some more cores and wider memory interfaces going...
Hustler 13th April 2012, 13:55 Quote
Quote:
Originally Posted by Madness_3d
Yay, just as we reach a sweet spot where it's practical to run modern games at decent settings on inexpensive notebook hardware.

Which is why they want a new standard....these companies always need a reason for the consumer to spend money upgrading.
ZeDestructor 13th April 2012, 14:24 Quote
Just 3 words: About f***ing time!

@r3loaded: IPS displays will have equal or higher resolutions than the run-of-the-mill parts, just like today. Prepare to pay for it, though.
sandys 13th April 2012, 14:52 Quote
Quote:
Originally Posted by Parge
Well we can forget about running games in native resolution on lappys then.

If it's a high enough resolution and the scaler hardware is good enough, it won't matter.
Star*Dagger 13th April 2012, 15:51 Quote
I have three 30-inchers; when these monitors come out, I will buy three of them at 30in (or whatever the largest size turns out to be).

Graphics cards will have to triple in power - this is good news!
Bauul 13th April 2012, 17:02 Quote
I imagine this will be the ceiling for some time. At 300ppi, you simply can't see the pixels any more unless you're nose-to-screen, so I can't see anyone bothering to produce anything higher than that for quite a while.

For that reason, I also don't think we'll see TVs go natively higher than Full HD for quite a while. The average punter can't see the individual pixels at even just 1080 from across a room.
3lusive 13th April 2012, 17:51 Quote
^I agree, and the size of 4K files will be huge (hundreds of GBs), so they'll have to figure out a realistic platform to push this standard. But I definitely can't wait for 4K displays to appear; it should push down the prices of 1600p and 1440p ones too.
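For a sense of where the 'hundreds of GBs' figure could come from, a constant-bitrate estimate is enough. The bitrates below are our assumptions (roughly Blu-ray-class for 1080p, naively quadrupled for 4K; real codecs would do considerably better):

```python
def stream_size_gb(bitrate_mbps, hours):
    """Constant-bitrate stream size in GB (decimal)."""
    return bitrate_mbps * hours * 3600 / 8 / 1000

print(stream_size_gb(40, 2))   # 1080p Blu-ray-class movie: 36.0 GB
print(stream_size_gb(160, 2))  # naive 4x bitrate for 4K:   144.0 GB
```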
Elton 13th April 2012, 20:53 Quote
About damn time. However, this will surely drive up demand for powerful GPUs...
LennyRhys 13th April 2012, 21:02 Quote
I honestly don't see the appeal of high res screens that have a small dot pitch - I deliberately didn't get the U2711 because the pixel pitch is just too small for my liking.

A photographer friend of mine recently bought an iPad 3 and says he's "underwhelmed" by the crazy resolution - the last thing you'd expect to hear from somebody who spends his life working with pixels. Frankly I think it's just another gimmick. Bigger screens with high res - now that's more like it. ;)
Elton 13th April 2012, 21:19 Quote
I'd just want a 2k screen for large HDTV.
schmidtbag 13th April 2012, 21:34 Quote
Anyone else find it ridiculous that it's always Apple that gets technology moving? Seriously, when was the last time a ridiculously popular computer-based phenomenon happened that Apple DIDN'T start?

These high-resolution screens could have been done a very long time ago. Tablets could have been accomplished even longer ago. The really stupid thing is that a lot of what Apple comes up with isn't even that cool or practical, but people can't help but get their panties wet because THEY made it.


I have no problem with higher-resolution monitors. I think screen resolutions today on higher-quality monitors are good enough, but I see no harm in increasing them.
r3loaded 13th April 2012, 21:51 Quote
Quote:
Originally Posted by Madness_3d
Yay, just as we reach a sweet spot where it's practical to run modern games at decent settings on inexpensive notebook hardware. You can run anything, and run it well, on a GT 540M @ 1366x768. That's not gonna happen with that res quadrupled. Need to get a message out to ATI & Nvidia to get their act in gear and get some more cores and wider memory interfaces going...
I'd say that Nvidia and AMD have been far ahead of the curve on graphics technology (partly due to console games and the software being far behind). Eyefinity and Nvidia Surround have demonstrated that their cards are easily capable of pushing 5760x1080 pixels, so it's not as far away as you think. Besides, you always have the option of gaming at a lower resolution or disabling AA as it's not needed at high resolutions.
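As a rough check on how those multi-monitor figures compare with the resolutions Intel is floating (our arithmetic, not r3loaded's):

```python
configs = {
    "5760x1080 (triple 1080p, Eyefinity/Surround)": 5760 * 1080,
    "2800x1800 (proposed 13in Ultrabook)":          2800 * 1800,
    "3840x2160 (proposed 15in laptop)":             3840 * 2160,
}
for name, px in configs.items():
    print(f"{name}: {px / 1e6:.1f} megapixels")
# -> 6.2, 5.0 and 8.3 MP: the 15in target actually exceeds a triple-1080p rig
```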
l3v1ck 13th April 2012, 22:58 Quote
I like the idea of a higher pixel density, but I don't really need more than my 1080p laptop.
It would be nice to get that 1080p in a smaller, lighter laptop than I currently have, though.

Of course, if something replaces Blu-ray with a higher-resolution standard, then I may well want that in the future.
yougotkicked 13th April 2012, 23:00 Quote
Hmm, I was just talking with my history of technology professor about how Intel has been pushing the HD trend to encourage consumers to buy more powerful computers.
Quote:
Originally Posted by schmidtbag
Anyone else find it ridiculous that it's always Apple that gets technology moving? Seriously, when was the last time a ridiculously popular computer-based phenomenon happened that Apple DIDN'T start?
Netbooks jump to mind; there were some precursors like the OLPC, but the actual marketing trend was unquestionably pioneered by ASUS. But I do agree: somehow Apple - without innovating any new hardware of its own - has become the driving force behind tech culture trends.
schmidtbag 13th April 2012, 23:08 Quote
Quote:
Originally Posted by yougotkicked
Hmm, I was just talking with my history of technology professor about how Intel has been pushing the HD trend to encourage consumers to buy more powerful computers.


Netbooks jump to mind; there were some precursors like the OLPC, but the actual marketing trend was unquestionably pioneered by ASUS. But I do agree: somehow Apple - without innovating any new hardware of its own - has become the driving force behind tech culture trends.

Well, obviously Intel would be pushing something like this, but since they don't actually make a complete product (just parts for products), it's kinda hard for them to get anywhere with their motives. That's like tire manufacturers wanting their customers to hit the brakes harder and peel out: you can't just get your customers to push their hardware to its limits.

As for netbooks, I didn't even think about them, but once again that's Apple's doing. ASUS may be the one that made netbooks popular, but Apple is the one that started the idea, with the original MacBook Air. The funny thing about the Air is that it was the first (AFAIK) that used Intel's Atom CPU, but then Intel was like "wait a minute, we made the processor, why does it have to be exclusive to Apple?" so they sold it to companies like ASUS, which sold a similar product to the Air for a much more reasonable price. And since then, the Air has been wildly unpopular because tablets and non-Apple netbooks are better value.
PlayLoud 13th April 2012, 23:14 Quote
I would love retina-class computer monitors, though I am more interested in the awesome black levels of OLED displays. Computer LCDs are horrible when it comes to black levels, and plasma technology can't reach high resolutions at such small screen sizes.
BurningFeetMan 14th April 2012, 02:06 Quote
With growing resolutions, Ctrl + will be my best friend. Time to start investing in monocles, methinks!
pingu666 14th April 2012, 05:30 Quote
Yay!
There will be challenges to work through, but Windows 8 probably has that covered if it's more resolution-independent.

Graphics cards might get massive for the Eyefinity stuff though... :o
CowBlazed 14th April 2012, 06:30 Quote
I doubt it, Intel. Manufacturers can barely be bothered to produce a decent 1080p notebook screen at any semi-decent price; there is no way we're getting even higher resolutions at the $1000 ultrabook level anytime soon without some SERIOUS compromise, such as no real GPU to back it.
fdbh96 14th April 2012, 08:34 Quote
Quote:
Originally Posted by CowBlazed
I doubt it, Intel. Manufacturers can barely be bothered to produce a decent 1080p notebook screen at any semi-decent price; there is no way we're getting even higher resolutions at the $1000 ultrabook level anytime soon without some SERIOUS compromise, such as no real GPU to back it.

If Apple put a 'retina display' on a MacBook Air then I doubt manufacturers would have any choice.
m0zes 14th April 2012, 10:49 Quote
Quote:
Originally Posted by schmidtbag
As for netbooks, I didn't even think about them, but once again that's Apple's doing. ASUS may be the one that made netbooks popular, but Apple is the one that started the idea, with the original MacBook Air.

The basic concept of a netbook had nothing to do with Apple; they originated from the OLPC (One Laptop per Child) initiative to provide low-cost systems to the third world.

All this talk of high-res displays is making me weak at the knees, but there is one major problem that needs to be addressed, particularly for desktop systems: display interface technologies. As it currently stands, DVI remains the de facto standard even though DisplayPort has been around for many years now. The problem is that DisplayPort doesn't provide that much more bandwidth than dual-link DVI. Right now, if a manufacturer pushed the interface to its limit, a screen would be limited to 2560x1600 @ 120Hz or 5120x3200 @ 60Hz. Therefore it's time for a new display interface technology, or a major revamp of DisplayPort, to allow for the bandwidths next-generation screens will require.
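The bandwidth ceiling here can be sanity-checked with the usual uncompressed-video formula: width x height x refresh x bits per pixel. A minimal sketch (raw payload only; real links also spend bandwidth on blanking intervals and 8b/10b line coding, so actual requirements come out higher):

```python
def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, excluding blanking and line-code overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(raw_gbps(2560, 1600, 60))   # ~5.9  - fits dual-link DVI (~8 Gbit/s payload)
print(raw_gbps(2560, 1600, 120))  # ~11.8 - needs DisplayPort 1.2 (17.28 Gbit/s)
print(raw_gbps(3840, 2160, 60))   # ~11.9 - likewise
```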
ZeDestructor 14th April 2012, 11:55 Quote
Quote:
Originally Posted by Parge
Well we can forget about running games in native resolution on lappys then.
Quote:
Originally Posted by sandys
If it's a high enough resolution and the scaler hardware is good enough, it won't matter.

Or you could, you know, let your high-end GPU handle scaling with a wee tiny bit of its rather extensive power. For the life of me I can't figure out how to fully disable scaling on my Radeon...
Quote:
Originally Posted by m0zes
The basic concept of a netbook had nothing to do with Apple; they originated from the OLPC (One Laptop per Child) initiative to provide low-cost systems to the third world.

All this talk of high-res displays is making me weak at the knees, but there is one major problem that needs to be addressed, particularly for desktop systems: display interface technologies. As it currently stands, DVI remains the de facto standard even though DisplayPort has been around for many years now. The problem is that DisplayPort doesn't provide that much more bandwidth than dual-link DVI. Right now, if a manufacturer pushed the interface to its limit, a screen would be limited to 2560x1600 @ 120Hz or 5120x3200 @ 60Hz. Therefore it's time for a new display interface technology, or a major revamp of DisplayPort, to allow for the bandwidths next-generation screens will require.

Just improve the transducers to go for higher bandwidth and there, job = done. From an architectural perspective, it's _that_ easy. Implementation-wise, we have Thunderbolt around with the necessary bandwidth. Apply the TBolt PHY layer with DP signalling and it's done.

Remember, CAT-7 cabling (now called Class F; 4 twisted pairs, 4 "lanes") will do 100Gbit/s over 100m (with 32/22nm chips, they say), so we most definitely have the base tech for it, especially considering DisplayPort does a "mere" 17.28Gbit/s over 4 lanes.
r3loaded 14th April 2012, 12:15 Quote
Quote:
Originally Posted by m0zes
Right now, if a manufacturer pushed the interface to its limit, a screen would be limited to 2560x1600 @ 120Hz or 5120x3200 @ 60Hz.
That's still plenty of headroom if all the display manufacturers do is double the pixel count of existing displays in both directions. Besides, DisplayPort can always be improved with faster signal clock speeds. It's very easy to boost speeds with a serial interface this way.
m0zes 14th April 2012, 13:29 Quote
It's just enough headroom for a doubling, but I'd suspect that there would also be a market for 300dpi screens in professional industries. Then you factor in higher refresh rates and larger screen sizes, and what headroom might have been there is very quickly gone. Whilst it may be easy to boost speeds with a serial interface, the same limitations always apply: you may get backwards compatibility with the input controller, but no forwards compatibility with previous-generation output controllers. Will it end up coming down to requiring a new DP revision every year to cater for ever-increasing pixel densities? It all just ends up being a compatibility nightmare.
schmidtbag 14th April 2012, 17:46 Quote
Quote:
Originally Posted by m0zes
The basic concept of a netbook had nothing to do with Apple; they originated from the OLPC (One Laptop per Child) initiative to provide low-cost systems to the third world.
Well, in that case you could say Apple didn't have anything to do with touchscreens, tablets, mp3 players, online music stores, computerized TVs, smartphones, and probably a lot more things I can't think of ATM. Apple doesn't invent anything; they're one of the least original companies out there. The only difference is Apple makes a solid product that for some weird reason becomes wildly popular due to a spokesperson in straight-leg jeans and a black turtleneck, and then other manufacturers decide at the last minute that this unoriginal product is suddenly a good idea because Apple says so.
ZeDestructor 14th April 2012, 19:48 Quote
Quote:
Originally Posted by m0zes
It's just enough headroom for a doubling, but I'd suspect that there would also be a market for 300dpi screens in professional industries. Then you factor in higher refresh rates and larger screen sizes, and what headroom might have been there is very quickly gone. Whilst it may be easy to boost speeds with a serial interface, the same limitations always apply: you may get backwards compatibility with the input controller, but no forwards compatibility with previous-generation output controllers. Will it end up coming down to requiring a new DP revision every year to cater for ever-increasing pixel densities? It all just ends up being a compatibility nightmare.

Posted this a little earlier (couple of edits as well for clarity's sake):
Quote:
Originally Posted by ZeDestructor
Just improve the transducers to go for higher bandwidth and there, job = done. From an architectural perspective, it's _that_ easy.

Implementation-wise, we have Thunderbolt around with the necessary bandwidth. Apply the TBolt PHY layer with DP signalling and it's done.

As of now, we have CAT-7 cabling (now called Class F; 4 twisted pairs, 4 "lanes") that will do 100Gbit/s over 100m (with 32/22nm chips, they say), so we most definitely have the base tech for it, especially considering DisplayPort (currently) does a "mere" 17.28Gbit/s over 4 lanes.

As a student in electrical engineering (and general tech enthusiast), I reckon I'm speaking sense here.
m0zes 14th April 2012, 23:10 Quote
Quote:
Originally Posted by schmidtbag
Well, in that case you could say Apple didn't have anything to do with touchscreens, tablets, mp3 players, online music stores, computerized TVs, smartphones.

Nice overreaction, only it was Steve Jobs himself who said that Apple would never produce a netbook. All technology is interrelated, and it's not hard to draw relations between the two. But the ultimate question for me is whether one would exist if the other hadn't, and in that respect, had the Air not existed, the netbook still would have. If the MacBook Air were a netbook then yes, it would be fair to say that Apple had a major role in popularizing the technology, but it isn't and was never meant to be a netbook; it was a slim, light notebook - it used the same hardware, it ran the same software and it was priced accordingly. The original netbook used customized low-power hardware, it ran different software and above all else was designed to be cheap. If you look at the PC equivalent of the Air, we now have Intel and other manufacturers pushing ultrabooks; they are the descendants of the Air, not netbooks.
Quote:
Originally Posted by ZeDestructor
As a student in electrical engineering (and general tech enthusiast), I reckon I'm speaking sense here.

Thanks ZeDestructor, I agree that the solution is already available, but I'm not worried about technology availability so much as whether it gets implemented. Dual-link DVI was always a part of the DVI standard, and it was a part of ATI's and Nvidia's reference card designs for many generations, but manufacturers only ever used single-link DVI outputs. In 2004 Apple released their first 30" 2560x1600 monitor, but due to the lack of dual-link DVI support, only several of their own cards and a small handful of professional-level cards were fully compatible. Dell released their first 30" around 18 months later, and these had to be timed accordingly with a new generation of dual-link DVI-enabled graphics cards so that they could be used at their full spec by the consumer market. Dell probably could have released theirs earlier, seeing as they used the exact same panel as Apple's, but it was pointless doing so without the compatible output hardware. If high-res screens are about to make a massive appearance then it's time we had a display interface with some serious bandwidth ready to go, otherwise development will be stifled all over again.
fdbh96 15th April 2012, 08:30 Quote
Quote:
Originally Posted by m0zes


Thanks ZeDestructor, I agree that the solution is already available, but I'm not worried about technology availability so much as whether it gets implemented. Dual-link DVI was always a part of the DVI standard, and it was a part of ATI's and Nvidia's reference card designs for many generations, but manufacturers only ever used single-link DVI outputs. In 2004 Apple released their first 30" 2560x1600 monitor, but due to the lack of dual-link DVI support, only several of their own cards and a small handful of professional-level cards were fully compatible. Dell released their first 30" around 18 months later, and these had to be timed accordingly with a new generation of dual-link DVI-enabled graphics cards so that they could be used at their full spec by the consumer market. Dell probably could have released theirs earlier, seeing as they used the exact same panel as Apple's, but it was pointless doing so without the compatible output hardware. If high-res screens are about to make a massive appearance then it's time we had a display interface with some serious bandwidth ready to go, otherwise development will be stifled all over again.

Surely we have Thunderbolt for this. Intel did just develop it (alongside Apple, of course).
Guinevere 15th April 2012, 20:06 Quote
Quote:
Originally Posted by schmidtbag
Apple doesn't invent anything, they're one of the least original companies out there. The only difference is apple makes a solid product that for some weird reason becomes wildly popular due to a spokesperson in straight-leg jeans and a black turtleneck, and then other manufacturers decide at last minute that this unoriginal product is suddenly a good idea because apple says so.

What's the opposite of fanboy?
Xir 15th April 2012, 20:09 Quote
Quote:
Originally Posted by LennyRhys
A photographer friend of mine recently bought an iPad 3 and says he's "underwhelmed" by the crazy resolution - the last thing you'd expect to hear from somebody who spends his life working with pixels. Frankly I think it's just another gimmick. Bigger screens with high res - now that's more like it. ;)

Yup, I don't see the use for more than "HD" on small devices.
The result is that you get interpolation when playing videos...

For big screens, sure, I understand. But do we really need more than 1920x1080 at 10"?
Remember, full-resolution videos weigh in at ~5-10GB per hour... try pulling that over wireless :D
TC93 16th April 2012, 02:21 Quote
And I totally disagree with Intel.

Good luck trying to play a game at those resolutions on a laptop. Changing the resolution to a lower one will just make it look awful on an LCD.

This will, of course, be a marketing gimmick again - something Intel would love, to sell more of their products.
m0zes 16th April 2012, 03:38 Quote
Quote:
Originally Posted by TC93
Good luck trying to play a game at those resolutions on a laptop. Changing the resolution to a lower one will just make it look awful on an LCD.

Sure, gaming at the native res is going to be a challenge, especially for the latest and greatest titles. However, there is an advantage to higher-resolution screens that needs to be considered. The reason outputting a non-native resolution looks considerably worse than native res is that, regardless of the output resolution, the display is always going to show its native resolution. Therefore there has to be a level of interpolation to 'guess' what the missing pixels would actually be.

Depending on the screen, this can be anywhere from reasonably good to downright terrible - blurry text etc. However, if you can reduce the resolution by a factor of 4 (halving it in each dimension), then what you get is perfect scaling: each pixel becomes larger and interpolation isn't required. If we were to do that right now with a 1080p screen, the end result is 960x540; this certainly isn't a res you'd want to game at, especially when 1080p gaming is possible.

Boost the native res to 2560x1600 and this scales down to 1280x800. That's a res that many a laptop currently ships with as its native resolution, and it will provide a nice trade-off between gaming and general application usage. Aliasing will be more problematic with such systems, but a higher level of anti-aliasing will be easier to deal with than 4x more pixels to render. Just to note, scaling down a current 30" 2560x1600 screen to 1280x800 yields excellent results; yes, everything is big, but the overall image quality is great. Shrink this down to a 13-15" screen and the results will be considerably better.
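The 'perfect scaling' described above is just pixel replication: render at half the native resolution in each dimension, then duplicate every pixel into a 2x2 block. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

def integer_upscale(frame, factor=2):
    """Replicate each pixel factor x factor times: no interpolation, no blur."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

rendered = np.zeros((800, 1280, 3), dtype=np.uint8)  # game rendered at 1280x800
on_panel = integer_upscale(rendered)                 # panel driven at 2560x1600
print(on_panel.shape)                                # (1600, 2560, 3)
```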
wuyanxu 16th April 2012, 09:19 Quote
I'm not sure - a 2560x monitor still costs over £500. I can't see £1000 laptops (already considered expensive) having that kind of screen.

Need to get IPS into those screens first!
BLC 16th April 2012, 11:48 Quote
At least for my own use, I still question the need for such massively high-spec screens in laptops. If I want a powerhouse with very large screen resolutions, I have a desktop PC. I'm really not interested in 1080p video on my laptop; if it's very hard for most people to distinguish between 720p and 1080p video on screens smaller than around 32-40", then you sure as hell aren't going to notice the difference on a screen that's no bigger than 15-20". Having the extra resolution might be nice, but the trade-off of needing more powerful hardware, and thus increased power usage, is too costly for a machine that is supposed to be portable. Besides, there aren't many tasks that I would want to achieve on my laptop that would require a massive resolution. Something higher than 1024x768 - what I currently have - would be nice, but you don't have to go mad with it.

But then, my opinion on the matter may be skewed somewhat; I already think that people (read: consumers) already buy machines far in excess of the spec that they actually need. The UberLaptop5000000 may be able to play games at full 1080p, render your 3D models in fractions of a second or speed up your photo editing by x% more than the competitor, but that doesn't make much difference if all you do with your crotch-boiling energy leech is couch surfing.
dr-strangelove 16th April 2012, 15:48 Quote
I feel more and more like an old man with my 1680x1050 screen
Gareth Halfacree 16th April 2012, 17:05 Quote
Quote:
Originally Posted by dr-strangelove
I feel more and more like an old man with my 1680x1050 screen
1920x1200 here, but only because my trusty old 1280x1024 started playing silly beggars. I still prefer 5:4 to 16:9 on the desktop, but I'll take widescreen as long as the vertical resolution is sensible.

My laptop, sadly, is 1366x768. It didn't feel cramped until I got the new monitor for the desktop...
ZeDestructor 18th April 2012, 11:21 Quote
Quote:
Originally Posted by wuyanxu
I'm not sure - a 2560x monitor still costs over £500. I can't see £1000 laptops (already considered expensive) having that kind of screen.

Need to get IPS into those screens first!

Tablets already have such dense IPS screens and cost less than the average high-end monitor DESPITE including what amounts to a full-blown computer, so as I said earlier, ABOUT F***ING TIME!

To everyone moaning about games: Scale down to exactly one quarter or half the resolution and you'll be fine. It'll be a bit blurry, but you'll play your games.
PingCrosby 18th April 2012, 19:28 Quote
Well, if you're gonna have an explosion you might as well have it in high resolution.
fdbh96 18th April 2012, 21:56 Quote
Quote:
Originally Posted by BLC
At least for my own use, I still question the need for such massively-high spec screens in laptops. If I want a powerhouse with very large screen resolutions, I have a desktop PC. I'm really not interested in 1080p video on my laptop; if it's very hard for most people to distinguish between 720p and 1080p video on screens smaller than around 32-40", then you sure as hell aren't going to notice the difference on a screen that's no bigger than 15-20".

I think 1080p is the perfect res for a laptop, after having tried a Dell 15z with one. There are just so many more pixels to play with, and everything just looks so much clearer.
yougotkicked 18th April 2012, 22:36 Quote
Ack! forgot about this thread and missed a conversation about a topic I love

let's try and squeeze in a word or two before this qualifies as necro;

I have actually written a detailed research paper on the development of the netbook so I really know my stuff on this one;

The MacBook Air is an important step in the evolution of the netbook - it gave the concept of an ultraportable system a lot of publicity and interest - but ASUS had coined the term 'netbook' and released the Eee PC 700 by October 2007. The Air was publicly announced in early 2008 during one of Steve Jobs' keynote addresses.

I think it's important to define what a netbook is when discussing their origins; many people will call any small-form-factor laptop a netbook, and if you go that direction there are precursors as early as 1996 with the Toshiba Libretto. I like to define the netbook as a low-cost, low-power, ruggedly built laptop with a sub-12" display. I say this because that is how the netbook started out: as a cheap, small and durable system meant for developing markets (not just the OLPC - ASUS had planned to market the Eee 700 in developing markets as well, and it launched in Taiwan first).

Based on my research it's hard to attribute the netbook to anyone but ASUS. The OLPC is important, but it really had far less impact than it's given credit for. The MacBook Air helped fuel the market by showing consumers something cool but expensive, making the super-cheap ASUS models look all the more appealing.


</history lecture>


with that said and done;

Monitors, higher resolution, displayport, intel, computer stuff. grr.
BLC 19th April 2012, 08:58 Quote
Quote:
Originally Posted by fdbh96
I think 1080p is the perfect res for a laptop, after having tried a Dell 15z with one. There are just so many more pixels to play with, and everything just looks so much clearer.

But are there really that many situations where you absolutely have to have that extra desktop space and you don't have access to an external monitor? My 5-year old ThinkPad can drive a 1080p (and probably higher) display just fine. I do quite a bit of work on my laptop and I don't find the 1024x768 res restrictive in that many cases; when I do need the extra space I can just plug in an external monitor or switch to a desktop machine.

Sure it's nice to have all that extra space - I'm not denying that - but do you really need it? Is it really worth the additional expense (and knock-on impact on battery drain)?
ZeDestructor 19th April 2012, 10:25 Quote
Quote:
Originally Posted by BLC
But are there really that many situations where you absolutely have to have that extra desktop space and you don't have access to an external monitor? My 5-year old ThinkPad can drive a 1080p (and probably higher) display just fine. I do quite a bit of work on my laptop and I don't find the 1024x768 res restrictive in that many cases; when I do need the extra space I can just plug in an external monitor or switch to a desktop machine.

Sure it's nice to have all that extra space - I'm not denying that - but do you really need it? Is it really worth the additional expense (and knock-on impact on battery drain)?

Some of us don't have the ability to use a high-res screen all the time. I use my laptop to read notes during class at uni (and often take notes as well), so being able to split a big, high-resolution screen is far nicer than constantly alt+tabbing on a low-resolution screen. In fact, when I am at home, I use a dual-screen layout, with documentation/manual/notes/textbook/book on one screen and actual work/source code on the main screen. When gaming/browsing/having fun, I keep temperature monitors and IRC chat on the secondary monitor so I can easily glance back and forth.

For me, I really need a big high-resolution screen since I spend a LOT of time reading, and it's nice to have when I'm doing more relaxed tasks.

In terms of actual manufacturing costs, increasing the resolution isn't all that expensive since all the R&D has been done already. Just because Dell and co. put a high markup on it doesn't mean it's all that much more expensive at the base.

Finally, bigger, higher-resolution screens have practically NO effect AT ALL on battery life for these reasons:

1. The lamp is almost always exactly the same (if not more efficient on the higher-end models) and the panel draws a minimal amount of power to do the actual rendering, so no difference there.
2. During general 2D/Aero usage there is no practical difference since the GPU won't even bother ramping clocks and voltage up (yes, I have swapped a lot of resolutions and the GPU just doesn't care as the actual GPU load remains at 2-3%) and during 3D usage, the GPU will be maxed as a matter of design regardless of resolution.

So really, it's just a matter of manufacturers hurrying up deployment already.
schmidtbag 19th April 2012, 15:59 Quote
Quote:
Originally Posted by BLC
But are there really that many situations where you absolutely have to have that extra desktop space and you don't have access to an external monitor? My 5-year old ThinkPad can drive a 1080p (and probably higher) display just fine. I do quite a bit of work on my laptop and I don't find the 1024x768 res restrictive in that many cases; when I do need the extra space I can just plug in an external monitor or switch to a desktop machine.

Sure it's nice to have all that extra space - I'm not denying that - but do you really need it? Is it really worth the additional expense (and knock-on impact on battery drain)?

I'd say it's more than just a matter of space. When you use a resolution as low as 1024x768, then yes, some people would care about the space difference. But for many people who use 1680x1050 or higher, space isn't a problem anymore - it's image quality. Once you start reaching resolutions near the 2000s (in width), default fonts become hard to read and everything looks tiny, so you need to re-size everything to look larger.

Currently, Windows is the only modern OS that has no way to compensate for limited space. All programs must be crammed into the same taskbar and desktop unless you have 2+ monitors, which is just an added expense, whereas Linux, Mac, FreeBSD and others offer multiple workspaces/desktops that you can switch between on one monitor. This is an immensely useful feature to me.
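To illustrate the point about shrinking fonts: a UI element sized in raw pixels occupies fewer physical inches as the panel's DPI rises, which is exactly why high-DPI displays need OS-level scaling. A small sketch (the 12px figure is the traditional 9pt-at-96-DPI default, an assumption on our part):

```python
def physical_height_in(pixels, panel_dpi):
    """Physical height of a pixel-sized UI element on a given panel."""
    return pixels / panel_dpi

# A 12px font (9pt under the usual 96 DPI assumption: px = pt * dpi / 72)
for dpi in (96, 150, 250):
    print(f"{dpi} DPI panel: {physical_height_in(12, dpi):.3f} inches tall")
# -> 0.125, 0.080 and 0.048 inches: the same UI shrinks to ~40% of its size
```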
ZeDestructor 20th April 2012, 09:21 Quote
Quote:
Originally Posted by schmidtbag
I'd say it's more than just a matter of space. When you use a resolution as low as 1024x768, then yes, some people would care about the space difference. But for many people who use 1680x1050 or higher, space isn't a problem anymore - it's image quality. Once you start reaching resolutions near the 2000s (in width), default fonts become hard to read and everything looks tiny, so you need to re-size everything to look larger.

Currently, Windows is the only modern OS that has no way to compensate for limited space. All programs must be crammed into the same taskbar and desktop unless you have 2+ monitors, which is just an added expense, whereas Linux, Mac, FreeBSD and others offer multiple workspaces/desktops that you can switch between on one monitor. This is an immensely useful feature to me.

As someone who uses 8pt fonts (limited by density) regardless of resolution for high-density text (yes, even on a 2560 panel - hell, I use 9px-high text on my phone, which is even smaller at normal viewing distances), I have to disagree on your first point.

In addition, IQ is gradually becoming a focus. Dell, for one, offers a full-blown IPS panel in their Precision laptops, so it's far from difficult to make. On top of that, the cheaper 1080p TN panels in the XPS laptops manage some pretty good colour accuracy despite being shitty TN, so again, RAISE THE BAR ALREADY!

On the subject of multiple workspaces, I have to agree: Windows is hopeless there, although KDE intends to port its whole DE over to Windows at some point, which means other major DEs like GNOME and Unity might well make it across.