bit-tech.net

BrightSide DR37-P HDR display

Comments 76 to 100 of 135

yodasarmpit 6th October 2005, 16:17 Quote
WOW



But what happens if one, two or 50 of the LEDs die?
Not only will you need to worry about dead pixels, but also dead LEDs, and unlike replacing a cold cathode this would be a tricky operation.
CyberSol 6th October 2005, 16:49 Quote
Meanmotion 6th October 2005, 17:13 Quote
Quote:
Originally Posted by yodasarmpit

But what happens if one, two or 50 of the LEDs die?
Not only will you need to worry about dead pixels, but also dead LEDs, and unlike replacing a cold cathode this would be a tricky operation.


Why would it be any more difficult? OK, a bit of soldering might be required, but it's still just replacing an LED. As for dead pixels, well, this technology would go some way to reducing their effect, as the backlight would at least be at an appropriate brightness behind the dead pixel.
hitman012 6th October 2005, 17:41 Quote
I think this is something BrightSide would have thought of, implementing some system where the LEDs can be fairly easily replaced. Besides, LED technology will mature greatly in the next few years, and as we won't be seeing this display for a while, they will be able to take full advantage of it.
The_Pope 6th October 2005, 17:47 Quote
Don't panic guys - I have anticipated this question, and I have lots of material from their CTO.

Unfortunately, it's 2:52am and I've just finished today's Modding Worklog. Don't panic - BrightSide have thought of everything. I'll fill you in on the details after some sleep.
teme_l 9th October 2005, 00:33 Quote
Nice to see someone make a working piece with that LED backlight technology.

One thing about the article: after reading it I felt like BrightSide had come up with some ubercool invention, but I'm quite sure BrightSide was just the first company to patent the technology (come on, even I had thought of it before I heard such a technology had been invented, and I'm just a stupid student)... so what I'm saying is that the article made a bit too big a deal of BrightSide as a company, and it got annoying towards the end (I don't have anything against BrightSide ;) and IMHO it's great to see a working one, and all respect to BrightSide for that).


EDIT: that LotR video really showed the potential of the technology, but what I noticed was that whites occasionally leaked in from above and below when there was something bright near those black bars, which suggests that eventually 45x31 LEDs aren't enough (I don't think it was the camcorder's fault; in most cases it was, but there are a few where I'm not so sure, because the light didn't leak anywhere else at that point, and a camera artefact should show up as a round bloom on the video, whereas I could occasionally spot a half circle). Look at the scenes where there's fire near those black bars... although I'm sure that in future the number of LEDs can be multiplied many times and that effect can be eliminated.
Meanmotion 9th October 2005, 11:58 Quote
Quote:
Originally Posted by teme_l
Nice to see someone make a working piece with that LED backlight technology.

One thing about the article: after reading it I felt like BrightSide had come up with some ubercool invention, but I'm quite sure BrightSide was just the first company to patent the technology (come on, even I had thought of it before I heard such a technology had been invented, and I'm just a stupid student)... so what I'm saying is that the article made a bit too big a deal of BrightSide as a company, and it got annoying towards the end (I don't have anything against BrightSide ;) and IMHO it's great to see a working one, and all respect to BrightSide for that).

You obviously don't know much about how patents work. You only have to come up with an idea to get it patented, so by that measure they were the first to think of it. I know what you mean, it seems so obvious now, but you only think you've thought of it before.
Quote:
Originally Posted by teme_l

EDIT: that LotR video really showed the potential of the technology, but what I noticed was that whites occasionally leaked in from above and below when there was something bright near those black bars, which suggests that eventually 45x31 LEDs aren't enough (I don't think it was the camcorder's fault; in most cases it was, but there are a few where I'm not so sure, because the light didn't leak anywhere else at that point, and a camera artefact should show up as a round bloom on the video, whereas I could occasionally spot a half circle). Look at the scenes where there's fire near those black bars... although I'm sure that in future the number of LEDs can be multiplied many times and that effect can be eliminated.

You're missing the point. With an identical LCD with conventional backlighting, the problems you describe would still occur, but you don't notice them so much because the darks aren't dark enough. Also, the technology is in its infancy, so whatever perceived problems there are will take time to be resolved.
Tim S 9th October 2005, 12:07 Quote
When I saw the display in London the other day, there were occasional honeycomb issues, but BrightSide assured us that the problem is firmware-related (the firmware being the software that controls the LED brightness inside the panel). We were told that the display we saw had firmware that was over a month old.

FWIW, the display(s) they have in their demo room are working honeycomb-free with the latest firmware.
JADS 9th October 2005, 12:40 Quote
It looks impressive, but I will wait till I see a proper live demo of one before making any decisions! Are they previewing the tech anywhere :)?

This is one technology that has massive potential to improve in the future based on LED development, which should make it interesting to watch over the next few years :) If it is capable of that with 1,400 LEDs, then how about 140,000 LEDs?

For the record, my Mitsubishi CRT produces a better image with regard to contrast than my Dell TFT. For example, in the TV program Battlestar Galactica the space scenes will be grey and blocky on my TFT, whereas they'll be a deep, uniform black on my CRT. Give it a good colourful widescreen image, though, and the TFT will have you drooling on the floor :)
DTH 9th October 2005, 18:49 Quote
Great article and a very interesting product!

Still trying to fully get my mind around this and relate it to something, hence the following questions, which mainly relate to film post-production and home cinema:

1. What is the effective black value in a lit environment? I realise that this depends on the amount of light, but say, e.g., reasonably dim at around 30 lux?

2. How does the black value relate to the black value of a movie theatre? SMPTE specifies brightness of 41 - 72 cd/m2, but I haven't found any reference to the typical (or specified) black value of a cinema.

3. It is mentioned in the article that the human eye has a dynamic range of 5 decades (1:100,000). Is this absolutely simultaneous, or does it include iris adjustments? I.e. I am wondering if it is possible to take in the full range at once, or if it takes a few seconds of concentrating on different areas. A second, somewhat related question is whether the eye's dynamic range goes as low as the BrightSide's in "normal" viewing conditions.

4. What is the typical (and/or maximal) dynamic range of a current movie? My understanding is that a film-based frame (and similarly a CCD/CMOS sensor) can capture approx. 7 f-stops (a 1:128 ratio) of dynamic range with discernible detail. I furthermore understand that the best print films (like Kodak's 2393) have a practical range of up to 4 decades (1:10,000) of dynamic range. a) How does that compute (if it is correct)? Do darker scenes occupy a lower part of the print film's total range and brighter scenes a correspondingly higher part? b) Assuming 1:10,000 is correct for the best film prints, (how) does one benefit from a greater range on the display?

5. I have seen impressive examples of HDR images produced with the HDR function in Photoshop CS2 with up to 14 f-stops of dynamic range (1:16,000). This is done by processing multiple bracketed exposures into one image. Will such an image display differently on a BrightSide display? (Or does this particular function rather take 14 f-stops of reality and compress them into a "viewable range" for standard displays and print?)
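
(As a quick sanity check on the f-stop arithmetic in questions 4 and 5, here is a minimal sketch, assuming each f-stop is simply a doubling of luminance and a "decade" is a factor of ten:)

import math

def stops_to_ratio(stops):
    # each f-stop doubles the luminance, so n stops span a 2**n : 1 range
    return 2 ** stops

def ratio_to_decades(ratio):
    # "decades" are factors of ten, i.e. log10 of the contrast ratio
    return math.log10(ratio)

for stops in (7, 14):
    r = stops_to_ratio(stops)
    print(f"{stops} stops ~ {r:,.0f}:1 ({ratio_to_decades(r):.1f} decades)")

# 7 stops  ~ 128:1     (2.1 decades)
# 14 stops ~ 16,384:1  (4.2 decades)
# 5 decades (100,000:1) works out to just under 17 stops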

With kind regards and lots of curiosity
DarkReaper 9th October 2005, 23:44 Quote
Woah, that post confused me more than the article :S

As far as your first question goes, surely the effective black value is still zero? If there's nothing lit up then the screen will be dead black, or just reflect as much light as an LCD that isn't turned on?

Q3 - Your eye does take a few moments to adjust to different light levels, which is why we actually experience glare in real life - like when we are forced to leave the PC and venture into the 'real world', the overwhelming brightness when we open the door is nearly as realistic as the Lost Coast blooming effect ;)
dream caster 10th October 2005, 01:44 Quote
Quote:
Originally Posted by teme_l
Nice to see someone make a working piece with that LED backlight technology.

One thing about the article: after reading it I felt like BrightSide had come up with some ubercool invention, but I'm quite sure BrightSide was just the first company to patent the technology (come on, even I had thought of it before I heard such a technology had been invented, and I'm just a stupid student)... so what I'm saying is that the article made a bit too big a deal of BrightSide as a company, and it got annoying towards the end (I don't have anything against BrightSide ;) and IMHO it's great to see a working one, and all respect to BrightSide for that).

. . .

The record of patent offices, especially in the U.S., is not very bright right now; there have been many patents that were unjustly issued.

Having too wide a patent, like one covering any means of dynamically adjusting the backlight, seems a little too much to me; that is just patenting a concept without any specification of how you are going to make it come true. It looks doubtful to me.

Intellectual property is not like owning a material object; you build upon all the knowledge and experience mankind has been able to accumulate and you add a little to it. There has to be a balance between what you receive from that repository and your "intellectual property", and that property should not block contributions from other people.
dream caster 10th October 2005, 01:56 Quote
Quote:
Originally Posted by JADS
It looks impressive, but I will wait till I see a proper live demo of one before making any decisions! Are they previewing the tech anywhere :)?

. . .

For the record, my Mitsubishi CRT produces a better image with regard to contrast than my Dell TFT. For example, in the TV program Battlestar Galactica the space scenes will be grey and blocky on my TFT, whereas they'll be a deep, uniform black on my CRT. Give it a good colourful widescreen image, though, and the TFT will have you drooling on the floor :)

CRTs do not produce any light behind black pixels and do have better contrast than LCDs.
The_Pope 10th October 2005, 05:08 Quote
OK guys, as promised, here is the quote from BrightSide's CTO about LED life and what happens if one fails.

Quote:
First, the notion of "lifespan" is defined differently for an LED than for many other electronics components. Usually components "live" and then "die" after some time, a binary state change. LEDs virtually never "die" if the electrical design is working (if it isn't then they will die right away). Instead they slowly decay in brightness over time. This decay time is dependent on a lot of factors but the big one is heat. A common spec in the LED industry is a decay to 75% of peak brightness after 50000 hours of running time at full rated current in moderate ambient temperature.

There are a lot of important parts to the sentence above with respect to the HDR display:

A) "at full rated current" - the HDR display on average drives the LED at around 25% of its full current because the LED vary all the time according to the image and it is very rare that a portion of the screen stays at the full 3000cd/m2 peak brightness for days (or even a minutes for that matter). This increases the 50000 hrs statement significantly because the lifetime loss is related to high temperature so that there is no linear relationship between power, time and lifetime. In other words, an LED driven at 1/2 rated current won't decay to 75% peak brightness in 100000 hrs (2x 50k) but last much much longer (~200k hrs roughly).

In fact, at some point the current becomes low enough that the lifetime becomes virtually endless, simply because the current is never enough to cause a thermal challenge for the LED and cooling system. LED users and manufacturers have studied this in great depth, and the net result for the HDR display is a significant increase over the 50k-hour value.

B) "75% of peak brightness" - like I said before, this is just a reduction of the light output, not a catastrophic failure. We have a patent pending approach to include optical sensors and wave guides in the display to monitor the LED over the lifespan of the display and adjust their driving level according to any drift. So of an LED decays by 10% the device simply raises the drive current by 10% (or whatever value is necessary to restore it to the original brightness).

Of course, ultimately this leads to accelerated decay, but given the very long time it takes to get to 75% this is a fairly negligible concern, especially since we start the display with LEDs at a "peak brightness" about 25% lower than their actual peak brightness (this, by the way, accounts for the confusion about 4k and 3k cd/m2 luminance in some of the specs out there). That means the first 25% of decay is completely compensatable without accelerated decay, and by the time that "overhead" is used up, the lifetime will already be very, very long.

C) The above two items take care of LED decay over time. Should an LED for any reason whatsoever encounter catastrophic failure (mechanical shock, etc.), then we have another patent-pending solution. Here, the same sensors as above report the failure, and the software inside the display controller automatically adjusts its computation to overdrive the surrounding six LEDs to "spill" light into the "dark spot" left by the dead LED. The LCD image is then darkened appropriately on top of the bright ring caused by the six overdriven LEDs and lightened somewhat on top of the dark spot. In this fashion the system can compensate for a dead LED or even a whole line of dead LEDs.

At Siggraph 2005 we had one DR37 at the booth with an entire cluster of LEDs switched off (~60 LEDs in a strip three LEDs wide) and we asked people to identify the dead area. Almost nobody saw it, and that was for a big cluster. The chance of an LED failing is pretty small (see above). The chance of two LEDs failing is even smaller. The chance of two LEDs failing side by side on a grid of over a thousand LEDs is pretty darn small. The chance of 60 LEDs all failing, all in one area, is virtually nil (unless of course you knock that section out with a hammer...).
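
To make points B) and C) a little more concrete, here is a minimal sketch of that compensation logic. It is purely illustrative - the sensor model, neighbour lookup and gain values here are assumptions for illustration, not BrightSide's actual firmware:

def compensate(leds, neighbours, headroom=1.25):
    # leds: dict id -> {'target': desired cd/m2, 'measured': cd/m2 seen by the
    #   optical sensor, 'drive': current drive level (1.0 = nominal full current)}
    # neighbours: dict id -> list of the (up to six) adjacent LED ids
    for lid, led in leds.items():
        if led['measured'] <= 0.0:
            # point C: catastrophic failure - switch it off and overdrive the
            # surrounding ring to spill light into the dark spot (the LCD layer
            # would then be darkened over the ring and lightened over the spot)
            led['drive'] = 0.0
            for n in neighbours[lid]:
                leds[n]['drive'] = min(leds[n]['drive'] * 1.3, headroom)
        elif led['measured'] < led['target']:
            # point B: gradual decay - raise the drive just enough to restore
            # the original brightness, within the ~25% brightness overhead
            led['drive'] = min(led['drive'] * led['target'] / led['measured'],
                               headroom)
    return leds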

I'd like to hope that this answers your questions, and I'm sure it will, but I also bet there will be a bunch more :D
The_Pope 10th October 2005, 05:24 Quote
JADS: right now, the display is being demoed only to hardcore image-based companies like the ones listed here: http://www.bit-tech.net/hardware/2005/10/04/brightside_hdr_edr/9.html since they are the only ones able to afford to pay $49,000 for one of these current generation models. Once the technology has been licensed by the big display manufacturers, I'm sure they will demo it according to their schedule for mass market introduction.

For example, I've been told that CES 2006 (5-8 Jan, 2006) will be too soon. There are several other tradeshows throughout the year that we *might* see it at, and if they're not in-store for January 2007, y'all might see a commercial model at CES 2007 perhaps.

I don't know what their position is on increasing the number of LEDs dramatically - it's my understanding that this simply isn't necessary because of the advanced algorithms / image processing they do. That's ignoring the increased cost too.

CRT vs LCD: a standard CRT will give you decent blacks, but a quick spec check reveals a quite pathetic 95 cd/m2 "typical brightness". Any LCD will beat that figure several fold, and the HDR display is 10 times brighter again.


DTH: err, I'm going to have to defer to the BrightSide staff for the answers to those mate - they are the experts in all this.
DTH 10th October 2005, 07:06 Quote
Quote:
Originally Posted by DarkReaper
Woah, that post confused me more than the article :S

As far as your first question goes, surely the effective black value is still zero? If there's nothing lit up then the screen will be dead black, or just reflect as much light as an LCD that isn't turned on?

I think you partly answered it yourself. The screen will reflect some amount of light if it is in a lit environment. I am trying to get an understanding of how much this affects the effective black value of the BrightSide and thus its dynamic range and contrast.
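
A rough way to put numbers on it, assuming a roughly diffuse (Lambertian) screen surface and a guessed reflectance figure (both assumptions, not BrightSide specs): the reflected luminance is about ambient illuminance x reflectance / pi, and that adds to whatever the panel emits in its black state.

import math

def effective_black(ambient_lux, reflectance, native_black=0.0):
    # diffuse approximation: reflected luminance (cd/m2) = E (lux) * rho / pi
    return native_black + ambient_lux * reflectance / math.pi

def effective_contrast(peak, ambient_lux, reflectance, native_black=0.0):
    black = effective_black(ambient_lux, reflectance, native_black)
    return peak / black if black > 0 else float('inf')

# illustrative numbers only: 30 lux room, 1% screen reflectance, 3000 cd/m2 peak
print(effective_black(30, 0.01))          # ~0.1 cd/m2 of reflected light
print(effective_contrast(3000, 30, 0.01)) # ~31,000:1 effective in-room contrast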
Quote:
Originally Posted by DarkReaper
Q3 - Your eye does take a few moments to adjust to different light levels, which is why we actually experience glare in real life - like when we are forced to leave the PC and venture into the 'real world', the overwhelming brightness when we open the door is nearly as realistic as the Lost Coast blooming effect ;)

That is exactly what I'm trying to get my mind around. I.e. how much dynamic range is useful/desirable for film watching? Can we handle the full real dynamic range of, say, a backlit scene on a sunny summer day, or would one need a pair of shades when watching "uncompressed" HDR material on a true HDR display? B)

Cheers
Meanmotion 10th October 2005, 18:18 Quote
Quote:
Originally Posted by dream caster
The record of patent offices, especially in the U.S., is not very bright right now; there have been many patents that were unjustly issued.

Having too wide a patent, like one covering any means of dynamically adjusting the backlight, seems a little too much to me; that is just patenting a concept without any specification of how you are going to make it come true. It looks doubtful to me.

Intellectual property is not like owning a material object; you build upon all the knowledge and experience mankind has been able to accumulate and you add a little to it. There has to be a balance between what you receive from that repository and your "intellectual property", and that property should not block contributions from other people.

Yeah, it does seem odd that people can do that - and some people make a career out of abusing it. However, a patent lasts at most 20 years and everyone does it, so these things balance out. Also, if BrightSide (or any other small company with a new invention) weren't allowed to patent a fairly broad description of their innovation, they would be open to one of the big manufacturers coming up with a slightly different version and bringing it to the mass market much quicker than BrightSide could, totally undercutting the original innovators.
BGD 11th October 2005, 18:46 Quote
BrightSide is the same company that previously went under the name Sunnybrook, right? The Sunnybrook web site at least redirects to the BrightSide web site, so it seems to be the case. Anyway, I was a little surprised to see this HDR hype all over again when the technology has been known about since at least early 2004 and we still haven't seen a commercially available model...

Here's a Feb 2004 article about the then-Sunnybrook HDR prototype LCD display featuring a 40,000:1 contrast ratio. Also interesting to point out is the information that BenQ were working on the same thing. Are you saying that they won't be able to patent a similar LED modulation technique now that Sunnybrook/BrightSide have patented theirs? Note the optimistic availability date as well as the price estimate mentioned in the article; I guess that's part of the marketing of a product...

Getting the LED out

Sunnybrook Technologies, Vancouver, British Columbia, has teamed with researchers at the University of British Columbia and York University, also in Canada, on a backlight system that promises to increase an LCD panel’s dynamic range as high as 90,000:1 — more than 100 times greater than today’s best LCD panels.

The High Dynamic Range (HDR) system replaces the backlights in an LCD panel with an array of white or tri-colored LEDs. The result is an extremely bright, low-resolution LED display that sits behind the high-resolution LCD panel. Software balances the two displays’ output to cancel out byproducts such as blurring, creating a display that the developers say provides brighter, more realistic images.

HDR’s development group hopes to tap the widest possible market by leveraging existing equipment. “The HDR display is 100 percent plug-and-play-compatible,” says Helge Seetzen, director of the emissive display program at Sunnybrook Technologies. “It’s beneficial to use a floating-point graphic card to do fast, 16-bit rendering, but the display can be driven by any VGA signal.”

The biggest commercialization challenge is the classic one — cost. Seetzen estimates that when the technology is commercially available in late 2004, an HDR display will cost 15 percent to 20 percent more than a conventional LCD display. “As a result, Sunnybrook is currently focusing on high-end, niche-market applications such as medical imaging, film editing, and military command-and-control viewers,” Seetzen says.

How quickly HDR moves into the mainstream AV market depends partly on the underlying technologies. “LED prices are coming down rapidly every year — usually by a factor of two or three per year,” Seetzen says. “Moreover, the currently available bright LEDs are more than bright enough for the job. We expect that LED prices will come to a level where the HDR display is a good proposition for mainstream displays or TV in approximately two to three years.”

Backlight bonanza

Sunnybrook isn’t the only vendor exploring backlighting as a way to improve LCD displays. Spark Huang, associate vice president of BenQ’s network display business group, says BenQ is working on an LED backlight technology to increase the dynamic range of its LCD displays. Thermal issues, power consumption, and cost are the main challenges, Huang adds, but the technology could be available in 2005.

An additional challenge for any attempt to place a super-bright backlight behind an LCD display is that the light can actually photo-activate the transistors, creating unusual results. “One thing they sometimes find is a purple haze, where you’re turning the transistors on by photo-inducing current in the transistors that drive each picture element,” says Joel Pollack, vice president of the display business unit at Sharp Microelectronics of the Americas.

Another challenge is that today’s LEDs have a relatively short life span and no two LEDs age alike. “As a result, it won’t take long before that backlight will look highly non-uniform,” Pollack says.

Sharp favors different approaches for improving dynamic range, particularly ones that use more finesse than force. One example is dynamic Gamma correction, where the Gamma curve is tweaked with each frame of video. “In doing so, over time your eye looks at that as being a much bigger dynamic range,” Pollack says. “But it’s simply taking advantage of the temporal changes and fooling it by changing Gamma from frame to frame.”
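
For anyone wondering what "software balances the two displays' output" looks like in practice, here is a minimal sketch of the dual-modulation idea the article describes: a coarse LED layer multiplied by a fine LCD layer. The grid size, point-spread function and square-root split are assumptions for illustration, not Sunnybrook/BrightSide's actual algorithm.

import numpy as np
from scipy.ndimage import gaussian_filter

def split_hdr_frame(target, led_grid=(31, 45), psf_sigma=12.0):
    # target: linear-light luminance image in [0, 1] whose height and width are
    # exact multiples of the LED grid (an assumption that keeps the sketch short)
    h, w = target.shape
    gh, gw = led_grid
    bh, bw = h // gh, w // gw
    # 1. one drive value per LED: square root of the mean luminance in its block,
    #    so the LED and LCD layers share the dynamic range roughly equally
    blocks = target.reshape(gh, bh, gw, bw).mean(axis=(1, 3))
    led_drive = np.clip(np.sqrt(blocks), 0.0, 1.0)
    # 2. simulate the light field the LEDs actually produce behind the panel:
    #    upsample the grid and blur it with an assumed Gaussian point spread
    backlight = gaussian_filter(np.repeat(np.repeat(led_drive, bh, 0), bw, 1),
                                sigma=psf_sigma)
    # 3. the LCD layer divides out the blurry backlight so that the product of
    #    the two layers approximates the target image (the "balancing" step)
    lcd = np.clip(target / np.maximum(backlight, 1e-4), 0.0, 1.0)
    return led_drive, lcd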
The_Pope 12th October 2005, 01:49 Quote
Yes, Sunnybrook renamed themselves to BrightSide.

The prototype discussed is exactly that - a prototype, based on an 18.1" desktop LCD. It used the PCB photographed for my article.

The difference now is that they have moved to a more suitable 37" widescreen HDTV format, and while they may cost $49,000, they are a proper product. I'm not permitted to discuss the internals, but let's just say they're a big step beyond that original 18" model.

Many manufacturers are experimenting with LED backlighting, I believe spurred on by some sort of EU ban soon to come into effect that outlaws a toxic component of CCFLs. I can't remember exactly, but it's lead or something nasty.

I know for a fact that BrightSide hold patents on *dynamic* backlighting, whether it be LED-based, CCFL or "other". So at a guess, maybe BenQ will introduce static LED backlighting to their LCDs, and perhaps with the extra luminance of LEDs over CCFLs this will allow them to claim a greater contrast ratio. But nothing will have the same effect as what BrightSide can achieve, short of licensing their technology and adopting IMLED.

At a guess, maybe the news of that first prototype was used to help drum up some investor interest for the privately-funded BrightSide. But so what - that happens all the time in many other industries.

The only thing that matters is that the image quality is astounding, and only through consumer demand from geeks like us will big manufacturers take notice and bring this technology to displays in the future.

And for any conspiracy theorists out there: I do not hold BrightSide stock, I am not on their payroll, and I have no relationship with them beyond that of manufacturer and tech journalist.
Elroys 14th October 2005, 17:41 Quote
From what I can gather BrightSide increases the color reproduction and accuracy of an LCD tenfold, but does this mean the other LCD image quality problems go away?

What about the smearing of quick pans, juddering, slow response time, pixelation and all the other things that make LCD (and plasmas for that matter) look horrible?

That is why I still think that the SED technology is more exciting than this, because it directly addresses all those other issues.

The good thing about BrightSide, though, is that you can buy them now. Even if they are expensive concept designs, it will only take two years or so to reach mass-market price levels.
The_Pope 15th October 2005, 01:04 Quote
Welcome to the forums, Elroys.
Quote:
What about the smearing of quick pans, juddering, slow response time, pixelation and all the other things that make LCD (and plasmas for that matter) look horrible?

It sounds like you've got a keen eye there, mate. I'm not sure if the BrightSide image processing extends to addressing any of these - being a backlighting technology, it's likely to be as good (or bad) as the chosen LCD panel. However, I have to say I didn't notice any of the issues you mentioned on either the Westinghouse HDTV or the HDR panel. The former has a response time of 12ms, which seemed to do the trick, and being 1080p, there was no problem with 'pixelation'.

We'll certainly be keeping an eye on SED too, but from what I have heard so far, it's still fairly experimental, and initial manufacturing capacity is likely to be limited. Translation: poor availability and high prices. Compared with the humongous investment in regular LCD production, it is hard to see SED taking off at this point in time.

But that's what makes this technology lark so interesting: nobody can predict the future :)
Elroys 15th October 2005, 04:59 Quote
Nice one, Pope, I'm jealous; I wish I could get to see them in real life. Did you see it playing any movies, or just the HD demos?

So is it confirmed that BenQ is the "far east manufacturer" licensee, or is it somebody else? Would be good to see some BrightSide LCD computer monitors by next Christmas...
The_Pope 16th October 2005, 09:55 Quote
Just the Lord of the Rings you saw in the demo video.

The far east manufacturer has NOT been confirmed at this point. The only reason BenQ were mentioned was because they have publicly said they are / were working on LED backlighting. But so is pretty much every LCD manufacturer. However, that is just looking to replace CCFLs with LEDs - to move to HDR, we need them to license the IMLED technology.
Curmudgeonx 6th December 2005, 03:30 Quote
Ming.
hitman012 6th December 2005, 08:12 Quote
Thanks for that priceless nugget of information.