bit-tech.net

Dolby buys BrightSide Technologies

High Dynamic Range movies in your lounge closer to reality as Dolby buys BrightSide Technologies for $28 million

In a deal announced today, surround-sound expert Dolby Laboratories is to acquire Canadian start-up BrightSide Technologies for approximately $28 million USD.

If that name sounds familiar, it should. BrightSide is the talent behind the world's first true HDR LCD display, producing whiter-than-white whites and indeed blacker-than-black blacks. Contrast ratios move beyond the current 1000:1 marker to 200,000:1 and more.

To date, if you wanted a true HDR LCD display you were limited to ponying up $50,000 for a BrightSide DR37-P which we looked at back in October 2005. We were told then that discussions were taking place with "several major Far-Eastern manufacturers" but the suggestion of mainstream production units being two years away made us want to cry. We had seen the future and we wanted it now!

Fast forward 18 months and things have moved on apace. As recently reported by our sister site, TrustedReviews, Samsung has demoed what it calls locally-dimming LED backlights, which must be BrightSide's tech under license, since the Canadian company holds all the right patents for dynamic adjustment of backlighting.

The question remains though, why would audiophiles get involved in backlighting technology?

"Dolby has built its strong reputation and brand by delivering products and technologies that make the entertainment experience more realistic and immersive, and BrightSide's HDR image technology complements that strategy," said Bill Jasper, President and CEO at Dolby. "Acquiring BrightSide reflects our long-term focus on delivering innovative technology solutions to our licensees and their customers."

Having pioneered Dolby Stereo, Pro Logic, Dolby Digital and the latest, Dolby Digital EX, it is clear that 7.1 is more than enough for home-based surround sound. Where does a company like Dolby go from there? Financing cutting-edge image processing and backlighting technology, obviously!

Joking aside, we certainly wouldn't disagree with Bill Jasper's comments. With audio fidelity as good as done and dusted, the new frontier is visual fidelity and BrightSide's IMLED technology raises the bar so high, it's in orbit. The potential earnings from future royalties are staggering: anyone currently making LCD displays (HDTV and desktop displays alike) will want a piece of this action.

Discuss in the forums

29 Comments

DLoney 28th February 2007, 00:26 Quote
Quote:
white-than-white whites and indeed blacker-than-black blacks
I think your looking for "whiter"
The_Pope 28th February 2007, 00:27 Quote
Rule #1: don't proof news stories after 11pm Timmy! :P
Bluephoenix 28th February 2007, 00:28 Quote
Definitely filed under smart business strategy.

This will also be a bonus to consumers as Dolby has shown a history of successfully spreading technologies and also of a rapid reduction of prices for said technologies over short periods.
The_Pope 28th February 2007, 00:30 Quote
I wonder if it's related to the 3D Cinema stuff Dolby announced AGES ago: http://blog.wired.com/gadgets/2006/08/dolby_goes_3d.html?entry_id=1531278

BrightSide do have High Dynamic Range projector technology too...
EQC 28th February 2007, 00:32 Quote
Bluephoenix makes me happy. I certainly hope that Dolby successfully gets this tech on the market fast and cheap.
_DTM2000_ 28th February 2007, 01:03 Quote
I only thought about this a few days ago and wondered when this technology was going to hit the mainstream. This is a smart move by Dolby and I look forward to seeing what they can achieve with this acquisition. 28 million is a small price to pay for technology with this much potential.
speedfreek 28th February 2007, 01:23 Quote
Quote:
Originally Posted by EQC
Bluephoenix makes me happy. I certainly hope that Dolby successfully gets this tech on the market fast and cheap.
I was wondering what the hold up was, it would be nice to see this stuff in person. :D
Aankhen 28th February 2007, 01:49 Quote
Quote:
With audio fidelity as good as done and dusted
I'd say it isn't, at least until we get theater-quality sound from cell phones! :)
Constructacon 28th February 2007, 05:57 Quote
Quote:
Originally Posted by The_Pope

BrightSide do have High Dynamic Range projector technology too...
Sweet. When will we see the front page article on this?

Good on Dolby. All I can say is "bring it on".
The_Pope 28th February 2007, 09:39 Quote
There's a Whitepaper on how it works but we haven't seen a demo yet. I believe the results aren't quite as good as the LCD TV (how could it be?) BUT it still promises to be lightyears ahead of current projectors, which are notorious for having rubbish black-level performance
geek1017 28th February 2007, 09:48 Quote
All I can say is that I want one.

How long will it take for licensing and production?
Will we see these out by the Holiday season?

I would gladly exchange a major extremity for a true Hi-Def home theater with 7.1 surround sound and HDR projector.
blackerthanblack 28th February 2007, 10:07 Quote
Quote:
white-than-white whites and indeed blacker-than-black blacks


You namedropper you :D
mclean007 28th February 2007, 11:26 Quote
Ooooh, can't wait for my 1080p HDR set. Hopefully within 12 months now :D
The_Pope 28th February 2007, 15:33 Quote
I think within 12 months is perfectly reasonable, though at what pricepoint, I daren't guess. 1080p is already a reality, albeit a fairly pricey one. BrightSide's HDR IMLED technology has nothing to do with the LCD panel, so it can not only work with any existing 720p / 1080p panel, but might almost be considered a "drop in" solution.

Of course, there are the mechanics (and economics!) of the backlight itself but with Samsung already demoing what we believe to be the same tech then Time to Market and Pricepoint are really just a question of marketing strategy.

They're definitely going to appear at the high end first and then trickle down across the range. Eventually CCFL backlights will be a thing of the past and we might then see static LED backlight models and IMLED models....
JADS 28th February 2007, 16:28 Quote
Samsung certainly aren't going to release a display that requires water cooling for its backlight ;) That certainly can't be helping the release date!

I'd like a Samsung monitor with IMLED, WQUXGA, DisplayPort, HDMI, VGA, Component, and a high colour gamut. :)
The_Pope 28th February 2007, 16:31 Quote
Quote:
Originally Posted by JADS
Samsung certainly aren't going to release a display that requires water cooling for its backlight ;) That certainly can't be helping the release date!

The latest generation don't require watercooling. In the past 18 months there have been a lot of improvements in the efficiency of the super-bright white LEDs used. The latest generation are twice as bright / half the power kind of thing.

By the time this stuff goes mainstream, it shouldn't be any harder to cool than maybe a regular plasma screen (which often have active aircooling)
mclean007 28th February 2007, 16:54 Quote
Quote:
Originally Posted by JADS
I'd like a Samsung monitor with IMLED, WQUXGA, DisplayPort, HDMI, VGA, Component, and a high colour gamut. :)
Me too. And I'm not paying over £100. :D :)

Joking aside, white LEDs help broaden the colour gamut, so that is pretty much a given.

Why would you want VGA on that next-next-gen monitor? That's like wanting a PC that's compatible with a 9600 baud modem, isn't it? I'd just have a bank of HDMI and DisplayPort connectors, and maybe Component for XBox 360.

WQUXGA - had to wikipedia that one. 3840 x 2400! I guess you'll be wanting quad cryogenically cooled Geforce 9900 GTX cards to power it, then. :D Incidentally, I think you'll need more bandwidth than HDMI or DisplayPort can afford - the highest spec (1.3) only stretches to 10.2Gbps, and DisplayPort offers four 2.7Gbps channels (10.8 Gbps total). At a resolution of 3840 x 2400 x 24 bpp, that only gives you enough for 46fps, not even allowing for audio bandwidth and overheads. Not good. Dual-link HDMI, anyone? Add to that the fact you'll want more colour depth for HDR (say 36 bpp, maybe 48), and you're reducing that fps still further. I'd guess you're also pushing the limits of component and VGA, though I don't really understand analogue video well enough to say for sure.

Maybe it will be out in time to play Duke Nukem Forever.
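Those link-budget figures are easy to sanity-check. A minimal sketch in Python, assuming raw uncompressed RGB frames and ignoring blanking intervals, audio and protocol overhead (real links carry all three, so achievable frame rates are lower than this):

```python
# Maximum frame rate a link can carry at a given resolution and colour depth,
# assuming raw uncompressed RGB with no blanking, audio or protocol overhead.
def max_fps(link_gbps, width, height, bpp):
    bits_per_frame = width * height * bpp
    return link_gbps * 1e9 / bits_per_frame

# HDMI 1.3 video bandwidth (10.2 Gbit/s) vs DisplayPort (4 lanes x 2.7 Gbit/s)
print(f"HDMI 1.3:    {max_fps(10.2, 3840, 2400, 24):.1f} fps")
print(f"DisplayPort: {max_fps(10.8, 3840, 2400, 24):.1f} fps")
```

At 36 bpp the same maths drops HDMI 1.3 to roughly 31 fps, which is the "reducing that fps still further" point above.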
mclean007 28th February 2007, 17:01 Quote
Quote:
Originally Posted by The_Pope
I think within 12 months is perfectly reasonable, though at what pricepoint, I daren't guess. 1080p is already a reality, albeit a fairly pricey one.

Sweet. 1080p is getting quite reasonable now - you can get a 40" 1080p Sony 40W2000 for not much over £1200, which is quite amazing. I don't think I'll bother with an LCD until I can get one in that range. The whole idea of 1366 x 768 confuses me - surely that means that all HD content (720p / 1080i / 1080p) has to be scaled to get a full frame image? Don't understand the point. Why is that, and not 1280 x 720, the standard res for a 720p LCD?
Quote:
BrightSide's HDR IMLED technology has nothing to do with the LCD panel, so it can not only work with any existing 720p / 1080p panel, but might almost be considered a "drop in" solution.
In technical terms related to the actual panel, true, though of course there are complications with the display driver circuitry, because there's an interrelationship between what the panel does and what the backlight does, to get an even, consistent image across the screen. I guess it needs more horsepower to do this at 1080p than (say) 720p. That said, given all the signal processing / image enhancement features modern LCDs already sport, I doubt it will be too taxing.
EQC 28th February 2007, 20:35 Quote
Quote:
Originally Posted by mclean007
The whole idea of 1366 x 768 confuses me - surely that means that all HD content (720p / 1080i / 1080p) has to be scaled to get a full frame image? Don't understand the point. Why is that, and not 1280 x 720, the standard res for a 720p LCD?


I've been wondering the same thing for years. Interestingly, I can remember when there were lots of 1280x720 TVs. Then, suddenly, there were a few 1280x768 TVs -- I assumed that the 16:10 ratio happened to be friendlier for PC inputs, given that "768" and "1280" were already portions of standard computer resolutions (1024x768 and 1280x1024). Then, soon after that, they jumped up to 1366x768 -- I kind of assume that was to get rid of the "black bars" that a 16:10 TV uses to display a 16:9 image, because 1366x768 is back to a 16:9 ratio.

Yeah...I've never really had a chance to compare image quality on a "768p" set vs a 720p set, but I can't imagine the scaling is very accurate, given the strange ratio. 720p and 1080i/p scale with a factor of 1.5, which shouldn't be too bad....but the ratio between 768 and 720 is 1.066666, which seems like it's going to show more artifacts.

I'm kind of thinking nowadays that perhaps "Average Joe" thinks more pixels is always better....so a 1366x768 set will always sell better than a 1280x720 set sitting next to it. So, manufacturers can't just go back to the format that makes sense without risking sales.
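The scaling ratios under discussion can be made concrete with a quick sketch (simple vertical line-count ratios only, ignoring whatever filtering the set's scaler actually applies):

```python
# Vertical scaling factor from broadcast resolution to panel resolution.
def scale_factor(source_lines, panel_lines):
    return panel_lines / source_lines

print(scale_factor(720, 1080))  # clean 1.5x ratio, 720 -> 1080 lines
print(scale_factor(720, 768))   # awkward ~1.067x ratio on a 768-line panel
print(1366 / 768)               # ~1.779, effectively back to 16:9 (~1.778)
```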
JADS 1st March 2007, 00:22 Quote
Quote:
Originally Posted by mclean007
Me too. And I'm not paying over £100. :D :)

I'm quite willing to pay for it :)
Quote:
Originally Posted by mclean007
Joking aside, white LEDs help broaden the colour gamut, so that is pretty much a given.

This is a good thing.
Quote:
Originally Posted by mclean007
Why would you want VGA on that next-next-gen monitor? I'd just have a bank of HDMI and DisplayPort connectors, and maybe Component for XBox 360.

Interestingly enough for the XBox 360... You'd go for component, I'd plump for VGA or ideally HDMI if they only offered a damn HDMI cable ;) Twin HDMI and no VGA if you can get the XBox 360 with HDMI.
Quote:
Originally Posted by mclean007
WQUXGA - had to wikipedia that one. 3840 x 2400! I guess you'll be wanting quad cryogenically cooled Geforce 9900 GTX cards to power it, then. :D Incidentally, I think you'll need more bandwidth than HDMI or DisplayPort can afford - the highest spec (1.3) only stretches to 10.2Gbps, and DisplayPort offers four 2.7Gbps channels (10.8 Gbps total). At a resolution of 3840 x 2400 x 24 bpp, that only gives you enough for 46fps, not even allowing for audio bandwidth and overheads. Not good. Dual-link HDMI, anyone? Add to that the fact you'll want more colour depth for HDR (say 36 bpp, maybe 48), and you're reducing that fps still further. I'd guess you're also pushing the limits of component and VGA, though I don't really understand analogue video well enough to say for sure.

WQUXGA should scale most widescreen resolutions using integer scaling, which means you should be able to run a game at 1280x800 and it'd be sharp. I'm more interested in the desktop space offered by a 9MP display tbh rather than gaming on it. The old IBM T221 used four DVI connectors to generate its WQUXGA image, but that really was designed for editing still images.

I see no reason why a graphics card with gen-lock on its Displayports should not be able to drive a WQUXGA screen at a reasonable refresh rate and colour depth :)
Aankhen 1st March 2007, 00:41 Quote
Quote:
Originally Posted by mclean007
Incidentally, I think you'll need more bandwidth than HDMI or DisplayPort can afford - the highest spec (1.3) only stretches to 10.2Gbps, and DisplayPort offers four 2.7Gbps channels (10.8 Gbps total). At a resolution of 3840 x 2400 x 24 bpp, that only gives you enough for 46fps, not even allowing for audio bandwidth and overheads.
Solved. ;) Are you sure your calculations are accurate, BTW? I'd done some approximations a while back, and I worked out that DisplayPort could handle 3,840×2,400 at 24bpp and 70 FPS. Of course, I'm not quite sure I did it correctly.
JADS 1st March 2007, 08:32 Quote
Quote:
Originally Posted by Aankhen
Solved. ;) Are you sure your calculations are accurate, BTW? I'd done some approximations a while back, and I worked out that DisplayPort could handle 3,840×2,400 at 24bpp and 70 FPS. Of course, I'm not quite sure I did it correctly.

Unfortunately UDI is not widely supported, but DisplayPort is. I think DisplayPort is obsolete before it ever hits the market, but it does look like we will all have DisplayPorts on our gfx cards rather than UDI ports.

Hmm...

3840 (H) x 2400 (V) x 60 (R) x 24 (C) = 13,271,040,000 (13.3 GBit/s)

This is beyond the max specification of DisplayPort of 10.8GBit/s, but within the UDI specification of 16GBit/s.

Ideally you'd want

3840 (H) x 2400 (V) x 120 (R) x 36 (C) = 39,813,120,000 (39 GBit/s)

Yet there is nothing remotely capable of supporting that much bandwidth!
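That arithmetic is a raw pixel rate only (no blanking intervals or line-encoding overhead included), and it can be checked with a couple of lines:

```python
# Raw video bit rate: horizontal x vertical x refresh (Hz) x colour depth (bpp).
def link_bits_per_second(h, v, refresh, bpp):
    return h * v * refresh * bpp

base = link_bits_per_second(3840, 2400, 60, 24)    # 13,271,040,000 ~ 13.3 Gbit/s
ideal = link_bits_per_second(3840, 2400, 120, 36)  # 39,813,120,000 ~ 39.8 Gbit/s
print(f"{base:,} bit/s vs {ideal:,} bit/s")
```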
Da_Rude_Baboon 1st March 2007, 10:56 Quote
Quote:
Originally Posted by mclean007
Sweet. 1080p is getting quite reasonable now - you can get a 40" 1080p Sony 40W2000 for not much over £1200, which is quite amazing.

I wish people would get over this 1080p marketing BS band wagon. Unless you are using a display 50 inches or above then 1080p is pretty much useless as you can not tell the difference.
DougEdey 1st March 2007, 11:01 Quote
Quote:
Originally Posted by Da_Rude_Baboon
I wish people would get over this 1080p marketing BS band wagon. Unless you are using a display 50 inches or above then 1080p is pretty much useless as you can not tell the difference.

I would like to know if someone running a 2407FPW notices the difference between 1080p and 768p
mclean007 1st March 2007, 11:26 Quote
Quote:
Originally Posted by Da_Rude_Baboon
I wish people would get over this 1080p marketing BS band wagon. Unless you are using a display 50 inches or above then 1080p is pretty much useless as you can not tell the difference.
Depends how close you sit mate. From a comfortable viewing distance you should easily be able to discern the additional resolution of 1080 lines over 720. To say it is useless for anything less than 50 inches is daft - otherwise, why do people spend a fortune on 1920x1200 and even 2560x1600 res monitors with diagonals of 'only' 24" or 30"?

A 40" screen viewed from 6ft has exactly the same perceived aspect as a 50" screen viewed from 7.5ft, so why do you say the latter benefit from 1080p while the former would not?

Also, a 1080p screen can play 720p content with nice 1.5x scaling, rather than crazy number scaling as required on a 1366x768 screen, and can play 1080i and 1080p content at its native res.
Da_Rude_Baboon 1st March 2007, 16:19 Quote
Monitor usage for computer applications is different to television viewing so monitor comparisons are a moot point tbh.

Who watches a 42" display at 6ft? By most standards the recommended viewing distance of a 42" screen would be 10 feet so anyone who would purchase a 50" screen to watch from 7.5 feet is an idiot imo.
mclean007 1st March 2007, 17:12 Quote
Quote:
Originally Posted by Da_Rude_Baboon
Who watches a 42" display at 6ft? By most standards the recommended viewing distance of a 42" screen would be 10 feet so anyone who would purchase a 50" screen to watch from 7.5 feet is an idiot imo.
Actually, the Society of Motion Picture and Television Engineers (SMPTE) recommends that your TV screen should subtend an angle of 30 degrees in the horizontal plane, which translates to a viewing distance of 1.87x screen width. The width of a 40" diagonal 16:9 screen is, by my calculations, a little under 34.9", so this gives an ideal viewing distance of 65.2", or 5'5". For a 50" screen, the figure is 81.5", or 6'9.5". Of course, it is a matter of personal preference. I like to sit reasonably close (in the 6ft range), so the screen fills more of my field of view, and from that sort of distance I can definitely distinguish 1080 lines of resolution from 720 - I guess if you are the kind of person to watch your TV from a very great distance, you won't be able to resolve the difference between 1080p and 720p, but then taken to its logical conclusion, you may as well watch SD rather than HD of any variety, because from a great enough distance it looks the same.

(Link - http://www.practical-home-theater-guide.com/Tv-viewing-distance.html)

In any event, it was just a comparison. Substitute 8ft and 10ft, or 10 and 12.5, it doesn't really matter. Point is, with 1080p content, a 1080p 40" screen will look superior to a 720p, a 1080i or a 1366x768 so-called 720p screen of the same size. If you disagree, that is your prerogative, and you are welcome to the savings you will make by buying a cheaper, lower spec screen. Personally, I have seen 1080p footage on a 1080p screen side by side with a 720p screen of similar size, and I know which I prefer. For me, the small extra outlay will be worthwhile.

EDIT: Your figures may be based on the guidelines for old-school analogue TVs, which (because of their lower resolution) would appear pixellated when viewed from closer than 3x the screen width. The article linked states that the human eye can resolve detail down to 1 minute of arc, so within your 30 degree viewing field from 1.87x screen width, you can resolve 1800 subdivisions. This correlates closely with the 1920 columns of a 1080p display (of course, the pixels closer to the view axis subtend a slightly smaller arc, while those on the screen edges subtend a slightly larger arc, but this effect is relatively small). As such, I would suggest that 1080p is the IDEAL resolution for viewing a screen of any size from the viewing distance of 1.87x width.
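The SMPTE distances quoted above follow directly from 16:9 geometry. A short sketch, assuming the 30-degree rule's factor of 1.87x screen width from the linked article:

```python
import math

# SMPTE 30-degree rule: ideal viewing distance ~ 1.87 x screen width,
# where width is derived from the diagonal and the 16:9 aspect ratio.
def ideal_viewing_distance(diagonal_in, factor=1.87, ratio_w=16, ratio_h=9):
    width = diagonal_in * ratio_w / math.hypot(ratio_w, ratio_h)
    return factor * width

print(f'40" screen: {ideal_viewing_distance(40):.1f} in')  # ~65.2 in, i.e. 5\'5"
print(f'50" screen: {ideal_viewing_distance(50):.1f} in')  # ~81.5 in, i.e. 6\'9.5"
```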
Vash-HT 1st March 2007, 19:15 Quote
Quote:
Originally Posted by Da_Rude_Baboon
I wish people would get over this 1080p marketing BS band wagon. Unless you are using a display 50 inches or above then 1080p is pretty much useless as you can not tell the difference.

I have a 37" 1080p LCD display, but I haven't tried doing anything in 720 vs 1080p on it. My 360 is hooked up to it with a VGA cable at 1920x1080, and so is my PC. I don't watch any TV on it, it's purely a gaming screen right now. I may try hooking up the component cables on my 360 and see if there's really much difference, but for me I wanted 1080p for the 1920x1080 res vs. the lower res of 720p TVs.
Da_Rude_Baboon 2nd March 2007, 10:32 Quote
Quote:
Originally Posted by mclean007
..snip..
(Link - http://www.practical-home-theater-guide.com/Tv-viewing-distance.html)

..snip..

My apologies; going by the link, my calculations were based on SD and DVD viewing.