bit-tech.net

Rumour: AMD to support USB 3.0

AMD's Hudson D1 southbridge - due towards the end of the year - could feature USB 3.0 support.

Rumour has it that AMD is looking to get one over on long-time rival Intel with its next southbridge, by offering integrated USB 3.0 support.

According to everybody's favourite rumour broker, DigiTimes, AMD's next-generation laptop southbridge - the Hudson D1 - could well feature support for the high-speed interconnection offering by the end of the year.

Designed for ultra-portable laptops and netbooks, the Hudson D1 southbridge is expected to appear in products during the fourth quarter of this year - and if the USB 3.0 rumour is true, AMD will beat Intel to the punch by quite some time, with Intel not looking to include USB 3.0 support in its own chips until 2012 at the earliest.

DigiTimes' claim comes from "sources from notebook makers" who, in turn, claim that AMD is talking to Renesas Electronics - since merged with NEC - about licensing the technology for use in its upcoming chipsets.

If true, it's a canny move on AMD's part: by including USB 3.0 support in its southbridge, AMD can offer a one-stop solution for manufacturers wanting to produce ultra-portable devices that offer the latest connectivity - whereas those looking to use Intel chips will be left scrambling for a third-party USB 3.0 controller, which adds complication - and additional power drain - to the design.

So far AMD hasn't confirmed or denied the rumours, so we may just have to wait until the end of the year to see what future ultra-slim AMD-based laptops are packing.

Is USB 3.0 a major consideration for future laptop purchases, or do you think that Intel is right to sit this one out and let others - including AMD - try the technology out and gauge demand? Share your thoughts over in the forums.

17 Comments

r3loaded 28th July 2010, 10:09 Quote
Well, people aren't going to use USB 3.0 devices until their computers have the right port for it. I don't know what Intel's playing at tbh.
John_T 28th July 2010, 10:55 Quote
Quote:
Originally Posted by r3loaded
Well, people aren't going to use USB 3.0 devices until their computers have the right port for it. I don't know what Intel's playing at tbh.

Exactly!

I want USB 3.0, though I have no plans to write to Intel to tell them that - I'll just buy it when it's there.

2012? That's a long lead time, I can see that being revised...
The_Beast 28th July 2010, 10:58 Quote
eSATA is where it's at :)
Deadpunkdave 28th July 2010, 12:35 Quote
Intel are playing at launching Light Peak at twice the speed of USB 3, rising to 20x the speed of USB 3 over ten years. I expect they'll be plenty happy for AMD to have the expense of doing this if they can announce that they will launch Light Peak products within ~4 months of the AMD products. People will wait for the superior technology - just not too long.

Edit: From Intel's research website http://techresearch.intel.com/articles/None/1813.htm:
Quote:
We expect that the components will be ready to ship in 2010.
Altron 28th July 2010, 14:30 Quote
Quote:
Originally Posted by Deadpunkdave
Intel are playing at launching Light Peak at twice the speed of USB 3, rising to 20x the speed of USB 3 over ten years. I expect they'll be plenty happy for AMD to have the expense of doing this if they can announce that they will launch Light Peak products within ~4 months of the AMD products. People will wait for the superior technology - just not too long.

Edit: From Intel's research website http://techresearch.intel.com/articles/None/1813.htm:
Quote:
We expect that the components will be ready to ship in 2010.

That's fine and dandy, but it's not going to kill off USB.

USB has backwards compatibility, and it's cheap.

From the sounds of it, Light Peak looks to be aimed at Ethernet and SATA (maybe eventually HDMI) moreso than at USB.

There are a number of advantages to USB, not the least of which is that 5V 500mA bus. I don't think there would be any advantage to moving towards a fiber interconnect for basic peripherals that already don't come close to saturating the USB 2 connection, much less the USB 3 connection. Stuff like keyboards and mice transfer a minuscule amount of data compared to a hard drive, but benefit from having a single connection for data and power, and a low cost of manufacturing. Your old USB devices won't work on Light Peak, but they will work on a USB 3 controller.

I lol'ed when I read that they could transfer a 25GB Blu-ray in 30 seconds. That's fine and dandy, way to go Intel. You've invented such a fast connection. The hard part is over. Now all the manufacturers have to do is design a Blu-ray drive that reads at nearly 1GB/s, and a hard drive that writes at nearly 1GB/s. Shouldn't be that difficult at all. The only thing that has a realistic shot of hitting that speed within the next couple of years is SSDs, but SATA 6Gbps is nearly as fast, and already on the market. SATA is limited by its range, however.
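The arithmetic behind that complaint is easy to check. A rough sanity check, where the "effective" rates below are assumed ballpark figures after protocol overhead, not spec values:

```python
# Back-of-the-envelope check of the "25 GB Blu-ray in 30 seconds" claim.
# The link/device rates below are assumed ballpark figures, not specs.

def required_rate_mb_s(size_gb: float, seconds: float) -> float:
    """Sustained rate (MB/s) needed to move size_gb in the given time."""
    return size_gb * 1000 / seconds

bluray_rate = required_rate_mb_s(25, 30)
print(f"Needed: ~{bluray_rate:.0f} MB/s sustained")  # ~833 MB/s

# Roughly what common links and devices of the day could sustain:
ballpark_mb_s = {
    "USB 2.0 (effective)": 35,
    "SATA 3Gbps (effective)": 300,
    "SATA 6Gbps (effective)": 600,
    "7200rpm HDD (sequential)": 120,
}
for name, rate in ballpark_mb_s.items():
    verdict = "can" if rate >= bluray_rate else "cannot"
    print(f"{name:26s} {verdict} keep up")
```

With these numbers, nothing on the storage side comes close to saturating the link, which is exactly the point being made.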

A very high speed optical link wouldn't need to be in basic peripherals. It would have its biggest advantage where very high data transfer speeds are needed over long distances. And that's Ethernet. To me, Light Peak doesn't sound like a better version of USB... it sounds like a common standard for Ethernet, HDMI, and eSATA. Your TV plugs in, and you have internet access, and the ability to play off your computer's hard drives, or set it up to display the computer's video. You have a NAS in a different room that is just as fast as the hard drive in your machine. Two computers far apart can share files, and even share processing power and memory, if the latency is low enough and the data rate is high enough.
Krikkit 28th July 2010, 14:37 Quote
Hit the nail on the head for me there Altron - USB is *the* standard right now, because it's totally convenient. Not because it's faster, less complex or better specced than its rivals but because it's cheap, backwards compatible, and everywhere.

LightPeak is exciting tech - I can't wait for optical connections to become commonplace and liberate us from the crappy copper interconnects for high-speed devices, but we still need something convenient for small peripherals and such. That, like it or not, is USB.
leexgx 28th July 2010, 15:31 Quote
USB 2 is 400-500mA (500mA is the spec), but not all laptop and motherboard makers follow the spec correctly and can't actually provide 500mA. Like the Sony Vaio I was playing with last night: I couldn't get a USB mobile broadband dongle to work, as it would reset whenever it connected to 2G or 3G. He had to use a powered hub, or a two-to-one USB Y-cable to draw extra power from a second USB port.

900mA for USB 3 (again, in theory). You tend to only find eSATA on higher-end boards (some of the barebone systems I get now have it). Intel dragging out the USB 3 rollout is going to hamper USB 3 for the next five to seven years - it takes three to five years from when Intel puts something into its chipsets for it to become a standard.
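The power budgets being discussed are simple to lay out. A minimal sketch - the dongle's peak draw is an assumed illustrative figure, not a measured one:

```python
# USB port power budgets on the 5 V bus: 500 mA is the USB 2.0
# high-power limit, 900 mA the USB 3.0 one. The 3G dongle's peak
# transmit-burst draw below is an assumed value for illustration.

def port_watts(ma: int, volts: float = 5.0) -> float:
    """Power budget in watts for a port rated at `ma` milliamps."""
    return ma / 1000 * volts

usb2_budget = port_watts(500)   # 2.5 W
usb3_budget = port_watts(900)   # 4.5 W
dongle_peak_w = 3.5             # assumed peak draw at 3G transmit

print(f"USB 2.0 budget: {usb2_budget} W, USB 3.0 budget: {usb3_budget} W")
for name, budget in (("USB 2.0", usb2_budget), ("USB 3.0", usb3_budget)):
    ok = dongle_peak_w <= budget
    print(f"Dongle on {name} port: {'OK' if ok else 'brownout/reset likely'}")
```

Under these assumptions the dongle exceeds a single USB 2.0 port's budget, which is consistent with the reset behaviour described, and why a powered hub or Y-cable helps.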
Deadpunkdave 28th July 2010, 16:52 Quote
Quote:
Originally Posted by Altron
That's fine and dandy, but it's not going to kill off USB.

I quite agree. However, USB 2 works perfectly well for peripherals and the myriad uses it is currently employed in, apart from data transfer. It's a great technology in its own right. USB 3 is striving primarily for that faster data transfer, and that is precisely where optical technology is going to win out. I really don't think that it's too far-fetched to have a Light Peak external hard drive, and once there is, where's the advantage of the USB 3 standard over USB 2?

The other change, iirc, is that USB 3 will be able to supply more power. If that means it can do away with the power brick for external HDDs, then there's a case for making that trade-off. Within 2-3 years, though, the data rate of optical connections will be so far ahead that that advantage will disappear.

Full disclosure: not an Intel fanboy btw, Light Peak might not even be the dominant optical standard. Starting my MSc in Optics and Photonics in October, however, so this kind of tech is going to be my field of expertise ;)
Altron 28th July 2010, 17:47 Quote
Quote:
Originally Posted by Deadpunkdave
I quite agree. However, USB 2 works perfectly well for peripherals and the myriad uses it is currently employed in, apart from data transfer. It's a great technology in its own right. USB 3 is striving primarily for that faster data transfer, and that is precisely where optical technology is going to win out. I really don't think that it's too far-fetched to have a Light Peak external hard drive, and once there is, where's the advantage of the USB 3 standard over USB 2?

The other change, iirc, is that USB 3 will be able to supply more power. If that means it can do away with the power brick for external HDDs, then there's a case for making that trade-off. Within 2-3 years, though, the data rate of optical connections will be so far ahead that that advantage will disappear.

Full disclosure: not an Intel fanboy btw, Light Peak might not even be the dominant optical standard. Starting my MSc in Optics and Photonics in October, however, so this kind of tech is going to be my field of expertise ;)

Oh cool. Where are you going to school? I'm about 6 credits off from a BSc in optics.

I don't think it is so much that USB 3 will have a huge advantage over USB 2, but rather that manufacturers will move to USB 3 just so they can have that compatibility. There will be a solid two or three generations of fast USB 3 external storage devices, and people will want support for them. Already there is plenty of first-gen USB 3 storage out, and more will arrive before optical solutions become mainstream. People will be more inclined to buy USB 3-equipped computers for future-proofing. It's quickly becoming standard on enthusiast equipment (alongside SATA 6Gbps).

I remember the USB 1.1 to USB 2.0 transition pretty well. Like the current computers, the computers then would have a full set of USB 1.1 ports (which was generally two on the back and two on the front if you were lucky), and then a special pair of USB 2.0 ports.

I made sure the computer I bought in 2002 had those two special USB 2.0 ports to make it "future proof". I think it was three years and two computers later that I actually used a device that needed USB 2.0, which was a 256MB flash drive. The only USB devices I had in between were a printer and a mouse, both of which were USB 1.1 devices.

Right now, we're seeing only a couple of peripherals that would benefit from USB 3, yet every enthusiast motherboard manufacturer is including it. That's the beauty of backwards compatibility. You might not have a USB 3 device yet, but you can use the port like you would a normal USB 2 port, except you have the capability of moving to USB 3 devices in the future.

There's nothing to be lost by transitioning to USB 3... there aren't any trade-offs. All your old devices will work just as well, and you have flexibility. I can take my USB 1.1 printer from 2001, and use it in a computer equipped with only USB 3 ports. It will work fine. It's not so much that there are huge gains to be had in areas aside from fast external storage (and eSATA does a pretty good job handling high speed external drives), but that there are no reasons not to switch over.

LightPeak will have a harder sell, depending on how they market it. Look at FireWire. It is faster than USB 2, and very popular amongst a very specific set of users. And I've always had a FireWire port on my mobo, just in case. I've never owned a FireWire device. Ever. I do have a FireWire cable, I think. It pisses me off because I always find it when I am looking for a USB cable in my old Coolermaster case box that is full of cables.

MOST users of external storage are using it for backups or data warehousing. You want a big, cheap drive, so that means a 5400 RPM USB 2.0 drive for most people. The people that use their external storage often are the ones who spend a little extra for fast Firewire or eSATA drives.

I just can't see this being able to compete with SATA. As much as people talk smack, copper is cheap and fast. The biggest advantage to glass is distance - it can go a lot further without signal degradation, at much higher speeds. But for the 1-2 meter long cables used for computers, fiber doesn't give you an advantage. It adds cost and complexity. I don't care how cheap you can build the materials, you're competing with a cable that costs $0.10 to make. You'll still need the correct controller electronics, just instead of them directly connecting over copper, you're modulating it to fiber, and then demodulating it at the other end.

Where the advantage lies is with long distance, and with very high speed data access. NAS and streaming video. For the average home user, they will appreciate the simplicity. Even if the fiber only costs a couple of bucks to make, most consumers won't want to pay that extra fee on their $20 flash drive or $50 external hard drive. Where they will pay that fee is on their $1000 computer or their $1000 HDTV.

The newest HDMI standard includes provisions for 100 megabit ethernet over HDMI. That's where we are headed - a single very high bandwidth connection. Forget about set top boxes, DVRs, HTPCs. If they can make an inexpensive connection that works at 100 meters and can do over 10 gigabits, you won't need any of that.

You'll have one big server in your closet, hooked up to the internet. Want to watch TV? Fire up the HDTV, and it will stream the video signal from the server to the TV. Want to watch a movie? It can stream it right from the hard drive. Internet? You bet. Gaming? Just grab a wireless keyboard and mouse, the connection is fast enough that the server hardware can do all of the processing. It's your personal cloud. You might have a PC, but it might be a dumb terminal, just connecting to your server and having the server do all of the work.

It's all possible with very low latency, inexpensive >10 gigabit home networking.

Many people already have all of their media go through fiber. The only copper running into my house is for power. There's a single fiber line that comes into my basement, and into a big white box mounted on the wall. All of the phones, all of the televisions, and all of the computers go into that box.

Right now, it's segregated. There's no internet or phone on the TVs. No phone or TV on the computers. No internet or TV on the phones. Because that's what people are used to. But it doesn't have to stay that way. It won't stay that way. If my computer is connected with a gigabit ethernet to the box that my TV comes out of, why can't I stream TV to my computer? Make landline phone calls with a headset? Why can't I send the output of my monitor right to a TV, to watch a movie on it?

It's because that's the pricing model people are used to, and it's because there isn't enough bandwidth for all of this multi-tasking. If I want internet on my TV, I need to run a separate ethernet line to it. If I want TV on my PC, I need to run a separate coax line to it, and get a TV tuner. But the distinction is only in our perceptions - the phones, the TV, and the internet are all just packets on that same piece of fiber running out of my house.

A low-latency, high-speed fiber connection, standardized to every 'smart' device... that would allow them to break down the walls separating these different data types. That's where fast and cheap fiber will take us - not just making a new, more complicated way to plug in our external hard drives, or to connect our mice (I still use a PS/2 keyboard).

It's a beautiful thing.
Deadpunkdave 28th July 2010, 18:56 Quote
Ah nice, don't think we can specialise so early this side of the pond. I'm doing my masters at Imperial College London.

I do hope your vision of the future home network is right Altron. I read recently (ish) that a number of big players are trying to make an AV standard using Cat5e and Cat6 cables to replace HDMI which will maintain most of the segregation you talk about, though it will offer improvements and incorporate some connectivity.

Perhaps I am in much more of a minority than I thought in my willingness to pay extra for peripherals and components which use optical technology.
HourBeforeDawn 28th July 2010, 19:09 Quote
What I want is eSATA with power built into a single plug. Whoever first thought of eSATA should have been smacked for not incorporating power into the plug design.
MrZephyr 28th July 2010, 19:33 Quote
Quote:
Originally Posted by HourBeforeDawn
What I want is eSATA with power built into a single plug. Whoever first thought of eSATA should have been smacked for not incorporating power into the plug design.

Agreed! Got myself one of these instead: http://www.antec.com/Believe_it/product.php?id=MjA3NQ==
Altron 28th July 2010, 20:30 Quote
Quote:
Originally Posted by Deadpunkdave
Ah nice, don't think we can specialise so early this side of the pond. I'm doing my masters at Imperial College London.

I do hope your vision of the future home network is right Altron. I read recently (ish) that a number of big players are trying to make an AV standard using Cat5e and Cat6 cables to replace HDMI which will maintain most of the segregation you talk about, though it will offer improvements and incorporate some connectivity.

Perhaps I am in much more of a minority than I thought in my willingness to pay extra for peripherals and components which use optical technology.

Copper ain't so bad. The biggest issues are EMI/RFI, and signal degradation over distance. Short runs and shielded cables solve most of those problems. HyperTransport, for instance, runs over copper. It's short, it's in an EMI-shielded case, and it can do 50+ gigabits per second.

Fiber optics ain't perfect, either. You still have attenuation, just not as pronounced. A big issue is group velocity dispersion. "Chirping" results in signals spreading out. To increase the data rate, you need to make the fibers shorter. Granted, this is usually a case of making a 50km cable instead of a 100km cable, but it's an issue. To put it simply, a fiber optic link works by flashing a laser very quickly, as a series of ones and zeros. However, due to a phenomenon called pulse broadening ("chirping") that occurs in dispersive media such as fused silica, the ones and zeros blur together after a while, because the pulses broaden into each other. The longer the cable, the more chirping occurs. The more chirping occurs, the lower the maximum data rate. If you have a very short fiber, you can flash the laser on and off at hundreds of gigahertz, because the pulses won't broaden enough to stop you distinguishing a 1 from a 0. But if you have a 500km fiber, you must flash the laser at a much lower speed, because the pulses will broaden together after a certain distance, and you won't be able to tell a 1 from a 0.
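That length-versus-bit-rate trade-off can be sketched numerically. A rough model using the common rule that the pulse spread should stay under about a quarter of a bit period; the dispersion coefficient and source linewidth below are assumed illustrative values for standard single-mode fibre at 1550 nm, not measurements:

```python
# Sketch of the dispersion ("chirping") limit described above: pulse
# spread grows linearly with fibre length, so the usable bit rate falls.
# D and the source linewidth are assumed illustrative values.

def max_bitrate_gbps(length_km: float,
                     d_ps_nm_km: float = 17.0,    # assumed dispersion coeff.
                     linewidth_nm: float = 0.1):  # assumed source linewidth
    """Rough dispersion-limited bit rate using B <= 1/(4*dt)."""
    spread_ps = d_ps_nm_km * length_km * linewidth_nm  # pulse broadening
    return 1.0 / (4 * spread_ps) * 1000  # convert 1/ps to Gbit/s

for km in (1, 50, 100, 500):
    print(f"{km:4d} km -> ~{max_bitrate_gbps(km):7.1f} Gbit/s")
```

With these made-up inputs the limit is in the hundreds of Gbit/s at 1 km but under 1 Gbit/s at 500 km: halving the fibre length doubles the achievable rate, which is exactly the "make the fibers shorter" point above.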

So fiber is the next step, and it's where communications are heading, but it is not the second coming. It solves many of the problems with copper, but presents many new ones. It has an advantage for very long distance communication, but that advantage is minimal at the short distances your peripherals would be using, and the increased cost and complexity of going to fiber would outweigh it.

Short fiber will make more sense once actual optical computers make an appearance, since the signal can stay optical the entire time. But simply converting an electrical signal to an optical signal, then back to an electrical signal a couple feet later is kinda silly. It really would only have a tangible benefit once you get above 10 meters, when the attenuation in copper starts to make high speed communications difficult.

But, right now, it's not a crazy leap in data rate. A copper HDMI cable can do about 10 gigabits per second, same as this, and can do it at 10 meters. The different buses in the motherboard can do way faster than 10 gigabits.

It's not energy efficient, either. A regular ol' copper cable takes an electrical signal and moves it, attenuating it slightly based on how big the signal is and the conductivity of the cable. This takes an electrical signal, uses it to modulate a laser (which requires power), then uses sensors to read the laser and produce an identical electrical signal (which also requires power).

In terms of energy efficiency, it scales better. Once the optical signal is created, it has much lower attenuation than the electronic signal. So there would be a critical value of distance where the copper is more efficient below that distance, but the fiber is more efficient above that distance. I don't know power consumption numbers for this, but I'd be shocked if it is less than ~10 meters.
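That crossover argument can be made concrete with a toy model. Every number below is invented purely for illustration: fibre pays a fixed electrical-to-optical conversion cost per bit, while copper's cost per bit grows with distance as drivers work against attenuation:

```python
# Toy model of the copper-vs-fibre energy crossover discussed above.
# All per-bit energy figures are invented for illustration only.

FIBRE_FIXED_PJ = 50.0    # assumed pJ/bit: laser driver + photodetector
COPPER_FIXED_PJ = 5.0    # assumed pJ/bit: copper transceivers
COPPER_PJ_PER_M = 4.0    # assumed pJ/bit per metre of copper

def copper_pj(metres: float) -> float:
    return COPPER_FIXED_PJ + COPPER_PJ_PER_M * metres

def fibre_pj(metres: float) -> float:
    return FIBRE_FIXED_PJ  # distance cost treated as negligible here

# Distance where the two per-bit costs are equal:
crossover_m = (FIBRE_FIXED_PJ - COPPER_FIXED_PJ) / COPPER_PJ_PER_M
print(f"With these made-up numbers, fibre wins beyond ~{crossover_m:.1f} m")
```

Under these assumptions the crossover lands at a bit over ten metres - consistent with the intuition above that copper stays more efficient over typical desk-length cable runs.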

I read that Cat6 article too. Interesting. What confused me, though, was this - are they planning on implementing a network packet protocol for HDMI, or simply using Cat5e/Cat6 as a conductor? I've seen on Monoprice some wall plates which have an HDMI port and two RJ-45s. Because Cat5e is much cheaper and more flexible than HDMI cable, the idea is that you run two Cat5e cables in the wall from one plate to the other across the house, and connect them such that the HDMI signal travels directly over the Cat5e. Obviously, if you plugged these into a switch or router, it wouldn't work. Now, I'd be interested if it were a protocol for streaming HD video over the network, using TCP/IP packets.

On a related note, HDMI 1.4 supports 100 megabit Ethernet over the HDMI cable. That's a step in the right direction.
Chimel 29th July 2010, 21:17 Quote
Fiber optics may be faster, but I am not sure how they suit computer cases where you need to tap cables, make them take sharp turns, sometimes use perpendicular connectors, etc.

In any case, today's technology is USB 3.0, so it seems aberrant to wait until 2012 to release USB 3.0 connectivity, when possibly some new USB 3.5 or 4.0 standard will have appeared by then. If you buy a laptop with an Intel USB controller today, you will never be able to fully use USB 3.0 devices, just because Intel made the wrong technological choice in not including USB 3.0 support? Maybe it's only computer enthusiasts who require USB 3.0 today, but I'm pretty sure that any laptop owner, not just enthusiasts, will be pissed off in 2012 if the laptop they bought six months earlier doesn't support USB 3.0 devices, which by that time will be mainstream, if not on the verge of obsolescence.
l3v1ck 30th July 2010, 10:52 Quote
Question: can you add USB3 to existing laptops with an express card?
Elton 30th July 2010, 19:51 Quote
Quote:
Originally Posted by l3v1ck
Question: can you add USB3 to existing laptops with an express card?

I would presume so, it's basically a PCI-E extension.

Where is that ExpressCard 2.0 anyways?
techU 7th November 2010, 00:52 Quote
I realise this thread is quite old now, but I just had to register to comment ;)

Altron, you make good, accurate points regarding the speed of Light Peak and its many uses... BUT you totally miss the obvious when it comes to low-speed USB peripherals.

Get a simple and cheap USB-to-Light Peak dongle - one is sure to appear two minutes after Light Peak kits hit the shops - plug it in anywhere along the up-to-100-metre/300-foot hair-thin flexible fibre chain, connect, for instance, your generic USB TV dongle, and use that USB device on any Light Peak-connected PC. Done.

Along with USB-to-Light Peak, I expect to see docking stations for all these other devices (hard drive, FireWire, Ethernet etc.), and anything else the third-party vendors think they can make money on.

Come Q1 2011, I also expect to see Light Peak fibre wrapped in copper to provide power at a far better distance than any USB cable. A 5 metre/15 foot USB extension can't power that USB TV dongle, for instance, or the far more common case for an extension: a wireless 11g/n card set up to run as a wireless router, etc.


You already said Light Peak is a multi-protocol fibre cable, so you must also know that it's designed to carry all these protocols AT THE SAME TIME.

http://www.youtube.com/watch?v=izNoF1SWtSg

"...With Light Peak, Apple asked Intel to develop a single data port that could supply multiple, high speed streams of data capable of carrying virtually any type of signaling: networking protocols like Ethernet and Fibre Channel; standard audio and video signals such as S/PDIF, HDMI and DisplayPort; and serial interfaces such as FireWire, USB, and eSATA. Using optical signaling, Light Peak can achieve very high data speeds over relatively long cables that can be very thin; copper cables have problems with signal attenuation, electromagnetic interference, and bulk.

Light Peak offers the capacity to upgrade existing signaling protocols to work over high speed optical cables driven down in cost by volume production. Additionally, with any type of signal available through a single optical port, both notebooks and smaller mobile devices can shed today's overlapping variety of limited capacity ports for a single pipe that delivers virtually any kind of data at extremely high speeds. This would allow a laptop to plug into a monitor via one thin cable, and then allow the display to offer standard jacks such as USB and Ethernet networking. ..."

Read more: http://news.cnet.com/8601-30685_3-10360047.html

The number one Light Peak device and driver off the bat from the third parties will be - or at least should be - 'Ethernet over Light Peak', as it's just a simple Ethernet-to-Light Peak driver patch once you daisy-chain your two-channel transceiver four-port PCI-E cards and fibre cable together with another PC's card, etc.

Tell me you don't want the potential of simple and quick 10 Gigabit Ethernet for all your LAN PCs and FreeNAS storage devices as your very first Light Peak device, and I'll show you a non-technical reader who doesn't get the whole simple point - or the point of a LAN, come to that.

That 10 Gigabit Ethernet alone is far more than enough to justify Light Peak today, never mind all this "but, but, what about low-speed USB". Get yourself a cheap USB-to-Light Peak docking station (with power?) for your old devices when they arrive, and be happy.