bit-tech.net

Nvidia Analyst Day: Biting Back at Intel

Comments 1 to 25 of 27

Spaceraver 14th April 2008, 09:27 Quote
So we have another war on our hands... Good... That means nice prices...
p3n 14th April 2008, 09:43 Quote
If nvidia keep up this retarded product nomenclature/progress I hope intel squashes them.
r4tch3t 14th April 2008, 10:15 Quote
Hmm, that bit about the video encoding on the GPU sounds very interesting. My current choice in CPU is based heavily on the fact I will be encoding DVDs fairly regularly.
Tim S 14th April 2008, 12:57 Quote
Quote:
Originally Posted by r4tch3t
Hmm, that bit about the video encoding on the GPU sounds very interesting. My current choice in CPU is based heavily on the fact I will be encoding DVDs fairly regularly.

Yeah, it's something I'm really excited about. I've been a big advocate of quad-core for its media encoding capabilities... but when you can speed that up by between 10 and 20 times with a GPU, why would you want to buy a quad-core CPU? That's like an order of magnitude faster, and it's only going to get faster and faster with more SPs. :)
Kipman725 14th April 2008, 13:54 Quote
The man speaks the truth for the moment; I doubt it will be the case for many more years, though.
LeMaltor 14th April 2008, 13:59 Quote
Can someone explain what Fab, IP, IA and sandbagging mean please? Thanks >_<
Blademrk 14th April 2008, 14:33 Quote
Fab = Fabrication
IP = Intellectual Property
IA = Intel Architecture (x86?)
sandbagging = delaying releases / products
chicorasia 14th April 2008, 14:34 Quote
Just for kicks, I ran a benchmark on my parents' Dell - a Pentium dual-core at 3.00GHz with 1GB of RAM, nothing fancy, a basic productivity machine.

using the integrated intel GMA X3100: 303 3Dmarks
using a discrete Geforce 7300GT: 2100 3DMarks

I didn't run it using a GeForce 8800GT, but I'd expect that to reach at least 10000 3DMarks.

Oh well, a tenfold increase in gaming performance over the next few years won't be enough....

<RANT>My MacBook has a GMA950 integrated graphics chipset. Apple claims I can use it to drive an external monitor at 1920x1200 resolution. I have it connected to a 22" monitor at 1680x1050 and the image is chock-full of artifacts and rendering errors, even now as I am simply browsing the web. The same thing happens on a newer MacBook (GMA X3100), and on an Intel Mac mini (GMA950). Surprisingly enough, an older PPC Mac mini, with a Radeon 9250 integrated card, has no problems driving this monitor.</RANT>
johnmustrule 14th April 2008, 15:37 Quote
I like the competition, but Intel's got a mountain to climb. I think AMD's and Intel's solutions are going to be similar, which will be bad for Nvidia; but if Nvidia establishes a physics model with AMD then they might stand a chance of swinging users to adopt the "GPU physics" option - which is still hard to justify considering everyone under the sun is still going to have a CPU, albeit they'll need a GPU to play any reasonable games. As far as Mental Ray goes, it's amazing - I just started using it in 3DS Max and it's fast and high quality! As a CGI artist (a novice at best), I'd say water reflections are going to be the biggest ray-tracing beneficiary, but more importantly, subdivided surfaces, physical fluids and realistic hair are the most important improvements Nvidia discussed. Or maybe in three years Mental Ray will be efficient enough to run on future hardware.
Cupboard 14th April 2008, 16:54 Quote
The image about the difference in speed was very interesting - it seems to say that the GMA950 can cope with CoD4 but that the newer GMA 300/3100 can't
http://images.bit-tech.net/content_images/2008/04/nvidia_analyst_day_-_biting_back_at_intel/16.jpg

I do rather like the look of this though
http://images.bit-tech.net/content_images/2008/04/nvidia_analyst_day_-_biting_back_at_intel/97.jpg
frontline 14th April 2008, 17:14 Quote
The problem for Nvidia at the moment is that, despite being rivals in the CPU manufacturing business, Intel and AMD/Ati appear to have a cosy relationship.
Jojii 14th April 2008, 17:47 Quote
Quote:
... that will be known as Tesla2 and there have been several hints in the presentations at new features that we're likely to see included in that architecture.

Care to describe said hints? Mondays call for juicy speculation.
Tim S 14th April 2008, 18:01 Quote
The actual details of chip specifications are pretty scant, but the graph on page four titled "Era of Visual Computing" suggests that we're going to see a respectable performance boost with the new generation. I also believe we'll see some new features added: things like C++ support in CUDA (2.0), some form of tessellator (or subdivision surfaces) engine, and some optimisations to enable geometry synthesis and more realistic hair.

There were some more, but these are the ones that I remember because they were very obvious. The other hints came from the way certain things were said by Huang, Tamasi and Hegde in particular. :)
xtremeownage 14th April 2008, 18:55 Quote
Why Intel would say integrated graphics are better is beyond me... I for one hate it when companies decide to lie to consumers...

Integrated graphics don't run games like BioShock or Crysis at Medium, and definitely not at High or Very High settings. If they do, the frame rates will be as poor as shown in the figures. They won't be able to run Crysis on High in the next three years. Anyone who purchases this garbage from Intel's attempt to enter the graphics market is just decreasing the number of real next-gen graphics engines that will be running Half-Life 3, Crysis 2 and 3, and god knows what amazing games are around. If it weren't for Nvidia you wouldn't have had the Xbox with Halo and Halo 2, plus the Nvidia RSX GPU in the PlayStation 3! Besides, why purchase integrated graphics for games you won't be able to run? ^^

I'm so disappointed some people decided to take Intel's side. My only assumption is that they probably don't own a graphics card or don't play games. I love Intel for its processors, not its graphics. Companies should specialise in what they are good at instead of butting in and making things messy in an already loved and advanced industry. (CPU: 4 cores ----> GPU: 128 cores!)

For Intel to point the finger at discrete graphics cards and say they are not the future is like saying, OK, let's scrap all the visual goodness and go back to playing shoddy games without graphics on DOS machines. Those words spoken by Intel in the conference really annoyed many, and I'm glad the Nvidia CEO gave Intel the response they surely deserve. I'm surprised ATI hasn't said anything - their GPU is in the Xbox 360, and it runs Half-Life 2: Episode Two really well in PCs.
It's hard enough to give gamers the real visual experience already, so please, Intel, go away... I'm so annoyed now. I'm disappointed the Cell processor in the PS3 wasn't brought to desktops, because I would have scrapped Intel's junk (it's good junk, but old compared to the PS3's processing power).

I can already see it. Noobs brainwashed into buying this Larrabee (or whatever IGP CPU from Intel)... They go into the shop, get a PC and try to play Crysis or Crysis 2, ahahaha, then say WTF! After a night of cursing and ranting they return the PC after two days and ask for their money back. Next, they remember reading this post and go out and buy a GeForce 10,000GT, lol. Man oh man. They eventually realise their screen explodes into life - absorbed by the rich goodness, the stunning shadows. Man oh man, it's like being in heaven, he or she says, eyes wide open, mouth dribbling.

If you are really thick-headed, then after reading what I have said you would still go out to the shop and buy an Intel integrated graphics motherboard or PC to play BioShock in DX10 at high settings... For your sake, hang yourself for all our sakes.

(Sorry for my English if it is bad - it's not my first language.)

YAYAYA, my 9800GX2 GPU (birthday gift, ahaha) for now... eat that, Intel. Beat this f**ker's frame rates on High settings in the next two years and I'll kill myself.
Listen to this guy: "PCs are not for gaming." (Yes, that's true... PCs are for making games! So we still need the graphics cards, dumbass, to make the games.)
PCs are for making games for consoles etc. Consoles rock. PC games are for hardcore gamers playing large-scale strategy games like Galactic Civilizations 2 and Supreme Commander with a mouse for quick actions, plus for rich, high-end graphics games that surpass the console versions. The PC still remains and will continue until they make a console that has an upgradable GPU and a mouse.
Anakha 14th April 2008, 18:59 Quote
Again, little mention that RayTracing + CUDA = Massive Win! Raytracing is a very simple algorithm that is naturally parallel, and the more cores (Or in this case, "Stream Processors") you can throw at it, the better.

With 128 "cores", each tracing a single ray on a 1600x1200 screen with 10 "bounce" rays, you would only need a 9MHz processor per core to get a stable 60Hz display ((1600 × 1200 × 60 × 10) / 128 / 1,000,000). Considering most GPUs are in the hundreds of MHz range (if not the thousands), this gives a VERY detailed scene with LOTS of reflection. All you'd really need is a way to use that ray-traced image directly on the card (i.e. output from the stream processors directly to the framebuffer to be rendered) and you have real-time ray tracing out of the box.

I'd put a dollar down for that. Anyone else?
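Anakha's back-of-the-envelope arithmetic can be sketched as a quick script. This is a rough estimate only, assuming an idealised one ray processed per core per clock; the function name and structure are just for illustration:

```python
# Per-core clock speed (in MHz) needed to trace every ray each frame,
# assuming (unrealistically) one ray processed per core per clock.
def required_mhz(width, height, fps, bounces, cores):
    rays_per_second = width * height * fps * bounces
    return rays_per_second / cores / 1e6

# 1600x1200 at 60Hz, 10 bounce rays, spread across 128 stream processors
print(required_mhz(1600, 1200, 60, 10, 128))  # → 9.0 (MHz per core)
```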
EmJay 14th April 2008, 19:10 Quote
This reminds me of the sparring between Boeing and Airbus a few years back - Boeing talked up the size of their largest planes, and effectively goaded Airbus into one-upping them by making something even bigger. As soon as Airbus had sunk too much money into it to pull back, Boeing scrapped all their plans and announced that they'd be focusing on fuel efficiency instead - an extremely popular choice with the airlines. +1 for Boeing.

I'm wondering if Intel is doing the same thing here - making lots of noise about ray tracing, Larrabee, and the death of the GPU, coaxing nVidia into making even bigger (and more expensive) graphics cards, and then quietly making something totally different. People are screaming that current integrated graphics are worthless for gaming, and even a 10x improvement won't be enough, but let's not forget that the technology for decent graphics already exists - it's not like Intel has to research everything from scratch, they just need to find a version that doesn't violate copyright and start building it into their systems. Still not an easy task, but I'll bet they're going to push nVidia into catering to the gaming market (which is only a fraction of the computing industry as whole), while they quietly eat up everything else. The GPU may still exist - but I'm guessing the 'average computer' won't include one in another five years.
xtremeownage 14th April 2008, 21:23 Quote
Like I said, EmJay, the technology has been around for as long as I can remember, but the point I'm making is: unless the GPU is on a separate card rather than integrated, integrated graphics are not for gaming.

The games that come out in the future will push Nvidia and ATI to create better GPUs that system vendors like Dell, Apple and Alienware will put in their systems. The average low-end PC today from Dell is about $500 and has an integrated GPU from Intel. If you purchase this machine, either (1) your budget was $500 or below, or (2) it is an office PC that never uses the integrated graphics. No one gets an integrated GPU and uses it for the latest games. As a consumer in this industry I can already see that integrated graphics will need as many as 128 cores to match today's GPU power. Playing games on a future IGP in, say, two years would mean 4 cores × 10 = 40 cores - not sufficient.

The whole point of this argument between the two is Intel saying you don't need GPUs and should instead use integrated graphics, which is absurd considering the looooooooooow performance you get from IGPs.

I'm guessing processors from Intel with many cores will be revealed much, much later, though there is a prototype in place. At the pace GPUs are going, game developers will opt for GPUs over integrated graphics. ATI's GPU has over 400 cores - the ATI Radeon HD 3870 X2 - and though it's outperformed by the Nvidia 9800GX2, it is so advanced that I don't see Intel catching up in five years, because by then GPUs will cross the 1,000-core mark, processing teraflops of data and producing the most sophisticated graphics and physics ever seen. With GPUs shrinking in scale, even Intel's 80-core processor will have a hard time pulling off the spectacles the ATI and Nvidia cards will be producing. The world economy is growing, and every decade it becomes cheaper for the average user to purchase GPUs, due to increasing incomes and shrinking manufacturing processes that cut costs.
Other than purchasing PCs for office work, people will purchase handheld devices like an iPhone 2 (lol), use a wireless keyboard and run word processors and spreadsheets. Or plug in a USB monitor and do whatever work we please - see Nvidia's APX 2500:

http://www.nvidia.com/object/hh_games_demos.html

Purchasing a PC with furious processing power would be a leisure thing. I myself don't see the need for PCs with Intel integrated graphics. In fact, I see laptops today using Nvidia GPUs making PCs with integrated graphics look ancient. Handhelds will be for office work; PCs will be for playing amazing games. Consoles may replace PC gaming, and maybe graphics vendors like Nvidia will provide the horsepower for the consoles. I think the motherboard will soon shrink in size. The GPU could be the size of my wallet - who knows. THE POINT IS the GPU will LIVE ON, and IGPs WILL ALWAYS SUCK and won't be needed if most people can afford GPUs.
Bladestorm 14th April 2008, 22:53 Quote
Every time I think maybe the GPU market has finally settled to the point where I can feel justified in buying a new GPU (primarily to finally play BioShock, let's face it), I hear of something big coming out in another month or two and I put it off again.

It's getting to me a bit. I think it's probably a contributing factor, though, that when I put my current PC together, the 7900GS came out between me buying my parts and actually finishing building the thing (I had to mod the case a bit to get the watercooling sorted first, and my dad had an accident with a too-long screw, leading to a radiator repair that delayed things by a couple of weeks) and it delivered something like 30% more performance for a chunk less cash (and given the GT was £220 at the time, that one stung a fair bit!).

Right now I'm tempted both by a significantly factory-overclocked 8800GT 512MB for £129 and by a somewhat overclocked 8800GS 512MB for £165, both of which seem like nice value... but if they're only good value because something much better (and/or more forward-looking) is coming out in a month or two, it might all be a false economy >.<
r4tch3t 15th April 2008, 01:21 Quote
Quote:
Originally Posted by Anakha
Again, little mention that RayTracing + CUDA = Massive Win! Raytracing is a very simple algorithm that is naturally parallel, and the more cores (Or in this case, "Stream Processors") you can throw at it, the better.

With 128 "cores", each tracing a single ray on a 1600x1200 screen with 10 "bounce" rays, you would only need a 9MHz processor per core to get a stable 60Hz display ((1600 × 1200 × 60 × 10) / 128 / 1,000,000). Considering most GPUs are in the hundreds of MHz range (if not the thousands), this gives a VERY detailed scene with LOTS of reflection. All you'd really need is a way to use that ray-traced image directly on the card (i.e. output from the stream processors directly to the framebuffer to be rendered) and you have real-time ray tracing out of the box.

I'd put a dollar down for that. Anyone else?
Doesn't quite work like that. You would definitely need more than one clock to calculate the vector of the reflected bounce, plus how much of the intensity should be kept and how much dispersion it needs. And that would only be the ray tracing: you still have to calculate where everything is in three-dimensional space before you can trace rays against it (although that may be done by the CPU). So let's say it takes 50 clocks to calculate a bounce - that means you would need ~500MHz, and that's just for the ray tracing; you still have to render the geometry etc.
Don't quote me on this, as I don't have a good understanding of how ray tracing works (I know the basic idea) or how it would affect other areas of graphics processing. If ray tracing were as easy as resolution × refresh rate / (#stream processors × clock speed), it would have been done years ago instead of creating raster engines to do the work.
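r4tch3t's correction amounts to multiplying Anakha's figure by a clocks-per-bounce factor. A minimal sketch using his hypothetical 50 clocks per bounce (it lands at 450MHz, which he rounds to ~500MHz):

```python
# Same estimate as before, but charging a number of clock cycles per
# bounce instead of assuming one ray completes per core per clock.
def required_clock_mhz(width, height, fps, bounces, clocks_per_bounce, cores):
    total_clocks = width * height * fps * bounces * clocks_per_bounce
    return total_clocks / cores / 1e6

# 1600x1200 at 60Hz, 10 bounces, 50 clocks each, 128 stream processors
print(required_clock_mhz(1600, 1200, 60, 10, 50, 128))  # → 450.0 (MHz)
```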
BlueOcean 15th April 2008, 01:38 Quote
This person is a CEO - he runs the Nvidia corporation.
xtremeownage 15th April 2008, 04:28 Quote
Bladestorm, you are so right. It's annoying how they release cards every three months or so, each one better than the previous card. I stayed with my 8800GTS 640MB for a long time and overclocked it slightly, but only got marginal gains. I think games should be made for the first high-end card with a GTS or GT mark from Nvidia, e.g. the 8800GTS. That way everyone can enjoy the game, with the GTX optional for better frame rates. The GTX should be more powerful; better yet, they should drop the GTX after a year and replace it with dual solutions (GX2s). The reason being, consoles stay with the same graphics hardware for years and console gamers enjoy all their games with no system spec problems. The lifespan of a console is 4-6 years, and in that time the number of games produced is massive, and they're pretty good. For example, the upcoming Star Wars: The Force Unleashed looks pretty good on a GPU in the Xbox 360 that's almost two years old. It's sad to see some games come out that can't be played on high-end systems with all the visuals turned up. When they say SLI is not the answer for games, I clearly agree, because I can't afford SLI - I would need a new motherboard. However, the 9800GX2 gave me the opportunity to experience a level of gaming I've always wanted, which resembles SLI and is faster. I experienced twice the performance on one card. I was pretty impressed, and this card is my best buy from Nvidia to date.
SLI should be replaced by single-card GPU solutions because they are more efficient at processing data. Quad is optional for those who want it, but buying two cards should at least mean we get a 15% discount. OMG Nvidia, your stuff is too expensive.
metarinka 16th April 2008, 10:47 Quote
So the war is Intel vs Nvidia, integrated vs discrete? Whatever happened to ATI/AMD? We need all the extra competition and innovation we can get. They've already mentioned sandbagging on Nvidia's part; sometimes being on top can make you lazy (like AMD trouncing Intel with the X2).

At any rate, I take most things Intel says with a grain of salt. They have a really smart PR team and can create buzz where nothing exists - such as Centrino for laptops, which was what? An Intel CPU + chipset + wireless controller (I'm forgetting the third part). Big deal. Same with how hard Intel pushed BTX, a very terrible idea.
Tim S 16th April 2008, 10:56 Quote
Intel will be releasing a discrete GPU in a couple of years (that's what all this Larrabee talk is)... Basically, Intel will play down the viability of current GPUs until it's convenient for it to say otherwise. Having said that, Nvidia have done the same ("we won't make a CPU, we won't make a CPU"... then they launch the APX 2500, which is a system-on-chip for mobile phones and includes CPU logic).

Basically, they're both as bad as each other. :)
sbenrap 17th April 2008, 10:30 Quote
I'm quoting the last part of the article:
"Huang doesn't seem fazed by Intel's push into his territory at the moment, but he said he remembers the scars he got following the release of the GeForceFX architecture"

Can someone please elaborate what "scars" he's talking about?
I don't remember any big deal between nVidia and Intel when the GeForceFX was released...

Thanks :)
Tim S 17th April 2008, 10:39 Quote
Quote:
Originally Posted by sbenrap
I'm quoting the last part of the article:
"Huang doesn't seem fazed by Intel's push into his territory at the moment, but he said he remembers the scars he got following the release of the GeForceFX architecture"

Can someone please elaborate what "scars" he's talking about?
I don't remember any big deal between nVidia and Intel when the GeForceFX was released...

Thanks :)

There were no scars between Intel and Nvidia... just the fact that the GeForce FX could quite easily have been the end of Nvidia, had it not pulled itself out of that hole with the 6-series. :)