bit-tech.net

The Problem with Real World Benchmarking

Comments 1 to 25 of 50

Nexxo 5th July 2011, 09:05 Quote
It's overcomplicating things. Why not simply ask:

But can it run Crysis? :p

EDIT: Oh, and in response to Tulatin's post (#4): First! :p
Apocalypso 5th July 2011, 09:18 Quote
Quote:
Originally Posted by Nexxo
It's overcomplicating things. Why not simply ask:

But can it run Crysis? :p

I could justify upgrading our entire estate based on that ;)
Tulatin 5th July 2011, 09:21 Quote
"But can it run Crysis?"
Congratulations, you're responsible for making the 4,877,921,204th post containing this joke. Unfortunately, it's now less funny than "lolcats". You could have at least wasted your comment by partaking in Internet tradition and spewing "First!" meaninglessly into the world.

If only people contributed meaningful discussion to articles instead, perhaps wondering whether Intel influences BAPCo's habit of focusing on only a tiny sliver of the real world?
Pete J 5th July 2011, 09:25 Quote
As pointed out by the article, it's unsurprising that Nvidia and AMD walked out because *puts on whiny voice* 'it doesn't represent GPGPU usage'.

What is typical PC usage? Most of the time I'm using an internet browser with about ten pages open, Outlook, three or so Word documents, a few PDFs, an Excel spreadsheet with perhaps only 300 cells in use and a couple of Paint files. Do I need the latest and greatest hardware to do this? Of course I bloody don't!
Apocalypso 5th July 2011, 09:28 Quote
Quote:
Originally Posted by Tulatin
"But can it run Crysis?"
Congratulations, you're responsible for making the 4,877,921,204th post containing this joke. Unfortunately, it's now less funny than "lolcats". You could have at least wasted your comment by partaking in Internet tradition and spewing "First!" meaninglessly into the world.

If only people contributed meaningful discussion to articles instead, perhaps wondering whether Intel influences BAPCo's habit of focusing on only a tiny sliver of the real world?

Benchmarking Crysis is a necessary tool in the hardware reviewer's arsenal, and it's relevant to this article.
[USRF]Obiwan 5th July 2011, 09:41 Quote
I agree with Pete. If you want to do basic stuff, you could manage with a PC from six years ago. If you want to play the latest games, just put in a recent mid-range video card and you can play them fine.

I, on the other hand, have just an AMD quad-core with 8GB of RAM and a GTX 460, all more than capable of fulfilling my needs: movie editing, fast unpacking of downloaded movies and series, playing games and the other regular stuff you normally do. I call that a wise investment.

Why on earth would I want a 1,200W PSU feeding quad-SLI, with an i7 990X processor, 32GB of RAM, a $400 motherboard with lights and knobs on it that I'll never see once it's inside the case, and a 160GB SSD for the price of which I could buy six 2TB drives?

So I can read my mail faster or start my PC faster? Yeah, right...

For the record, my PC is in sleep mode 24/7 and starts in three seconds with a move of the mouse.

When I was young and naive I always wanted the latest and the best. I even bought a newer revision of a motherboard. Yeah, crazy! I know better now...
Telltale Boy 5th July 2011, 10:00 Quote
Quote:
Originally Posted by Pete J
As pointed out by the article, it's unsurprising that Nvidia and AMD walked out because *puts on whiny voice* 'it doesn't represent GPGPU usage'.

What is typical PC usage? Most of the time I'm using an internet browser with about ten pages open, Outlook, three or so Word documents, a few PDFs, an Excel spreadsheet with perhaps only 300 cells in use and a couple of Paint files. Do I need the latest and greatest hardware to do this? Of course I bloody don't!

Says the guy with tri-SLI 580s. :p
DbD 5th July 2011, 10:15 Quote
This is the same benchmark that Intel has had the same say in for many years. It's one that AMD championed for much of that time, making it the standard for government purchasing orders in America. Why did they do that? Because the A64 was a great CPU: it did well in BAPCo's benchmark, and hence AMD could win some deals over Intel despite being the smaller player.

So why are they really pulling out now?

The strong suspicion is that it's because Bulldozer is weak: if those same government departments keep using BAPCo's benchmark, AMD will sell nothing. Hence it's not really about the breakdown of the consortium, or GPU compute, or any of that; it's about selling Bulldozer. AMD need to minimise the use of BAPCo's benchmark in competitive comparisons of Xeon/SB vs Bulldozer. It's all smoke, mirrors and marketing.
SexyHyde 5th July 2011, 10:36 Quote
Quote:
Originally Posted by DbD
This is the same benchmark that Intel has had the same say in for many years. It's one that AMD championed for much of that time, making it the standard for government purchasing orders in America. Why did they do that? Because the A64 was a great CPU: it did well in BAPCo's benchmark, and hence AMD could win some deals over Intel despite being the smaller player.

So why are they really pulling out now?

The strong suspicion is that it's because Bulldozer is weak: if those same government departments keep using BAPCo's benchmark, AMD will sell nothing. Hence it's not really about the breakdown of the consortium, or GPU compute, or any of that; it's about selling Bulldozer. AMD need to minimise the use of BAPCo's benchmark in competitive comparisons of Xeon/SB vs Bulldozer. It's all smoke, mirrors and marketing.

And Nvidia and VIA walked out, just to show AMD some support? Bottom line is that tech is changing with more cores/threads, some software does run better on GPGPU (facts), and benchmarks should reflect that. Why did AMD champion it before? Because they had better CPUs? Yes, but everything ran on the CPU back then.
Pete J 5th July 2011, 10:46 Quote
Quote:
Originally Posted by Telltale Boy
Says the guy with tri-SLI 580s. :p
Notepad is very hard to run, I'll have you know!

In all seriousness though, if I didn't need to do CAD, I could probably still get away with using the first PC I ever got, 14 years ago. Ah, the good old days of Windows 95.
r3loaded 5th July 2011, 10:51 Quote
Here's an idea - how about task-specific benchmarks, with their own individual scores? Interested in video editing? Run a test of VideoMark. Are you running complex financial calculations in Excel or Matlab? Run CalcMark. Do a lot of editing in Photoshop/Lightroom? PhotoMark will help you decide. You get the idea. :)
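For what it's worth, r3loaded's idea maps naturally onto how composite suites like SYSmark already compute scores internally: time each task on the machine under test and normalise against a fixed calibration machine, but report the per-task numbers instead of collapsing them into one figure. A minimal sketch in Python; the benchmark names and reference timings below are invented for illustration, not real products:

```python
# Hypothetical per-task reference timings (seconds), measured once on a
# fixed "calibration" machine. All names and numbers here are made up.
REFERENCE_SECONDS = {
    "PhotoMark": 120.0,   # batch photo edits
    "CalcMark": 45.0,     # large spreadsheet recalculation
    "VideoMark": 300.0,   # short video encode
}

def task_score(task, measured_seconds):
    """A score of 100 means parity with the calibration machine;
    higher means faster, scaling linearly with speed."""
    return 100.0 * REFERENCE_SECONDS[task] / measured_seconds

# A machine that recalculates the spreadsheet in 30s instead of 45s:
print(task_score("CalcMark", 30.0))  # -> 150.0
```

Keeping the scores separate is the point: a buyer who only cares about Photoshop can ignore the encode result entirely rather than have it averaged away.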
Autti 5th July 2011, 10:57 Quote
Why didn't Bit-tech mention the fact that if you change VIA's CPU ID (which you can) to Intel, it gets a higher score? (Not in SYSmark 2012, but it was occurring, as it did with PCMark 2005.) Then it becomes blatantly clear why companies want to leave.

Intel cheat. Flat out. Now, I've only ever bought Intel CPUs, so fanboi accusations won't work, but it's just a fact. Have a look at the register and compiler optimizations for Intel in older versions of SuperPi as well.
DbD 5th July 2011, 10:57 Quote
Quote:
Originally Posted by SexyHyde
And Nvidia and VIA walked out, just to show AMD some support? Bottom line is that tech is changing with more cores/threads, some software does run better on GPGPU (facts), and benchmarks should reflect that. Why did AMD champion it before? Because they had better CPUs? Yes, but everything ran on the CPU back then.

Bulldozer doesn't have a GPU.
bob_lewis 5th July 2011, 11:02 Quote
Good article, thanks BT.
wuyanxu 5th July 2011, 11:16 Quote
SYSmark has always been a CPU benchmark; if you want GPGPU, use something else.

Call me cynical, but I thought the reason AMD walked out was that their new processor was unable to compete.

Then there's the matter of real-world benchmarks. If simple tasks are all you'll do, why even buy the latest CPU or GPU based on benchmarks? How about just buying a tablet? Fast-enough computing doesn't need benchmarks; all it needs is simple user feedback of "can it handle my Facebook page".

In summary, I think people focus on benchmarks way too much. Call me an Apple fanboy, but I value user experience more than the underlying hardware; a fast-enough system is the same as the fastest system when the software is the same. Also, benchmarks can only tell you A is faster than B at task C; they can never tell, and have never told, anyone that A is always better than B. So what's all the commotion about?
SexyHyde 5th July 2011, 11:22 Quote
Quote:
Originally Posted by DbD
Quote:
Originally Posted by SexyHyde
And Nvidia and VIA walked out, just to show AMD some support? Bottom line is that tech is changing with more cores/threads, some software does run better on GPGPU (facts), and benchmarks should reflect that. Why did AMD champion it before? Because they had better CPUs? Yes, but everything ran on the CPU back then.

Bulldozer doesn't have a GPU.

Never said it did. And wouldn't that mean that if the score did include GPGPU, Bulldozer would score less? AMD have a range of CPUs; including GPGPU would give a more rounded performance benchmark, while excluding it gives a less honest result.
Xir 5th July 2011, 11:34 Quote
Quote:
"When was the last time you completed a 35,000 row spreadsheet?"
Yesterday. (No, that's a lie; it was only 20,000 rows, representing three months' data.)
Quote:
a knowledge worker who mainly spends their time processing email, creating spreadsheets, presentations and other similar documents, as well as web browsing, social networking and viewing YouTube videos
Web browsing, social networking and YouTube are either barely allowed (web browsing) or blocked (social networking and YouTube) ;)
Quote:
optical character recognition (OCR) and file compression activities − things an average user will rarely if ever do.
Very true. :D
Quote:
After all, many businesses are still using Internet Explorer 6, let alone version 8 or 9, and you could also argue that not many people are using GPGPU acceleration on a regular basis yet
Yup, updating on a business scale is a giant pain.
Quote:
The same applies to spreadsheets: how many people run massive calculations on a daily basis?
Well it IS called a business benchmark, not a Joe-average-looking-for-lolcats-videos-benchmark ;)
...
Anyway, an Office and email productivity benchmark would be right for businesses (okay, maybe incorporate SAP) :D
For "average-user Joe" we'd need a different benchmark, I agree.
Tattysnuc 5th July 2011, 12:37 Quote
35k rows in Excel is not unusual. How else do you validate millions of rows of data when building databases or reports? It's a valid measure for those of us who use massive amounts of data in our day jobs.

Benchmarks should be selected based on their appropriateness to you. A portable computer that'll play Crysis may well not yield the battery life that work users crave. It's simply a metric to guide users. Manufacturers have always picked and chosen the grounds on which to pitch their marketing fights, and this is no exception.

If the perfect piece of hardware that does EVERYTHING fastest had been invented, we wouldn't even be discussing this. Until then, we have to learn to look for trustworthy sources that bench using applications that are relevant to us. That's why I have a suite of shortcuts to different sites to read their opinions/results.
Saivert 5th July 2011, 12:48 Quote
Okay, who is this mythical "average user" you guys are constantly talking about?

Also, for a work PC I've got an AMD quad-core with 4GB RAM and a GeForce 210 card powering dual 1080p monitors. I mostly do customer support and registration, so I need to use Excel a lot, plus Outlook, several web browsers, PuTTY and other small tools here and there. I also run VMware to run some older operating systems in case a customer runs older software.

All except VMware would probably fly on an older single-core computer with only 1GB RAM.
But this ensures I never have to wait, ever, on anything running locally.

I'm not sure how SYSmark gauges this type of workflow. You would need a benchmark framework that let you set up your own tasks to run.
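The core of a framework like Saivert describes needn't be complicated: register named task callables, run each several times, and keep the best timing. A rough Python sketch; the two toy tasks are made-up stand-ins (a real harness would script Excel, Outlook and so on):

```python
import time

def run_benchmark(tasks, repeats=5):
    """Time each named task callable and keep the best of `repeats` runs.
    Taking the minimum is the usual convention, as it is the timing
    least polluted by background activity on the machine."""
    results = {}
    for name, task in tasks.items():
        timings = []
        for _ in range(repeats):
            start = time.perf_counter()
            task()
            timings.append(time.perf_counter() - start)
        results[name] = min(timings)
    return results

# Toy stand-ins for real workflow steps
tasks = {
    "recalc": lambda: sum(i * i for i in range(200_000)),
    "text-churn": lambda: "".join(str(i) for i in range(50_000)),
}

for name, seconds in run_benchmark(tasks).items():
    print(f"{name}: {seconds * 1000:.2f} ms")
```

Because the user supplies the task dictionary, the "benchmark" measures exactly the workflow that person cares about, which is the whole complaint about one-size-fits-all suites.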
tad2008 5th July 2011, 12:58 Quote
Quote:
Originally Posted by r3loaded
Here's an idea - how about task-specific benchmarks, with their own individual scores? Interested in video editing? Run a test of VideoMark. Are you running complex financial calculations in Excel or Matlab? Run CalcMark. Do a lot of editing in Photoshop/Lightroom? PhotoMark will help you decide. You get the idea. :)

I think that, in essence, this simply provides a viable solution to something which, to be honest, is not as big a deal as BAPCo would probably have you believe.

As for serial processing, just how outdated is that? Yes, there will still be people using single-core CPUs, but at least have a benchmark capable of serving those who use more than a single core. I can't help but wonder whether BAPCo have actually been updating SYSmark or just giving it a new version number...

It's a darn good thing we have the guys... err... are there any girls?... at Bit-Tech to keep us informed and educated, and it will be good to see what you come up with for your own benchmarking software.

+1 for Bit-Tech
alpaca 5th July 2011, 13:11 Quote
What about a website where you could go, select the programs you use, and get feedback on how a certain CPU/GPU/computer setup would fare? That way everybody could evaluate a computer or component against their specific needs.
b5k 5th July 2011, 13:48 Quote
Calling Crysis a benchmark is like saying a broken piece of code is a benchmark just because it stresses a system.
Bauul 5th July 2011, 14:33 Quote
I personally think concentrating on CPU benchmarks is a mistake in itself. In my experience at work, a decent amount of fast RAM and a speedy hard drive are by far and away the most important factors in getting a computer to run quickly.

We do quite typical office jobs (any high-end application is streamed via Citrix), and what I hear my staff complain about most are things like:

1) My computer takes too long to boot up
2) Opening *insert MS Office app* takes too long
3) My computer slows down when I have too much open

Beyond something quite basic, the CPU has pretty much no effect on any of the above.
fingerbob69 5th July 2011, 14:57 Quote
SemiAccurate's take on this from some two weeks ago:

http://semiaccurate.com/2011/06/20/nvidia-amd-and-via-quit-bapco-over-sysmark-2012/

What they say, and I think what's implied here, is that the benchmark was being skewed in a way that made Intel always look good to the detriment of all others. This seems to have particularly rankled AMD, given that the GPU part of their APUs way outperforms Intel's equivalent, with the benchmark singularly failing to show this.
Lazy_Amp 5th July 2011, 15:31 Quote
It was implied at the end, but there was also no direct mention of the fact that Intel had complete say in what went into the benchmark, mainly because, as the SemiAccurate article points out, the committee would deny any motions not brought by Intel. Even if the results of the 2012 benchmark are completely accurate (which, given the record of favoring Intel CPU IDs and ignoring all GPGPU calculations, is hard to justify), the fact that you have a system that accepts input from only one source is suspicion enough to justify all the other chip designers pulling out.