bit-gamer.net

The Elder Scrolls IV: Oblivion

Comments 51 to 69 of 69

Tim S 3rd April 2006, 23:47 Quote
Quote:
Originally Posted by dehx
A lot of people don't realize this, but the X1900XTX is the exact same GPU that runs inside the X360; in fact, the X360 has an X1900XT, not the overclocked XTX.
The Xenos GPU (R500, the one inside the Xbox 360) uses a unified shader architecture with an on-die EDRAM module to give 4xAA for free. The Radeon X1900 series is based on a traditional shader architecture, meaning that there is no vertex texturing or sharing of resources.

There are dedicated vertex shaders, and dedicated pixel shaders with one texture unit for every three pixel shader units inside R580. R500 doesn't have any of that - it's a completely unified shader architecture, meaning that you can share ALUs between vertex shader instructions and pixel shader instructions. ;)
MastershakeJB 4th April 2006, 00:30 Quote
oops. I just read up on what separates DX10 from 9, so I guess my 79 won't be compatible :/ .
Tim S 4th April 2006, 00:33 Quote
DirectX is backwards compatible, but if you want to make use of newer features that come with DX10, you'll need a DX10-capable video card. G80 and R600 are coming later this year.
Kaze22 4th April 2006, 01:15 Quote
Quote:
Originally Posted by Shadowed_fury
Guys chill out, its just a game ;)
The thing is I don't even own a Xbox 360, I have the PC version of Oblivion. I just can't stand people talking trash without knowing what the hell they're talking about.
Quote:
oops. I just read up on what separates DX10 from 9, so I guess my 79 won't be compatible :/

LOL Owned
specofdust 4th April 2006, 03:10 Quote
OK, I'm quite sure I'll be in the minority here, but do at least some other people think that HDR was quite considerably over-used in this game? I've left it on so far, because I think the majority of its use is very enjoyable, and it really brings that "daylight" feel to much of the outdoor stuff, as well as some lovely lighting indoors. However, at times, when I'm, say, running around ruins where the stone is practically blinding me (does stone, even white stone, ever reflect that much? I'd say no), or watching a very bright, sun-shining wolf I've just killed roll down a hill, I think, "they're really taking the piss with this". It's a shame there's not a slider for various levels of intensity, because I'd be much happier if HDR was only used about half as much.

Other than that, it's too small and too simple, but it looks fantastic, the sounds are fantastic, and the combat is enjoyable. Hopefully mods can take care of the smallness and the simplicities of the gameplay. Oh, and the guards are entirely too polite; damnit, if they're not telling me to "get lost" (in ye olde Tamrialic talke) then I ain't happy :p

/moan
Tulatin 4th April 2006, 10:23 Quote
I'm loving this game so far, but I'm finding that when you have the right weapons (elven claymore) and gear (full body mithril, baby), it's bloody easy. Everything seems to be done just right, except for how touchy the guards were: for example, in the early game they notice you take one gold piece while you're in the basement of a monster house. Now (level 11), I ransack the whole place and walk out, with guards buzzing about, with nary a word. Weird. On a plus note, I propose a contest for boredom's sake: what's the biggest mess you can make in the game? I've already made a big enough one in my hovel that it pushes me across the room (no joke; I have about 1,000 pounds of crap).
Mister_Tad 4th April 2006, 15:43 Quote
I've just been screwing about looting places for some time now, and have finally accumulated over 100k gold.

No idea what I'm going to do with all of it. I bought a house to throw all the stuff in that's too valuable to sell and no good to me (i.e. worth a lot more than shopkeepers have). (Does anyone know if your houses can get robbed? I've not been there in days and would hate to go back and find all my loot nicked!) Bound to find someone with loads of dosh sooner or later.

If you're swatting everything down like flies with a couple of swipes, just increase the difficulty slider a bit.
Tulatin 4th April 2006, 18:19 Quote
I floored the slider... no luck. The only things that are killing me are will-o-wisps. Btw, I just passed 10k, and bought a house in Bruma (sp?). Now it's empty. I've got to bring my adoring fan to it though, and lock him in the cellar. He's still at my waterfront shack :). (Yeah, I'm grand champion.)
bloodcar 4th April 2006, 19:11 Quote
I left my adoring fan in the middle of a river. I wonder if he drowned... I had actually forgotten about him.

100k gold will go very fast when you're buying up houses and buying all the furniture for them.

Word to you fellows who have the Gray Fox helmet... don't use a bound helmet while wearing it!!!! I went up to one of those rune stones while I was wearing it and activated the stone. Next thing I know, my fame was at 0 and my infamy was at over 100, and the second I came close to a city (even after I had taken the helmets off) I was chased by guards. Fricken sucked. I didn't even get to pay off my fine, even though I had more than enough money to pay the 7,500 gold fine, and I ended up spending 150 days in jail for that one. :'(

Now I'm trying to raise my fame back up so I can buy that damned house in Anvil... you have to be famous to buy it.

Some quests, though, seem to be undoable if you don't complete them right away or if you screw up. If you need some help with a quest and it's messed up (on the 360), I'll give you a hand if you want. Just send a message to Major Suckage and I'll help you the next time I'm on.
Nottheking 5th April 2006, 19:57 Quote
Well, I WAS about to post a rather huge reply I had spent half an hour working on, but I had forgotten about the odd quirk of vBulletin, which I'm rather incensed still exists on the Internet; it logged me out for "inactivity" after 5 minutes, and promptly dumped my post when I tried to submit it. Had I remembered this, I would've saved a copy of it, which I normally do. Curses.

With luck, I may come back tomorrow and try again, when I'm not feeling so angry.
Nottheking 5th April 2006, 20:38 Quote
Well, I decided to go back and write it again. It's different from before, obviously, and incorporates a few more replies.

At any rate, I've been playing Oblivion quite extensively, logging 135 hours of play time since I got it on the 21st, without a pre-order. (!) Yes, that works out to around 9 hours a day. I've been spending most of my free time on it, and have been largely absent from the Internet in that time, even.

I don’t know what method of testing was used for the article, but I find it a little pessimistic; things are running great on my PC, which is at 1280x1024, 6x AA and 16x AF (no HDR, obviously), with the settings either maxed or set beyond the max through the INI file. For instance, I have EVERYTHING set to reflect in the water, which normally only reflects the ground, sky, and city walls/towers. I also have the “blood decal” setting at many times the normal level (it applies to decal duration/count; the max is 10 seconds and 10 decals; I have it at 300 seconds and 30 decals). Oh, and I’m using an X800XT.
Quote:
Originally Posted by whisperwolf
What's the graphics comparison like from the Xbox to high-end PCs? I'm trying to work out if I'll be better off buying the Xbox, or spending silly money upgrading my whole PC, because at the moment I have a feeling it will destroy my current machine.
Well, aside from the oddity of AA and HDR on the PC (which was an intentional “cripple” placed there by BethSoft, not a technical limitation), you won’t find a way to be dissatisfied with either, unless you’re dissatisfied with the game itself, that is.

Either path is viable, and if graphics are your chief concern, one should stick with whatever platform they’ve got best set up.
Quote:
Originally Posted by automagsrock
So when I play this my character moves at a snails pace compared to everything else. WTF??
Ignore what the others said; unplug your joystick or gamepad.
Quote:
Originally Posted by EK-MDi
If you're able to play the game at high resolutions, 1280x1024 and above, you don't really need Anti-Aliasing so much. So deciding between either AA or HDR, is an easy choice. You can just choose HDR, which will have the most useful benefits.
Not really; it’s a matter of taste. For me, I want 6x AA, dangit! The ONLY exception is if I’m playing a game at 320x200, 8-bit palettized. Old VGA goodness.

Aside from that, I see a clear need to have my AA, and at 6x, not 4x or 2x, even at 1280x1024. I can live more easily without HDR.
Quote:
Originally Posted by MastershakeJB
Also, I have a question. When I enable HDR from within the game, and AA from my control panel as opposed to in-game, it actually works, or looks like it does. So why is it that I keep hearing that this is "impossible" when it's very much running on my comp? Albeit it acts... weird when I have it enabled. It acts similar to skipping a frame, like it does when it's caching off my HDD, only more so, but it's not all that noticeable, as it happens only slightly more frequently than the HDD caching skip, which is only every couple of minutes or so. So my question is: am I doing damage by leaving AA running with HDR? Is HDR actually running with AA, or am I just imagining that it is and it's actually bloom? I'm lost; I don't understand why it's "impossible" and yet I'm doing it.
It could well be that it is working; as far as is known, the limitation is only artificial. BethSoft placed it in there, even though they effectively acknowledged that it was 100% possible to use both together in their game.
Quote:
Originally Posted by dehx
A lot of people don't realize this, but the X1900XTX is the exact same GPU that runs inside the X360; in fact, the X360 has an X1900XT, not the overclocked XTX.
Sorry, that is incorrect. That assumption is actually a lie that was first circulated by some Xbox fanboy. The two don’t even have the same CORE CLOCK SPEEDS, for cryin’ out loud! I consider this article at Beyond3D to be REQUIRED reading for any discussion of the GPU in the Xbox 360.

A few of the chief differences between the Xbox 360’s core and any modern PC graphics card:
  • The unified shader architecture has apparently made it difficult for some to tell the exact shading power. However, note that the 48 “shaders” the R500 has do NOT correspond to the pixel shader units (PSUs) in a modern graphics chip; rather, each of the shaders in the Xenos consists of only one ALU, instead of the two in each pixel shader; PC vertex shaders likewise have 1 ALU each. As a result, the Xenos has “only” a total of 24 billion shader operations per second, as opposed to, say, 25 billion per second for the X1800XT, 65 billion per second for the X1900XT, or 67.6 billion per second for the X1900XTX.
  • The raster pipeline design was made for low heat production, not immense brute strength. In effect, there are only FOUR ROPs in the Xenos, compared to 16 on almost any high-end video card. However, they can apparently blend four color samples into a single write, though they only actually perform one write at a time. Also, they can double their fillrate when handling only Z operations, though in that case you forgo AA anyway, which has little point for a Z or stencil pass and is only used for color passes.
  • One thing that the Xenos actually DOES share with the PC is the TMU structure; it is independent of the rest, as with an X1k card, and it has four quads with four TMUs apiece.
  • The real killer for the chip’s performance is the fact that it only has two memory controllers, giving it the 128-bit memory interface typically used for mid-range or low-end cards, rather than the 256-bit (4-controller) interface used on any high-end video card. That means that in spite of the 1.4GHz GDDR3, its bandwidth is much lower: like that of a plain Radeon X800, which has 700MHz RAM, or of the X1600XT and 7600GT, both of which have the same memory bandwidth as the Xenos. This bandwidth is also shared with the CPU. The only consolation to be found here is that the EDRAM tile buffer means that overdraw does not increase bandwidth usage; the framebuffer writes come at the exact same rate no matter what.
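The figures in the list above are straightforward products of unit counts and clock speeds. Here is a back-of-the-envelope sketch of that arithmetic; note that the unit counts and core clocks not stated in the post (16 pixel shaders on the X1800XT, 8 vertex ALUs, and the 500/625/650MHz clocks) are the commonly cited figures, filled in here only to show how the quoted numbers arise, and counting one operation per ALU per clock glosses over co-issue and vec4/scalar details:

```python
def shader_ops_per_sec(alus, clock_hz):
    """Nominal peak shader throughput: one operation per ALU per clock."""
    return alus * clock_hz

def mem_bandwidth_bytes(bus_bits, effective_clock_hz):
    """Peak memory bandwidth: bus width in bytes times effective data rate."""
    return (bus_bits // 8) * effective_clock_hz

# Xenos: 48 unified shaders, one ALU each, 500MHz core
xenos    = shader_ops_per_sec(48, 500e6)          # 24 billion ops/s
# X1800XT: 16 pixel shaders x 2 ALUs + 8 vertex ALUs, 625MHz
x1800xt  = shader_ops_per_sec(16 * 2 + 8, 625e6)  # 25 billion ops/s
# X1900XT: 48 pixel shaders x 2 ALUs + 8 vertex ALUs, 625MHz
x1900xt  = shader_ops_per_sec(48 * 2 + 8, 625e6)  # 65 billion ops/s
# X1900XTX: same unit counts at 650MHz
x1900xtx = shader_ops_per_sec(48 * 2 + 8, 650e6)  # 67.6 billion ops/s

# Xenos: 128-bit bus at 1.4GHz effective GDDR3 comes to 22.4GB/s,
# identical to a 256-bit Radeon X800 with 700MHz effective RAM
xenos_bw = mem_bandwidth_bytes(128, 1.4e9)
x800_bw  = mem_bandwidth_bytes(256, 700e6)
```

The bandwidth comparison makes the "mid-range interface" point concrete: halving the bus width exactly cancels out doubling the data rate.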
Believe it or not, I actually don’t have something against the Xbox 360; it’s an amazing, and powerful, gaming machine. However, I have little tolerance for the flurry of misleading, and outright incorrect, information that floats about. I do appreciate it when people take the effort to get things correct.
Quote:
Originally Posted by Kaze22
Wow, you figured out that the PC has the console plus TGM god mode. You're so cool, you finished the first gate in 57 seconds; somebody should give you a prize. News flash: nobody cares, and the lack of the console and TGM god mode on Xbox 360 is actually a good thing, because it's always more fun to play a game without cheating. Who the hell wants to complete an Oblivion Gate in 57 seconds anyway? Might as well just skip the whole game and watch the ending. Damn noob.
Another thing: as much as I don't like to defend the Xbox 360, I'm gonna have to tell you to GET YOUR FACTS STRAIGHT before you start spreading rubbish.
The X1900XT or XTX or XXXXTTTXXXX is a DirectX 9 card with an overclocked core; the Xbox 360 uses the Xenos, a modified DX10 card with hardware AA and a unified (key word "unified") shader architecture that is not found in ANY PC video card to date. Although the Xenos GPU has a lower clock speed than the X1900, its architecture is more advanced than any DX9 card on the market.
That’s quite a bit excessive, don’t you think? And the console is much more useful than just cheating. That’s not what it was primarily used for in Morrowind, and it’s not much of what it’s been used for in Oblivion. In Morrowind, it was used for TCL (toggle collision), for when the clipping got the player stuck; for Xbox players, the only option was to reload. Perhaps the most popular command, though, was RA (reset actors), useful for when NPCs wound up clogging the doorways (particularly the door to the guild guide in the Balmora Mages’ Guild), or when they managed to fall through the floor.

In Oblivion, I use it for the “save” command; simply input “save *****” to make a named save. Also, there is the “TDT” (toggle debug text) command, which lets you monitor performance with more than a simple FRAPS framerate counter. (And anyway, running FRAPS will drain your performance.)

As for DirectX 10, it would be mistaken to assume that the Xenos is DirectX 10, as DX10 isn’t even finalized yet. Rather, it’s kinda like a DirectX 9.5. Many of the same tweaks seem to be present in the R5xx desktop cores as well.
Quote:
Originally Posted by MastershakeJB
Well, really, the 360 core resembles an X1900 very much, only it has ten megs of SUPER fast RAM sitting right on the die, which basically gives it some eye candy for free, be that loads of anti-aliasing or anisotropic filtering.
Incorrect again. The 10MB of EDRAM really isn’t “fast”; the 256GB/sec figure was fictitious, and referred to the potential EFFECTIVE write speed of the ROPs. And it doesn’t mean that AA is “free.” It only eliminates the problem of increased memory bandwidth usage; it does nothing about the fact that AA increases the load through more shader processing and more textures being applied. The Xbox 360, from what I’ve seen, also doesn’t do anisotropic filtering whatsoever. It does use trilinear, but not anisotropic, as the latter tends to increase video RAM usage, and RAM is something that is still actually in short supply, even though the situation is far improved over the original Xbox.
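To make the "eliminates the bandwidth cost, not the shading cost" distinction concrete, here is a rough model of color-buffer write traffic to external memory. This is my own illustration, not something from the thread: real traffic also includes Z, blending reads, and overdraw, so treat the numbers as order-of-magnitude only.

```python
def framebuffer_write_traffic(width, height, fps, bytes_per_sample,
                              aa_samples, on_die_resolve=False):
    """Rough color-buffer write traffic (bytes/sec) to external memory.

    With an on-die EDRAM buffer, the multisampled surface stays on-die
    and only the resolved image (1 sample/pixel) crosses the external bus.
    """
    samples = 1 if on_die_resolve else aa_samples
    return width * height * fps * bytes_per_sample * samples

# 720p at 60fps, 32-bit color, 4xAA:
pc_style = framebuffer_write_traffic(1280, 720, 60, 4, 4)         # ~885 MB/s
edram    = framebuffer_write_traffic(1280, 720, 60, 4, 4, True)   # ~221 MB/s
```

The external write traffic drops by the AA factor, but every extra sample still has to be rasterized and (at polygon edges) shaded, which is the load the EDRAM does nothing about.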
Kaze22 5th April 2006, 21:21 Quote
Quote:
As for DirectX 10, it would be mistaken to assume that the Xenos is DirectX 10, as DX10 isn’t even finalized yet. Rather, it’s kinda like a DirectX 9.5. Many of the same tweaks seem to be present in the R5xx desktop cores as well.

First of all, when someone starts talking trash about beating an Oblivion Gate in 57 seconds, it's obvious what they're using the console for. And second, I didn't say that the Xenos is a DX10 card; I said it's a modified DX10 card, which means it's in between, as ATI would not want to stifle their next-gen card sales by putting it in a gaming console. But with that said, the Xenos architecture is far more efficient than even the fastest video card on the market today (developer cards excluded). Don't get too hung up on clock speeds and tech specs; the architecture is what drives games. The fastest PC GPU to date must go through several hardware/software intermediates to process a game, so without MS DX10 support, even the highest clock speeds will yield limited results.
But within the next year, with the release of Vista plus the G80 and ATI R600, we'll see PC gaming back on top.
Nottheking 6th April 2006, 15:35 Quote
Quote:
Originally Posted by Kaze22
First of all, when someone starts talking trash about beating an Oblivion Gate in 57 seconds, it's obvious what they're using the console for. And second, I didn't say that the Xenos is a DX10 card; I said it's a modified DX10 card, which means it's in between, as ATI would not want to stifle their next-gen card sales by putting it in a gaming console. But with that said, the Xenos architecture is far more efficient than even the fastest video card on the market today (developer cards excluded). Don't get too hung up on clock speeds and tech specs; the architecture is what drives games. The fastest PC GPU to date must go through several hardware/software intermediates to process a game, so without MS DX10 support, even the highest clock speeds will yield limited results.
But within the next year, with the release of Vista plus the G80 and ATI R600, we'll see PC gaming back on top.
Even in the case that they ARE using the console for nothing but cheats (I note that in Morrowind, oddly enough, the capabilities of the console paled compared to clever use of alchemy), I don't feel that you gave the propper rebuttal.

I believe you mean "modified DirectX 9.0c GPU," not "modified DirectX 10 GPU." The API used for the Xbox 360 does happen to be a modified version of DirectX 9.0c, but it's still called that. It does not claim SM 4.0 capability; indeed, it claims precisely SM 3.0 capabilities, and nothing more, though it does, in fact, seem to have a few differences.

As for efficiency, you are correct; the Xenos has the most efficient architecture of any commonly-produced GPU today. However, I think you might be confusing "efficiency" with "power." In the computing world, the two are normally diametrically opposed; the least efficient graphics architectures do happen to be those used in ATi Radeon and nVidia GeForce GPUs, but those two also happen to be the most powerful. At the opposite end is the architecture found in GPUs such as PowerVR chips; they have little processing waste and a very low thermal envelope, but as a result, their power simply cannot come close to what ATi or nVidia's chips can do. This is one of the unique elements of the R500 Xenos; it may not be quite as efficient as a PowerVR, and in strength it does get the crap beaten out of it by any high-end PC video card out today (let alone 2- or 4-GPU setups), but it manages to produce an admirable amount of power (roughly somewhere between ATi's X1600XT and X1800XL, depending on the "balance" of the application in question) while being very efficient.

My comments on clock speed are simply because it's one of the basic-level factors of performance; the number of particular graphics units (be they TMUs, ROPs, or shader ALUs) is pointless for determining performance unless you also know the clock speed. You multiply both numbers to get the maximum fill-rate, which is where you finally have a USEFUL statistic. You combine all such similar useful factors together to get the performance of the GPU.
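The units-times-clock multiplication described above, sketched out with the ROP counts from earlier in the thread (the 500MHz and 650MHz core clocks are the commonly cited figures for the Xenos and X1900XTX, not stated in the posts themselves):

```python
def fillrate_per_sec(units, clock_hz):
    """Peak fill rate: each unit retires one pixel (or texel) per clock."""
    return units * clock_hz

# Pixel (ROP) fill rate:
xenos_fill    = fillrate_per_sec(4, 500e6)    # 2.0 Gpixels/s (4 ROPs @ 500MHz)
x1900xtx_fill = fillrate_per_sec(16, 650e6)   # 10.4 Gpixels/s (16 ROPs @ 650MHz)
```

Neither raw number alone would have shown the 5x gap: the Xenos has a quarter of the ROPs AND a lower clock, and only the product of the two is a comparable statistic.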

As for "architecture is what drives the games," that is true, but note that the Xenos, in overall architecture, is actually little different from modern video cards; it still goes through the exact same process. In fact, this is one of the keys that Microsoft hopes will make the console successful: PC graphics programmers can go to work on the Xbox 360 without any real additional learning (both the Xenos and modern video cards use the same API, after all). The only real differences are in the arrangement: chiefly, the GPU is tile-based, which, as I noted, helps to eliminate the excess bandwidth usage caused by overdraw (and makes sure that framerates, while not necessarily higher, are more consistent, as fluctuation in the usage of VRAM bandwidth is perhaps the number one cause of inconsistent framerates), and the fact that it slightly differs from the X1k's shader architecture by having only one pool of ALUs, rather than separate pools for pixel and vertex shader ALUs.

While a console will never actually best the most powerful PC at the console's release (though the Nintendo 64 did actually come close), the Xbox 360 makes it readily evident what a console's real main strength is: unlike PCs, it's the perfect place to experiment with new types of technology. They won't yield more power for the console, but if they prove viable there, they can be transferred over to the PC as well. ATi seems to have gotten this down right; it should be noted that ATi's R300 GPU seemed to keep a lot of features from the "Flipper" used in the GameCube, and it also appears that ATi, which had apparently been finished with the R500 since early 2005, took some things from it and applied them to the R5xx desktop GPUs.
WilHarris 6th April 2006, 18:17 Quote
Quote:
Originally Posted by Nottheking
<snip>

Gold.
Kaze22 6th April 2006, 18:40 Quote
Quote:
propper rebuttal
LOL, that's priceless. Nah, I think I gave him just enough rebuttal. Anyway, all proper rebuttal aside, I think you make some valid points. Since we're having a nice little discussion on video cards, I'm just gonna add my two cents on the whole PC vs console issue.
Having played both console and PC games for years, I've come to realize one thing: how a game runs has very little to do with a video card's raw power, but rather the harmony of code and hardware.
As always, PCs on paper have much more power, both in terms of CPU and GPU, when compared to their console counterparts, but console games almost always run smoother than their PC counterparts. Why is it that a video card with twice the clock speed can barely manage to perform at the same rate as its measly console counterpart?
The reason, as another member of the forum pointed out, is that consoles have more direct hardware-to-software interaction, with fewer driver and OS variables; this is mainly due to their singular design.
The best analogy would be the creation of a sports car: in a professional race, a car tailored to one specific driver will almost always yield better performance than simply an all-purpose fast car.
The PC video card is very much like that all-purpose fast car, while the console tends to be the customized, specially tuned car for that one particular driver. Even if the all-purpose fast car has more horsepower, its performance is stifled by its poor driver-vehicle interaction, while the specially tuned vehicle suits the driver's every need and can outperform the faster vehicle in a real race.
I don't know if you can understand my analogy, but what it boils down to is how the code interacts with the hardware, and consoles always win in that department. Which is why we see an ATI X1600, although on paper similar in power to the Xenos, in practical runs not even reaching half of the Xenos's output.
When code is written to function on one unified piece of hardware, it will always run smoother than the same code running on an all-purpose machine.
Nottheking 7th April 2006, 02:28 Quote
Quote:
Originally Posted by Kaze22
LOL, that's priceless. Nah, I think I gave him just enough rebuttal.
Whoops. Didn't mean to put two "p"s in there. (writing so much like that, I make a lot of typos, and occasionally, I let one or two slip by. I do correct the vast majority of them)
Quote:
Originally Posted by Kaze22
Anyway, all proper rebuttal aside, I think you make some valid points. Since we're having a nice little discussion on video cards, I'm just gonna add my two cents on the whole PC vs console issue.
Having played both console and PC games for years, I've come to realize one thing: how a game runs has very little to do with a video card's raw power, but rather the harmony of code and hardware.
As always, PCs on paper have much more power, both in terms of CPU and GPU, when compared to their console counterparts, but console games almost always run smoother than their PC counterparts. Why is it that a video card with twice the clock speed can barely manage to perform at the same rate as its measly console counterpart?
Well, while you are correct on the "harmony" part, you are also neglecting another factor: in almost all cases (Oblivion included), one of the keys to a game running so well on the console is settings that are actually reduced. GameSpot, for instance, noted that the Xbox 360 version actually cuts back a little on foliage (which has been shown to be one of the chief performance hogs). Likewise, on the original Xbox, many ports, like Half-Life 2 and Doom 3, were significantly cut down from their PC versions; both appeared to actually run at 320x240 (which is all you can get if you use a composite video cable anyway), as well as having many things reduced.

Also, I do know that the "raw power" of a piece of hardware is far from the final determining factor in performance; it's only one of three types of factor. The other two are the capabilities and design of the hardware, and the design of the application being run on it. Hence, for instance, the nVidia GeForce FX 59xx series was originally competitive with the Radeon 9800 series, as texturing power was what cards were judged on, and what was emphasized in popular benchmarks like 3DMark01 and even 3DMark03; the former had apparently zero pixel shader usage whatsoever. However, later tests, such as 3DMark05 and 3DMark06, placed increasingly large loads on the pixel shader capacity of the graphics cards; at these, the GeForce FX cards, which had half the number of pixel shaders of comparable Radeon cards (and reportedly less-developed shaders overall), lagged.
Quote:
Originally Posted by Kaze22
The reason, as another member of the forum pointed out, is that consoles have more direct hardware-to-software interaction, with fewer driver and OS variables; this is mainly due to their singular design.
The best analogy would be the creation of a sports car: in a professional race, a car tailored to one specific driver will almost always yield better performance than simply an all-purpose fast car.
The PC video card is very much like that all-purpose fast car, while the console tends to be the customized, specially tuned car for that one particular driver. Even if the all-purpose fast car has more horsepower, its performance is stifled by its poor driver-vehicle interaction, while the specially tuned vehicle suits the driver's every need and can outperform the faster vehicle in a real race.
I don't know if you can understand my analogy, but what it boils down to is how the code interacts with the hardware, and consoles always win in that department. Which is why we see an ATI X1600, although on paper similar in power to the Xenos, in practical runs not even reaching half of the Xenos's output.
When code is written to function on one unified piece of hardware, it will always run smoother than the same code running on an all-purpose machine.
That is correct, but that is also the entire point of the creation and use of APIs. Originally, simply making an application run on multiple brands of hardware required, at the very least, a recompilation for each such platform, if not an entire rewrite of the code. This was highly problematic, for instance, for VGA adapters and sound cards.

This is why DirectX was created in the first place, so that Windows could be a much more viable platform, eliminating many barriers.
Ultimate RPG Gamer 5th March 2009, 05:27 Quote
I've played The Elder Scrolls IV: Oblivion, and I liked how it seems like the enemies are really trying to kill the real you with the 1st-person view. The only control-wise trouble, which I'm sure a lot of people had, was that sometimes, after an intense arena match, you may have accidentally switched your talk command to loot, so when you tried to talk, the guards would think you stole from the person, even if you were just looking at what they had on them. If I had recommended starting stats in mind for beginners, I'd say go for being born under the Mage and be what they call a Battlemage, which is as adept at healing and attack spells as it is at physical attacks. That way, when your weapon is destroyed, you can hit your foes with fire spells, and you can also heal, making you almost unstoppable. That's about all I have to say on this game, and I agree it's one that's essential to any game collection; in fact, I own it, if what I said about it is any indication.
Bauul 5th March 2009, 11:22 Quote
One always wonders what goes through people's minds when they resurrect a three-year-old thread just to absent-mindedly wax lyrical about a game most people haven't touched in ages.
dire_wolf 5th March 2009, 12:32 Quote
Fookin hell, it's been three years already? I vividly remember waiting excitedly for my play.com collector's edition to drop through the door, only to sell it two weeks later through sheer disappointment...

Also, this is probably the most useless thread res I've ever seen.

*SPAM*