Building a PC for Skyrim

Post » Thu Sep 02, 2010 12:47 am

Yeah, but you're getting a gaming PC for one game when there are games like The Witcher 2, BF3, Tribes Universe, PlanetSide Next, Shogun 2, Red Orchestra 2, etc.



Heh, the only one of those I care about is BF3, and it comes out around the same time as Skyrim. All is right with the world.
User avatar
Lyd
 
Posts: 3335
Joined: Sat Aug 26, 2006 2:56 pm

Post » Wed Sep 01, 2010 9:46 pm

Probably a mid-range $600-800 computer. The game is designed around the 360, so even a $500-600 computer will be fine.

Since all the games will "look the same", you'll probably be fine with an Nvidia 8800 or a 9800.

I LOL'd. Seriously, don't listen to this guy. My old PC with an 8800 GTX and a Q6600 can't even max Oblivion with decent frame rates. A $500-600 PC will NOT "be fine". My laptop has more horsepower.

As for the game "looking the same" on PC as on the 360: only time will prove how completely false that statement is.
User avatar
Alan Whiston
 
Posts: 3358
Joined: Sun May 06, 2007 4:07 pm

Post » Wed Sep 01, 2010 12:27 pm

I LOL'd. Seriously, don't listen to this guy. My old PC with an 8800 GTX and a Q6600 can't even max Oblivion with decent frame rates. A $500-600 PC will NOT "be fine". My laptop has more horsepower.

As for the game "looking the same" on PC as on the 360: only time will prove how completely false that statement is.


Oblivion is incredibly poorly optimised and scales awfully; Fallout 3 runs decently at 1440x900 with everything on max on an 8800GT.

Regardless, for a $500-600 PC you'll be getting a lot more power than an 8800GT; even low-to-mid-end offerings like the GTS 450 are approximately equal to the 9800GTX in raw power.

@Achromatis: which tells us nothing; specifications are only good for knowing the *minimum* you need. The developers will have pretty powerful testing machines, which tell us nothing either.
User avatar
Matt Gammond
 
Posts: 3410
Joined: Mon Jul 02, 2007 2:38 pm

Post » Wed Sep 01, 2010 9:20 pm

Maybe on medium settings.


Hahahahahaha

Skyrim is not a next-gen game; if anything, given that it's running on a brand-new (and probably "clean") engine, it may even run better than Oblivion. I'm just relishing the fact that so many people are going to blow money on building a new PC, only to find out their old one could have run Skyrim just fine.
User avatar
Jade Payton
 
Posts: 3417
Joined: Mon Sep 11, 2006 1:01 pm

Post » Wed Sep 01, 2010 2:34 pm

Hahahahahaha

Skyrim is not a next-gen game. I'm just relishing the fact that so many people are going to blow money on building a new PC, only to find out their old one could have run Skyrim just fine.


Just look at Fallout 3: maximum graphics on PC were far beyond what the PS3 or Xbox 360 were capable of, so the consoles are not a good benchmark for the high end of a PC - more so considering that a PC will go to settings like 16x anti-aliasing while the PS3 or Xbox 360 are unlikely to go above 2x. That matters if people are going for the best possible graphics they can afford... What I do find funny, though, is how people are looking into building those machines now, before we even know the game's minimum/recommended specs.
User avatar
ZANEY82
 
Posts: 3314
Joined: Mon Dec 18, 2006 3:10 am

Post » Wed Sep 01, 2010 4:37 pm

Anybody who uses anti-aliasing above 2x is an idiot. If you have oodles of processing capability to spare, then and only then should you max anti-aliasing. But if you are running the game at 1920x1200 (or one of its sibling resolutions), you will not need any anti-aliasing; the resolution will generally be sharp enough that you will not see any aliasing.

Anti-aliasing is such a laborious, power-intensive process that you should avoid using it whenever you can.


I have to be brutally honest: only idiots run games on "MAX GRAFICS LOL." There are so many utterly pointless graphical features that you simply do not need to turn on; they contribute very little yet eat a large chunk out of your framerate. Running the game on "high" is always good enough. Anybody who can run Oblivion on High will probably be able to run Skyrim on Medium/High, so long as the game is not incredibly unoptimized.
User avatar
Robert Jr
 
Posts: 3447
Joined: Fri Nov 23, 2007 7:49 pm

Post » Wed Sep 01, 2010 10:17 am

Anybody who uses anti-aliasing above 2x is an idiot. If you have oodles of processing capability to spare, then and only then should you max anti-aliasing. But if you are running the game at 1920x1200 (or one of its sibling resolutions), you will not need any anti-aliasing; the resolution will generally be sharp enough that you will not see any aliasing.

Anti-aliasing is such a laborious, power-intensive process that you should avoid using it whenever you can.


I have to be brutally honest: only idiots run games on "MAX GRAFICS LOL." There are so many utterly pointless graphical features that you simply do not need to turn on; they contribute very little yet eat a large chunk out of your framerate. Running the game on "high" is always good enough. Anybody who can run Oblivion on High will probably be able to run Skyrim on Medium/High, so long as the game is not incredibly unoptimized.


I actually disagree on that point: anybody who runs AA above 4x at the expense of either framerate or other settings is a bit silly.
There's a significant difference between 0x, 2x, and 4x, but after that you get sharply diminishing returns. Higher levels are nice to have, of course, but not worth dropping anything else for.
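To put rough numbers on the diminishing returns: with N coverage samples, an edge pixel can only take N+1 distinct blend levels, so each doubling of the sample count buys a smaller visible improvement. A toy Python sketch of that relationship (a simplification of how real MSAA resolves work):

    # Toy model of MSAA diminishing returns: N samples per pixel allow
    # N+1 distinct coverage levels along an edge. Each doubling of the
    # sample count shrinks the step between levels by less and less.
    for samples in (2, 4, 8, 16):
        levels = samples + 1
        step = 1.0 / samples
        print(f"{samples:>2}x: {levels:>2} edge shades, step {step:.3f}")

Going 2x to 4x halves the step size (0.5 to 0.25); 8x to 16x only shaves off another 0.0625.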
User avatar
Nicole M
 
Posts: 3501
Joined: Thu Jun 15, 2006 6:31 am

Post » Wed Sep 01, 2010 8:05 pm

I actually disagree on that point: anybody who runs AA above 4x at the expense of either framerate or other settings is a bit silly.
There's a significant difference between 0x, 2x, and 4x, but after that you get sharply diminishing returns. Higher levels are nice to have, of course, but not worth dropping anything else for.


I tend to use 4x when I have "framerate to spare".

At a high resolution I do not see any discernible difference between that and anything higher, and it's simply not worth bleeding framerate over.
User avatar
Allison C
 
Posts: 3369
Joined: Mon Dec 18, 2006 11:02 am

Post » Wed Sep 01, 2010 12:09 pm

As long as mine can run it, I'll be fine.

Look below.
User avatar
Sophie Louise Edge
 
Posts: 3461
Joined: Sat Oct 21, 2006 7:09 pm

Post » Wed Sep 01, 2010 10:56 am

I actually disagree on that point: anybody who runs AA above 4x at the expense of either framerate or other settings is a bit silly.
There's a significant difference between 0x, 2x, and 4x, but after that you get sharply diminishing returns. Higher levels are nice to have, of course, but not worth dropping anything else for.


This is exactly my experience with many games using a 1440x900 desktop monitor.
User avatar
Alisha Clarke
 
Posts: 3461
Joined: Tue Jan 16, 2007 2:53 am

Post » Wed Sep 01, 2010 3:57 pm

I always found that stuff like AA, depth-of-field, and a few other things were what made my machine start chugging (GeForce 9400 GT with an Intel Core 2 Duo E6400 @ 2.13 GHz, though I just ordered a Q6600). With those things turned off, games that were previously slideshows on my PC jumped way up in FPS.
User avatar
Dawn Porter
 
Posts: 3449
Joined: Sun Jun 18, 2006 11:17 am

Post » Wed Sep 01, 2010 6:19 pm

Oh geez.

I thought geeks hung out around CRPGs....

Hokay. First off: since we know the game has to maintain consistency across multiple platforms, you can scratch some things off the video card list right now. No Shader Model 4. No DX11 functionality. Aside from the frame buffer, the rest of the card will be there for processing textures and shaders; 500 megs -should- be sufficient. No idea if the PC version is HD, so an HDMI port isn't likely to be needed unless you run your monitor with it by preference. Anything above 30fps is imperceptible to the average human eye; pad that by as much again so you don't stutter if you somehow overload the PCI-E bus. One card is all you need; dual-carding so you can claim 150+ fps is nothing more than bragging. Personally, I've never run across a game that demanded dual cards... at least since the original Monster cards and the beginning of the shooter craze. OpenGL will be a non-issue if this is a gaming rig (unless you intend to run classic games that need it). Any card that fits within those general parameters will be good enough. Basically, the choices will be shaped by how many monitors you have, cooling requirements, the power supply (remember you have to power video cards separately if you have lots of on-card RAM), and what other games you may play.
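For a sense of scale on that "500 megs" figure, the frame buffer itself is a tiny slice of the card's memory; a back-of-the-envelope sketch in Python (resolution and buffer count are illustrative assumptions):

    # Rough framebuffer math: even double-buffered colour plus a
    # depth/stencil buffer at a high resolution is a small slice of a
    # 512 MB card; nearly everything else goes to textures and shaders.
    width, height = 1920, 1200
    bytes_per_pixel = 4                       # 32-bit colour or depth/stencil

    one_buffer_mib = width * height * bytes_per_pixel / 2**20
    total_mib = 3 * one_buffer_mib            # front + back colour + depth
    print(f"one buffer:  {one_buffer_mib:.1f} MiB")
    print(f"all buffers: {total_mib:.1f} MiB of 512 MiB")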

The processor is a matter of how capable the software is. Very few games actually take advantage of multi-core CPUs, and those that do certainly do =not= run 8 threads that demand the full attention of something like an i7; that would be nothing more than bragging rights. Same with the Phenom II X6. A graphics app like Max, Messiah, Premiere, or After Effects could use that many threads; games simply don't. One of the quad-core chips would do the job more than adequately; which chip is flame-war material. Skyrim may prove to be one of the games that can use them, though, as we know it uses Havok Physics and Behavior, and both are multi-threaded.

The motherboard is far more critical than most give it credit for. No onboard video, as that bites into your system RAM. With AMD boards, the kind of memory you can use depends on the actual processor, as the memory controller has been on the CPU die since the Athlon 64 first came out. Intel is finally catching up on that score, but you need to be careful there: if you get an older Intel setup on a deal, you could wind up with the memory controller on the northbridge... and have to deal with the bandwidth bottleneck there. Cheap mobos have cheap components, and games are the most stressful usage there is next to CG work, so a little extra spent there will keep the system going a lot longer. If you have a 64-bit system, consider getting 8 gigs of RAM; that should be more than enough. If you're stuck with a 32-bit system, you're stuck with 2 gigs max for any one program to address. Setting the /3GB (large memory) flag in XP lets a large-address-aware program use 3 gigs instead of 2.
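The 32-bit ceiling is simple arithmetic; a quick sketch:

    # Why 32-bit systems hit a wall: 2**32 one-byte addresses is 4 GiB
    # total, and 32-bit Windows splits each process's view of it
    # 2 GiB user / 2 GiB kernel by default. The /3GB switch moves that
    # split to 3/1 for programs flagged as large-address-aware.
    address_space = 2**32
    print(f"total addressable: {address_space / 2**30:.0f} GiB")
    print(f"default per-process user space: {address_space / 2 / 2**30:.0f} GiB")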

HDD. Here is the last remaining bottleneck in the entire system. Physical hard drives have seek times in the milliseconds - one reason to load your system with as much memory as it can use; reading from system RAM saves a lot of time. The new SSDs are much faster; the larger ones are internally configured as a RAID 0 array to avoid latency issues. But SSD is still a fairly new technology, and the actual drives have yet to prove their longevity through even the first generation. If you have the funds to use them, great... but you might find them dying faster than a mechanical drive. Or not. No one knows at this time. If you can't afford an SSD, a quick way to break the bottleneck is to use a RAID 0 array as your boot drive. The downside is that you have exactly =NO= fault tolerance: if one drive fails, the array is dead, and you will have to replace the bad drive and reload everything. The upside is that by using data striping (which is what RAID 0 is; data is 'striped' across both drives) you effectively double your read/write bandwidth. In tests I've had a renderbox with RAID 0 go from power-on to desktop (XP-64) in around 15 seconds. This only takes two drives, and you can get 500GB SATA II drives for as little as $39 US each... which is $80 for a terabyte RAID array. All the semi-decent motherboards support RAID, so no extra hardware is needed.
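To make the striping concrete, here is a minimal model (the 64 KiB stripe size is an assumption; it varies by controller) of how RAID 0 maps an array offset to a drive, which is why big sequential transfers hit both disks at once:

    # Minimal RAID 0 striping model: consecutive stripe-sized chunks
    # alternate between drives, so large reads/writes are serviced by
    # both disks in parallel -- and losing either drive loses the array.
    STRIPE = 64 * 1024  # 64 KiB stripe size (an assumption; it varies)

    def locate(offset, n_drives=2):
        """Map an array byte offset to (drive index, offset on that drive)."""
        stripe_no, within = divmod(offset, STRIPE)
        drive = stripe_no % n_drives
        drive_offset = (stripe_no // n_drives) * STRIPE + within
        return drive, drive_offset

    for off in (0, 64 * 1024, 128 * 1024, 192 * 1024):
        drive, pos = locate(off)
        print(f"array offset {off:>7} -> drive {drive}, disk offset {pos}")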

Power supply. Here is something you might want to spend a little extra on, not so much out of necessity as for safety. If your system draws, say, 320 watts on average, then with an 800-watt supply you have a 480-watt surge buffer before you even max out the supply's rating. You -can- get one that is close to your power usage if you need to save the money, but the PSU will run much hotter, and you will have a far narrower margin of safety. A ballsy video card can come out of a dormant state and demand a lot of power to get back to operating status; the same goes if you have multiple hard drives that spin down when they are not in active use.
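The same headroom arithmetic, worked out with the example figures above:

    # An 800 W supply feeding a system that averages 320 W leaves a big
    # surge buffer for a video card or drives waking from a low-power state.
    average_draw = 320          # watts, the example figure above
    supply_rating = 800         # watts
    headroom = supply_rating - average_draw
    print(f"surge buffer: {headroom} W "
          f"({headroom / supply_rating:.0%} of the rating unused on average)")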

And lastly, heat dissipation. You have to make sure you have a good cooling solution in place, be it anything from cross-blowing fans to liquid nitrogen. Heat is deadly for computer systems, PC-board adhesives, and a lot of other things in there.

Stay inside those general boundaries, and you won't waste money on things you simply do not need.
User avatar
Charles Weber
 
Posts: 3447
Joined: Wed Aug 08, 2007 5:14 pm

Post » Wed Sep 01, 2010 5:32 pm

Yeah, I'm building a PC for Skyrim (first time :celebration: ). I loved Morrowind and Oblivion, especially on PC; I modded the hell out of Oblivion (not too much, though, as I played it on a laptop with integrated graphics). My problem is that since I have to share the PC I'm building, I have to have it built within a month or two, so I'll just have to upgrade in a year or two. I'm getting a Radeon HD 5770 with it, which should be able to run Skyrim on mid-high (I'm not looking to max every game, since I only have about a $900 budget, and since this is my first build I also need a new monitor and operating system).

Edit: here's the build: http://secure.newegg.com/WishList/PublicWishDetail.aspx?WishListNumber=12608074
User avatar
Skivs
 
Posts: 3550
Joined: Sat Dec 01, 2007 10:06 pm

Post » Wed Sep 01, 2010 9:25 pm

I actually disagree on that point: anybody who runs AA above 4x at the expense of either framerate or other settings is a bit silly.
There's a significant difference between 0x, 2x, and 4x, but after that you get sharply diminishing returns. Higher levels are nice to have, of course, but not worth dropping anything else for.

Ah, I don't know, I ran with http://i438.photobucket.com/albums/qq102/Starforce9/AAsample.jpg of AA before. It wasn't so bad... :whistling: It made games almost playable...

/sarcasm
User avatar
Gemma Flanagan
 
Posts: 3432
Joined: Sun Aug 13, 2006 6:34 pm

Post » Wed Sep 01, 2010 12:45 pm

If you're partial to ATI, go with the 6870.

Disagree; the 5870 is really still a more powerful card than the 6870. Almost all the benchmark sites have it posting higher marks than the 6870, like this one on AnandTech:

http://www.anandtech.com/bench/Product/294?vs=290
User avatar
Gavin boyce
 
Posts: 3436
Joined: Sat Jul 28, 2007 11:19 pm

Post » Wed Sep 01, 2010 4:02 pm

Hokay. First off: since we know the game has to maintain consistency across multiple platforms, you can scratch some things off the video card list right now. No Shader Model 4. No DX11 functionality. Aside from the frame buffer, the rest of the card will be there for processing textures and shaders; 500 megs -should- be sufficient. No idea if the PC version is HD, so an HDMI port isn't likely to be needed unless you run your monitor with it by preference. Anything above 30fps is imperceptible to the average human eye; pad that by as much again so you don't stutter if you somehow overload the PCI-E bus. One card is all you need; dual-carding so you can claim 150+ fps is nothing more than bragging. Personally, I've never run across a game that demanded dual cards... at least since the original Monster cards and the beginning of the shooter craze. OpenGL will be a non-issue if this is a gaming rig (unless you intend to run classic games that need it). Any card that fits within those general parameters will be good enough. Basically, the choices will be shaped by how many monitors you have, cooling requirements, the power supply (remember you have to power video cards separately if you have lots of on-card RAM), and what other games you may play.

"No idea if the PC version is HD"?
Er. Even ignoring the fact that just about any recent standard for monitor connectivity is capable of driving monitors at "HD", the ability to render in arbitrary resolutions is a staple of PC gaming going back to the first 3D games. PC games don't /do/ "HD" and "Not HD".

The eye also doesn't work in "frames per second"; you can't say "the eye cannot see higher than 30fps" without showing a fundamental misunderstanding of the topic. Primarily, the eye does not do sequential full refreshes like a monitor does, and while 24fps is the accepted standard for footage looking smooth with motion blur (I'll get to that), it's nothing like a refresh rate of the human eye. A very slow-moving object at 10fps can look smooth; a very fast-moving object at 100fps can look jerky. It's not about refreshing, it's about fluidity - that's where motion blur comes in. Film and TV footage gets its smoothness from exposure: each frame integrates the motion that happened while the shutter was open, so heavy motion blur keeps any one frame from being too different from the last (this is also why it's almost impossible to get a good still frame from film footage, even when it looks sharp in motion). There is no framerate the eye sees at, because it simply doesn't work like that.

Oh, and anecdotally? There's a huge difference between 30 and 60fps, and a huge difference between 60fps sharp and 60fps with per-object motion blur.
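One way to see why the 30-vs-60 gap is so obvious is to think in frame times rather than frame rates; the per-frame budget shrinks non-linearly:

    # Frame time makes the 30 vs 60 fps gap concrete: going 30 -> 60 cuts
    # 16.7 ms off every frame, while 60 -> 120 only cuts another 8.3 ms.
    for fps in (24, 30, 60, 120):
        print(f"{fps:>3} fps = {1000 / fps:5.1f} ms per frame")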
User avatar
Siobhan Wallis-McRobert
 
Posts: 3449
Joined: Fri Dec 08, 2006 4:09 pm

Post » Wed Sep 01, 2010 2:05 pm

In addition to the above, RAID 0 is probably NOT worth it for you with HDDs unless you work with very large files; it's just not worth it for things like booting or game loading. SSDs are another story from what I've read, but they're so expensive right now that you're paying a ton of money for performance that will be much cheaper in only a couple of years... and drives are one of the easiest things to replace. If you're coming from an older machine, even modern HDDs will be way faster in a new build; whatever you do, modern parts will be an enormous speed boost unless your machine is relatively recent. I was astounded by my brother's new computer scanning over 10,000 files every couple of seconds with the virus scanner, on an HDD.

As to the computer build listed earlier in the topic, it "wastes" money in many places. My eyes particularly got big at the exorbitant $200 computer case that doesn't even have USB 3.0 and the $200 HDD (not even SSD). That's like burning money. To be fair they did say they were waiting for those particular components to drop in price, though.
User avatar
Sarah Bishop
 
Posts: 3387
Joined: Wed Oct 04, 2006 9:59 pm

Post » Wed Sep 01, 2010 2:42 pm

"No idea if the PC version is HD"?
Er. Even ignoring the fact that just about any recent standard for monitor connectivity is capable of driving monitors at "HD", the ability to render in arbitrary resolutions is a staple of PC gaming going back to the first 3D games. PC games don't /do/ "HD" and "Not HD".


Oops. My bad. I had video editing on the brain at the time.

The eye also doesn't work in "frames per second"; you can't say "the eye cannot see higher than 30fps" without showing a fundamental misunderstanding of the topic. Primarily, the eye does not do sequential full refreshes like a monitor does, and while 24fps is the accepted standard for footage looking smooth with motion blur (I'll get to that), it's nothing like a refresh rate of the human eye. A very slow-moving object at 10fps can look smooth; a very fast-moving object at 100fps can look jerky. It's not about refreshing, it's about fluidity - that's where motion blur comes in. Film and TV footage gets its smoothness from exposure: each frame integrates the motion that happened while the shutter was open, so heavy motion blur keeps any one frame from being too different from the last (this is also why it's almost impossible to get a good still frame from film footage, even when it looks sharp in motion). There is no framerate the eye sees at, because it simply doesn't work like that.

Oh, and anecdotally? There's a huge difference between 30 and 60fps, and a huge difference between 60fps sharp and 60fps with per-object motion blur.


I didn't think getting into neural latency, phosphor decay rates vs. LCD ramping, and the particular peculiarities inflicted by purely digital media on an analog biological system was worth the time. And the whole motion-blur point is really moot, as it depends on whether Bethesda implements it in-game. Yes, there are kludges that can simulate the effect (poorly), but if the game itself does not have motion-blur algorithms, then you are stuck with a straight on/off framerate. At that point, you are dependent on the natural decay rate of the neural response of the rods and cones in the retina, the latency of nerve transmission, and the image decay rate in the brain itself. If you stray below that set of numbers, you get detectable strobing in the image; above that threshold, the naturally occurring decay rates cover the strobe until the next image arrives. There are luminance as well as chroma factors here; the eye perceives the two slightly differently, and that has an effect on the issue as well. I'm not talking about having the framerate padded to allow for other effects to be shoved into the video stream; I was talking about the eye's ability to be aware of the on/off state change of an -illuminated image- (which is very different from perceiving analog reality). Nothing more.
User avatar
Emma Parkinson
 
Posts: 3401
Joined: Wed Jul 26, 2006 5:53 pm

Post » Wed Sep 01, 2010 5:22 pm

I didn't think getting into neural latency, phosphor decay rates vs. LCD ramping, and the particular peculiarities inflicted by purely digital media on an analog biological system was worth the time. And the whole motion-blur point is really moot, as it depends on whether Bethesda implements it in-game. Yes, there are kludges that can simulate the effect (poorly), but if the game itself does not have motion-blur algorithms, then you are stuck with a straight on/off framerate. At that point, you are dependent on the natural decay rate of the neural response of the rods and cones in the retina, the latency of nerve transmission, and the image decay rate in the brain itself. If you stray below that set of numbers, you get detectable strobing in the image; above that threshold, the naturally occurring decay rates cover the strobe until the next image arrives. There are luminance as well as chroma factors here; the eye perceives the two slightly differently, and that has an effect on the issue as well. I'm not talking about having the framerate padded to allow for other effects to be shoved into the video stream; I was talking about the eye's ability to be aware of the on/off state change of an -illuminated image- (which is very different from perceiving analog reality). Nothing more.

Nice words, same response - 30fps is not an upper cap.

"Anything above 30fps is imperceptible to the average human eye" is not a vague sentence.
User avatar
Chica Cheve
 
Posts: 3411
Joined: Sun Aug 27, 2006 10:42 pm

Post » Wed Sep 01, 2010 1:14 pm

Hey guys, I'm going to be building a PC... for Skyrim, mainly. My question is about RAM. I don't have a budget, but I'm not going to pay a bunch of money for performance that doesn't justify the cost. I'm looking at this RAM: CORSAIR XMS3 4GB (2 x 2GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800). My question: is it worth paying more for 2000+ speed RAM? Specifically for gaming, and even more specifically for Skyrim, would it be worth it to spring for speedier RAM?
User avatar
Robyn Howlett
 
Posts: 3332
Joined: Wed Aug 23, 2006 9:01 pm

Post » Wed Sep 01, 2010 6:26 pm

Hey guys, I'm going to be building a PC... for Skyrim, mainly. My question is about RAM. I don't have a budget, but I'm not going to pay a bunch of money for performance that doesn't justify the cost. I'm looking at this RAM: CORSAIR XMS3 4GB (2 x 2GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800). My question: is it worth paying more for 2000+ speed RAM? Specifically for gaming, and even more specifically for Skyrim, would it be worth it to spring for speedier RAM?

Most of the time you have to OC the board to support RAM above 1333 or 1600. Either way, you aren't going to get much of a performance increase. I personally would just throw another 4GB in instead of trying to OC it.
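For what it's worth, the module names already encode the peak-bandwidth difference, and it's modest; a quick sketch:

    # DDR3 naming decoded: "PC3-xxxxx" is peak bandwidth in MB/s, i.e.
    # transfer rate (MT/s) times 8 bytes per transfer on a 64-bit bus.
    # (Marketing rounds some names: DDR3-1333 sells as PC3-10600/10666.)
    for mts in (1333, 1600, 2000):
        mb_s = mts * 8
        print(f"DDR3-{mts}: ~{mb_s} MB/s peak ({mb_s / 1000:.1f} GB/s)")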
User avatar
Sian Ennis
 
Posts: 3362
Joined: Wed Nov 08, 2006 11:46 am

Post » Thu Sep 02, 2010 12:08 am

This is more of a wish list; I could go through with it if I land a good job.
Ivy Bridge (the die-shrink "tick" of Intel's newest architecture) quad-core clocked to 4GHz+
8GB of RAM
two smaller OCZ drives in RAID 0
And for graphics cards, well, it's hard to tell; I swing both ways between the green goblin and the red devil. I'll be watching slickdeals.net for a super deal on a mid/mid-high card and hope to get two of them.

If anyone on this forum is putting a computer together in October-November for TES 5, just write me a note with the amount you want to spend and what you want from the machine, and I'll help you put one together.
User avatar
sexy zara
 
Posts: 3268
Joined: Wed Nov 01, 2006 7:53 am

Post » Wed Sep 01, 2010 1:08 pm

I LOL'd. Seriously, don't listen to this guy. My old PC with an 8800 GTX and a Q6600 can't even max Oblivion with decent frame rates. A $500-600 PC will NOT "be fine". My laptop has more horsepower.

As for the game "looking the same" on PC as on the 360: only time will prove how completely false that statement is.


Optimizations, bro. Fallout 3/New Vegas runs fine on an 8800 GTX. As for the games looking the same, I'm quoting Todd himself.

As he STATED, we get high-res textures and AA; they are also comfy with DX9, so no DX11 tessellation on objects, soft shadows, etc.
User avatar
Siobhan Wallis-McRobert
 
Posts: 3449
Joined: Fri Dec 08, 2006 4:09 pm

Post » Wed Sep 01, 2010 9:54 am

Most of the time you have to OC the board to support RAM above 1333 or 1600. Either way, you aren't going to get much of a performance increase. I personally would just throw another 4GB in instead of trying to OC it.


Why would I need to OC the board? The board I'm looking at supports a minimum of 1333 and goes up to 21xx.
User avatar
+++CAZZY
 
Posts: 3403
Joined: Wed Sep 13, 2006 1:04 pm

Post » Wed Sep 01, 2010 1:56 pm

Anybody who uses anti-aliasing above 2x is an idiot. If you have oodles of processing capability to spare, then and only then should you max anti-aliasing. But if you are running the game at 1920x1200 (or one of its sibling resolutions), you will not need any anti-aliasing; the resolution will generally be sharp enough that you will not see any aliasing.

Anti-aliasing is such a laborious, power-intensive process that you should avoid using it whenever you can.


I have to be brutally honest: only idiots run games on "MAX GRAFICS LOL." There are so many utterly pointless graphical features that you simply do not need to turn on; they contribute very little yet eat a large chunk out of your framerate. Running the game on "high" is always good enough. Anybody who can run Oblivion on High will probably be able to run Skyrim on Medium/High, so long as the game is not incredibly unoptimized.


Well, considering most people will only have up to 1080p (1920x1080), it's not quite at that level, and aliasing is still noticeable (at least for me) at that resolution. Sure, resolution helps reduce the effect of aliasing by reducing the lost information that causes the degradation, but games still work by rendering a vast 3D world into a limited 2D image; we'll never be able to realistically map every dot on every surface to its own pixel in the output image, and resolutions can only go so high in reality (based on DPI limitations and physical screen size). That said, I have an Nvidia GTX 470, so it easily handles 16x on just about every game out ^^. Anyway, image quality will still be smoother with it than without it.
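Pixel density is the thing resolution alone doesn't capture; a quick sketch (the panel sizes are assumptions for illustration):

    # Aliasing visibility tracks pixel density (PPI), not resolution alone.
    import math

    def ppi(w_px, h_px, diagonal_inches):
        return math.hypot(w_px, h_px) / diagonal_inches

    print(f'1920x1080 on a 23" panel: {ppi(1920, 1080, 23):.0f} PPI')
    print(f'1440x900  on a 19" panel: {ppi(1440, 900, 19):.0f} PPI')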
User avatar
Sabrina Schwarz
 
Posts: 3538
Joined: Fri Jul 14, 2006 10:02 am
