UnbeknownstGhost Posted August 30, 2010 What's your current card?
Veedo Posted August 30, 2010 Currently I'm running a GeForce GT 230. It passes all four subsections on the video card section, just not the model.
UnbeknownstGhost Posted August 30, 2010 You'll be able to play it at a nice, low setting, not a shiny, oh-so-pretty HIGH setting.
Koti Nexus Posted August 30, 2010 Ahh, my graphics card, how I love thee, and how you confuse all things that look at you. *pets graphics card* You're a good graphics card, yes you are, my business baby card, yes you are... The Nvidia Quadro FX is a business graphics card that normally isn't available to... *ahems* normal consumers, but since I work on web comics and have a business name... :roll: The game runs, I know it does; I know this from the last beta test, where they freed up the game to run at high specs. Unless they crank it up more in open beta, in which case I'll just turn the textures down from high to medium.
Kale Posted August 31, 2010 YES! I- -still can't run it. God damn laptops. :frustrated:
UnbeknownstGhost Posted August 31, 2010 YES! I- -still can't run it. God damn laptops. :frustrated: What's wrong with laptops?
Kale Posted September 5, 2010 Laptops just aren't fit for games like this, partly because of the whole "Sorry, you can't upgrade your $1200 purchase's graphics card" setup. :cry: WHY DIDN'T I BUY A DESKTOP!? :frustrated:
UnbeknownstGhost Posted September 5, 2010 Laptops just aren't fit for games like this, partly because of the whole "Sorry, you can't upgrade your $1200 purchase's graphics card" setup. :cry: WHY DIDN'T I BUY A DESKTOP!? :frustrated: Actually, I have an Asus gaming laptop. I can upgrade anything in it... It even has room for two more HDs...
Satisiun Posted September 5, 2010 Actually, I have an Asus gaming laptop. I can upgrade anything in it... It even has room for two more HDs... Yep. Seconding the positive vibes towards Asus. My own laptop has served me quite well, and although I've yet to upgrade it (I haven't had to since I got it for Christmas in 2008), I know I can when the time arises.
Con One Posted September 5, 2010 Asus does come out with great hardware for their price point, but they tend to rush production and have generally crappy customer service. I spent a couple hundred dollars extra and went with a Sagen this go-around; I like their style.
Kale Posted September 5, 2010 CRAP! The manager never even mentioned an ASUS when I went to purchase a gaming laptop. DAMN YOU, KENNY! I WASTED $1200 BECAUSE OF YOU! :frustrated: I swear, I'm going to suffer a concussion if I keep doing this. :lol:
falcolas Posted September 5, 2010 Halfway between the two, and I still have to run it at half resolution, with all the goodies turned off. Blows chunks, particularly with my six-month-old Alienware laptop. I appreciate that they wanted to make the graphics future-proof, but good god.
Freyar Posted September 5, 2010 Halfway between the two, and I still have to run it at half resolution, with all the goodies turned off. Blows chunks, particularly with my six-month-old Alienware laptop. I appreciate that they wanted to make the graphics future-proof, but good god. It really seems to be a CPU limitation, not a GPU limitation. My i5 at work and Intel quad-core at home are slammed pretty hard while the video card seems mildly indifferent.
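A rough way to check whether a game is CPU- or GPU-limited, as Freyar describes, is to time the CPU-side work of a frame separately from the time spent stalled waiting on the GPU. This is only a minimal C++ sketch: the two functions are hypothetical stand-ins for a real engine's work, with sleeps faking the timings.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical stand-ins for a real engine's per-frame work; the sleeps just fake the timings.
void simulate_and_submit_draw_calls() { std::this_thread::sleep_for(std::chrono::milliseconds(12)); }
void wait_for_gpu_to_finish()         { std::this_thread::sleep_for(std::chrono::milliseconds(2)); }

int main() {
    using clock = std::chrono::steady_clock;
    for (int frame = 0; frame < 5; ++frame) {
        auto t0 = clock::now();
        simulate_and_submit_draw_calls();   // CPU-side simulation and draw submission
        auto t1 = clock::now();
        wait_for_gpu_to_finish();           // time spent stalled on the GPU / present
        auto t2 = clock::now();

        double cpu_ms  = std::chrono::duration<double, std::milli>(t1 - t0).count();
        double wait_ms = std::chrono::duration<double, std::milli>(t2 - t1).count();
        // If CPU work dominates the frame while the GPU wait is near zero, the game is CPU-bound.
        std::printf("frame %d: CPU work %.1f ms, GPU wait %.1f ms\n", frame, cpu_ms, wait_ms);
    }
}
```

A maxed-out CPU alongside an idle video card, as reported here, is consistent with the first number dominating every frame.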
falcolas Posted September 5, 2010 That... really makes me scratch my head in wonder. What could they be calculating that can't be done in the GPU?
Freyar Posted September 5, 2010 That... really makes me scratch my head in wonder. What could they be calculating that can't be done in the GPU? Maybe because the client was designed for console architectures? I was also surprised when I saw that FFXIV only seemed to take up maybe 1GB of RAM.
FreelanceWizard Posted September 6, 2010 Beyond the fact that console games, by virtue of their comparatively less impressive GPUs, have to offload more to the CPU, a lot of optimization can be done during the beta phase. Shaders can be moved out of software and onto hardware, shader programs that work better for specific video card series can be developed, debugging telemetry can be disabled, and compiler optimization can be turned on. All of this can produce marked improvements in performance. That said, the performance is surprisingly poor compared to games of similar graphic quality (I'm thinking Assassin's Creed II and EQ2 at maximum settings, just off the top of my head), which is troubling. Hopefully it'll improve as the beta progresses, though. I know CoH had major netcode issues in beta, and CO ran like a dog on nVidia cards for a while until they got some render path fixes in.
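On the "debugging telemetry can be disabled" point: one common pattern is to gate per-frame instrumentation behind a compile-time flag, so a release build simply omits the cost. A minimal, hypothetical C++ sketch (the TELEMETRY_ENABLED flag and log_frame_stats name are made up for illustration; nothing here reflects Square Enix's actual client):

```cpp
#include <cstdio>

// Hypothetical build flag: defined for internal/beta builds, left undefined for release.
// #define TELEMETRY_ENABLED 1

#ifdef TELEMETRY_ENABLED
inline void log_frame_stats(int frame, double ms) {
    // A beta client might also ship these numbers back to the developer.
    std::printf("frame %d took %.2f ms\n", frame, ms);
}
#else
inline void log_frame_stats(int, double) { /* compiled out: effectively zero cost in release */ }
#endif

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        double frame_ms = 16.6;           // placeholder timing
        log_frame_stats(frame, frame_ms); // no-op unless TELEMETRY_ENABLED is defined
    }
}
```

Combined with compiler optimization (e.g., building with -O2 instead of a debug configuration), this kind of change alone can account for a noticeable chunk of beta-to-release performance gains.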
Zyrusticae Posted September 6, 2010 That said, the performance is surprisingly poor compared to games of similar graphic quality (I'm thinking Assassin's Creed II and EQ2 at maximum settings, just off the top of my head), which is troubling. EQ2? Really? EQ2 runs like a dehydrated dog on my system at maximum settings, whereas Final Fantasy XIV runs smooth as butter... except in Limsa Lominsa, for some reason unbeknownst to me. Well, I do have to turn off Ambient Occlusion, but I imagine the grand majority of players do as well. It's just a terribly poorly optimized feature, especially considering how many other games have it and run just fine with it enabled...
FreelanceWizard Posted September 6, 2010 I've actually found ambient occlusion is a terrible drain on any card. It seems to be about a 20-50% performance hit in many games for a very modest improvement in graphics quality. For me, that puts it squarely in the "people with SLI setups get to use this to justify their expense" category. Admittedly, I've not put a lot of effort into playing with ambient occlusion settings across multiple games, but the experiences I have had have not been positive. For me, EQ2 runs quite well on high graphics settings with Shader 3 on and a fast CPU. Because of how SOE built the game, it's highly CPU-bound and uses very few cores. SOE bet on the wrong horse in the hardware development race, and it shows. That said, my performance in EQ2 easily bests that of FFXIV on equally high settings. Now, given your anecdotal evidence and my anecdotal evidence, I wonder if the issue is driver or rendering path related. Champions, for instance, is known to perform better on ATI than nVidia hardware (especially with shadows enabled), and for a while after release, it suffered serious performance issues on 200 and 400 series GPUs. Even now, it suffers from graphical artifacts on 400 series cards. If SE hasn't gotten to optimizing for different cards yet, that could explain the differences in performance and gives me hope we'll see them fixed.
Zyrusticae Posted September 6, 2010 I, personally, will run Ambient Occlusion on pretty much any game that supports it... with the only exception, thus far, of Final Fantasy XIV, simply because the performance with it is abysmal. It's rather perplexing. I have an Nvidia GTX 260, and I have no problems running Crysis, Aion, Champions Online, APB, any Source Engine game, Mass Effect 2, UT3... all with ambient occlusion enabled and the settings cranked. It's only FFXIV where it causes the game to devolve into a literal slideshow. Whoever programmed that feature really, really needs to go look at how the other game developers do it, or even just steal Nvidia's version, which runs in pretty much any game with a minimal performance impact (and looks nice, too). Meanwhile, EQ2 just... I don't understand it. It really runs just terribly on my system. I mean, I even have a nifty quad core (Q8300 clocked at 3GHz per core) and it just ran like dirt on dirt. Mind, I kept the game in SM 3.0 when I tried it, and never tried using the regular (primitive) shading model with CPU shadows instead, but considering how old the game is I have a hard time wrapping my head around the idea that it's GPU bottlenecked. Hrm.
FreelanceWizard Posted September 6, 2010 That is weird. EQ2 is definitely CPU bound on anything in the 200 series, and the general rule of thumb is that it needs the highest possible clocks because it brute-forces a lot of the rendering. I didn't get the performance I have in it now until I moved to my i7 -- but it's at 2.8, not 3.0. Curious. I also need to retract my statement that FFXIV is running terribly on my system. While it definitely could use some optimization and needs to offload more to the GPU (as well as needing to cache more data by using more RAM), once I undid something stupid I set globally in the nVidia Control Panel for another game (HL2), performance improved significantly. I also needed to turn down the texture quality to Standard to get VSync to run well, but I personally don't see much difference between Standard and High with FSAA turned on. The only times I start going into slideshow mode now are on the lifts and in areas where there's a lot of people around -- and even then, performance improves the longer I stay in the area. The interface lag is definitely directly related to this, which tells me that the issue may very well be related to the netcode or texture loading. Both of those can be fixed, and really, things aren't so bad when you get away from the crowds. I would even go so far as to say that the game is actually pretty snappy when you're by yourself.
Zyrusticae Posted September 6, 2010 (as well as needing to cache more data by using more RAM) For me, this is my single biggest gripe with the game's performance. I find that Limsa Lominsa performs poorly because it is constantly loading in new data, even though I have gigs of free RAM it could use to prevent that. The constant stuttering has convinced me that I MUST start in either Ul'dah or Gridania, both of which appear to have less stuff that needs loading or are just simply better optimized all around. The interface lag is definitely directly related to this, which tells me that the issue may very well be related to the netcode or texture loading. I'm pretty sure it's the netcode. Every time the game pauses on a UI element, everything around me freezes as the game waits for the data to come back from SE's servers, something that has become unbearable lately as some servers appear to have an FPS of approximately 1/5. I literally cannot do anything while it is frozen like this, including turning the camera (seriously, what? My camera controls are linked to the servers?), which shows just how badly the interface depends on the player's connection and the servers behaving.
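The behavior Zyrusticae describes, where the whole client (camera included) freezes while a UI element waits on the server, is what you'd expect if the request is made synchronously on the same thread that handles input and rendering. A minimal C++ sketch of the difference, with a fake request_item_data function standing in for the real netcode (the 200 ms latency and the function name are assumptions for illustration):

```cpp
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

// Hypothetical stand-in for a server round trip with ~200 ms of latency.
int request_item_data(int id) {
    std::this_thread::sleep_for(std::chrono::milliseconds(200));
    return id * 10;
}

int main() {
    // Blocking style: the loop that also handles input, camera, and rendering
    // stalls here until the reply arrives -- nothing else can happen.
    int blocking = request_item_data(42);
    std::printf("blocking result: %d (everything froze while waiting)\n", blocking);

    // Non-blocking style: fire the request, keep rendering frames,
    // and consume the reply whenever it shows up.
    std::future<int> reply = std::async(std::launch::async, request_item_data, 42);
    while (reply.wait_for(std::chrono::milliseconds(0)) != std::future_status::ready) {
        std::puts("rendering a frame, camera still responsive...");
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }
    std::printf("async result: %d\n", reply.get());
}
```

If the client really does use the first pattern for UI data, interface responsiveness is capped by server round-trip time, which matches the complaint about slow servers making the UI unbearable.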
BloodHecate Posted September 12, 2010 I'm nowhere near having a dream computer, but I'm just happy to be able to play it.
Emaraya Posted September 12, 2010 The best joke of this test: my graphics card meets all the sub-requirements (it even has 300% of the optimized spec's video RAM), but overall it fails the minimum because it's an older model than the ones on the list. If I failed a test because my pen was from 1989, I'd say the test wasn't about skill.
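Emaraya's result suggests the benchmark checks the reported card model against a fixed list on top of the measured capabilities, so a capable but older card fails on the name alone. A hypothetical C++ sketch of that kind of logic (the approved-model list, thresholds, and card entry are all made up for illustration, not taken from the actual benchmark):

```cpp
#include <cstdio>
#include <string>
#include <vector>

struct GpuInfo {
    std::string model;
    int vram_mb;
    bool shader_model_3;
};

// Hypothetical whitelist of "approved" models, as a benchmark like this might use.
const std::vector<std::string> kApprovedModels = {"GeForce 9600", "GeForce GTX 260", "Radeon HD 4850"};

bool passes_capability_checks(const GpuInfo& gpu) {
    return gpu.vram_mb >= 512 && gpu.shader_model_3;   // made-up minimums
}

bool passes_model_check(const GpuInfo& gpu) {
    for (const auto& approved : kApprovedModels)
        if (gpu.model == approved) return true;        // name must match exactly
    return false;
}

int main() {
    // An older card that exceeds the capability minimums but isn't on the list.
    GpuInfo older_card{"Hypothetical older GPU", 1536, true};
    std::printf("capabilities: %s, model whitelist: %s\n",
                passes_capability_checks(older_card) ? "pass" : "fail",
                passes_model_check(older_card) ? "pass" : "fail");
}
```

A pure capability check would pass this card; the model check is what produces the "fails the minimum" verdict despite the surplus of video RAM.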
Soren Miren Posted September 12, 2010 >.> <.< My only regret is my motherboard doesn't want to acknowledge my additional 3 gigs of RAM.
Freyar Posted September 13, 2010 I'd bet I'd fail on the CPU at home. It's a Q9550, from before the Core iX's came out.