I usually set most of these to enabled if my GPU can take it (anti-aliasing is usually the last one), except VSync. Apart from lowering the framerate, VSync sometimes adds input lag in fast games.
High-ultra. I basically try to go as high as I can.
Regarding AA: I think it makes a huge difference, actually. Don't want to play without it.
VSync always has to be on. Tearing is the worst.
I don't like motion blur that much, so most of the time I switch it off. But I love DOF, so that's something that always has to be on. Other than that, I just see what I need to turn off for a constant 60fps and what looks so good that I want to keep it (for instance, HBAO(+) is amazing most of the time).
The difference is small enough and the performance loss high enough that I stick to 2xAA most of the time.
I let the game decide for me D:
In all seriousness, I have no idea what half of those do, so I just tend to turn stuff on and off and see if I want it or not :P
VSync is always on, I think, and as long as my PC can handle it I try to keep textures at max.
Depends on the game; I usually start midway with the settings and tinker around with them.
It depends a lot on the video card, too - mine takes a huge crap in its pants if I want it to do AA. Other than that, it's usually 2/3 of the way to the highest setting, or the highest. I'm fine with shadow quality being low if need be (but not off); and I have tons of RAM, so I don't usually have to worry about decreasing texture size. And usually I don't care much about VSync - if tearing is noticeable enough to be a problem, there are probably other settings that would help more.
(I'm also an environment artist, so I don't usually turn down texture size, because I'm that guy off to the side staring at some well-made plant or rock.)
Everything off or on lowest settings, 4X MSAA, some amount of anisotropic filtering (I've found it has almost no effect on performance), and native resolution if possible, else 1280x720.
Texture quality isn't really important to me, but the jaggies irritate me. I've found that lowest settings everything @ 1080 is usually worse performance-wise than something like 1280x720, lowest settings, 4xMSAA.
Maybe my computer's just weird.
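For what it's worth, the raw pixel-count arithmetic supports that trade: 720p shades well under half as many pixels as native 1080p, and MSAA multiplies coverage samples rather than full shading work, so the combined cost can still come out lower. A quick sketch using the resolutions mentioned above:

```python
# Pixel-count arithmetic behind trading native 1080p for 720p + MSAA.
full_hd = 1920 * 1080   # 2,073,600 pixels
hd = 1280 * 720         #   921,600 pixels

ratio = hd / full_hd
print(f"720p renders {ratio:.1%} of the pixels of 1080p")  # 44.4%
```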
Current:
Running on a GTX 1070 and 32GB of RAM, so I haven't run into anything (yet) that I can't max. VSync/resolution are set the way they are because I'm running on a 42" LCD TV that overscans but has no compensation mode for PC (at least not through DVI/HDMI).
I don't think I've ever seen 1804x1014 among possible resolutions in game settings. How do you usually set it? .ini file editing?
Scaling settings in Nvidia Control Panel. Set 1920x1080 as baseline, then scale down the x and y individually to fit the screen bezel. From that point, the driver automatically scales any fullscreen app that calls for 1920x1080, if they don't natively support custom resolution.
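The arithmetic behind that kind of underscan is simple: shrink both axes by the overscan percentage and keep the dimensions even. A hypothetical sketch (the 6% figure is an assumption — typical TVs overscan by roughly 3-6% — but it happens to reproduce the 1804x1014 mentioned above):

```python
# Hypothetical sketch: computing an underscanned custom resolution to
# compensate for TV overscan. The overscan percentage is an assumption.
def underscan_resolution(width, height, overscan_pct):
    """Shrink the desktop so the visible area fits inside the TV bezel."""
    scale = 1.0 - overscan_pct / 100.0
    # Round down to even dimensions, as most drivers/displays expect.
    return (int(width * scale) // 2 * 2, int(height * scale) // 2 * 2)

print(underscan_resolution(1920, 1080, 6))  # -> (1804, 1014)
```

In practice you'd eyeball the x and y scaling separately in the driver panel rather than compute it, since overscan isn't always symmetric.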
Interesting. It seems like we AMD users don't have such a scaling feature, or at least I can't find it on my system.
If I remember right from my brief time using a loaned AMD card, you should find it somewhere in the Catalyst Control Center. Look for controls for stretching or shrinking the display, "TV Settings", overscan/underscan, or anything relating to desktop size manipulation. The feature has been common ever since HDMI became commonplace.
I found something similar in the Intel control panel (my laptop has an integrated Intel card used by Windows and an AMD card used by games). It lets you shrink the screen, and games running at the same resolution are displayed shrunk too. It would probably come in handy if I used a particular screen like your TV (of course I don't need it on my simple 22" PC screen, I was just curious).
If my PC allows it and I can maintain 60fps, I set everything as high as possible. If I see FPS drops, I start lowering AA to 4x-2x or shadows to high/medium. I never turn down AF, because I notice it very much and can't stand low AF.
If my PC can't handle at least 45fps on high/ultra, I just put the game on a virtual shelf and start saving money for an upgrade, lol. That doesn't apply to slow games like strategy; 30fps in those is fine to me.
In theory: I start at the highest settings and go down until I get something with satisfactory performance.
In practice: Low/ultra low, hope I get a game and not a slideshow.
I'll be honest, I don't really understand people's obsession with 60+ FPS games... largely because with modern (non-indie) games, I'm lucky to get 30 FPS. I once tried the demo of Dragon Age: Inquisition and I got about 3 frames per minute on the lowest graphics settings.
I also don't play many games where FPS would be much of a factor at all...why should I care if a visual novel or a turn-based RPG is locked at 30 FPS?
At the moment I usually use medium settings - whatever default the game assigns for my weak rig. I can only build my new gaming PC next year, once I move into my own house; there's currently no space in my room, so I've got to bear with the laptop a bit. xD
I have a weak CPU (intel Quad core i5 750) but a decent GPU (Radeon R9 380x), plus 8GB of RAM.
Anti-Aliasing: 8x, 4x if my CPU is struggling.
Texture Quality: Maxed out, unless my CPU is struggling.
Shadow Quality: Maxed out, unless it takes up resources unnecessarily.
Anisotropic Filtering: 4x, 8x. Depends on my performance.
VSync: Always on. A game without Vsync is a game I won't play. I don't need 400FPS to overheat my GPU.
Triple Buffering: Wait, isn't that like, part of the AA setting? No idea about this. :P
Generally, I'll set the settings to medium-high. It really depends on the game. Some games look like crap and yet my GPU struggles with medium quality settings.
My system is decent but in dire need of a new CPU. Only, a new CPU = new MOBO, new MOBO = new case... so yeah. That's a lot of money to spend, which I don't have.
My HTPC also has an i5-750 CPU, and I'm not sure how much of a bottleneck it is. I'm going to add another HD7850 to it (in CrossFire) and will soon be able to tell exactly how significant of a bottleneck the CPU is, as I'm running all benchmarks also with my main PC, which has an i7 4770K and had the same GPU.
Well, to be fair, the i5 750 is a solid CPU. It can handle almost anything you throw at it and it's affordable; I've had the same CPU for 5 years and I can still run most games on medium-high graphics. It's just not the best when it comes to maxing out your graphics.
Skyrim with maxed out textures caused stuttering when moving/loading up a new area. Most newer games do the same with maxed out textures, which means you have to lower your texture quality... and textures are pretty much the most important graphics setting to have a nice visual experience.
After doing some more tests (soon to be posted in a new thread), I can tell you that if gaming is a reason for you to consider upgrading your CPU, hold on to that i5 750. It seems that it's almost never the bottleneck. My i7 4770K is several times faster than my i5 750 in image processing applications like DxO Optics Pro 11 (with RAW conversion using Prime NR), but when it comes to games with maxed out graphics the difference is usually tiny. Adding another GPU makes a much bigger difference. The only game I tested which seems to have issues that may be CPU-related is Metro 2033. I'm still looking into it.
Don't waste your money. The i5 750 is an oldie but goodie.
And it doesn't have the unfortunate drawback that the otherwise excellent Phenom II line has now: the lack of SSE 4.2 instructions. Makes me sad; the Phenom II X4 960 line was, to my knowledge, (one of) the most popular gaming CPUs in history, and with great reason. =(
But how would upgrading my CPU be a waste of time/money when the reason I get stuttering FPS is that my CPU usage goes up to 100% in some games with maxed out graphics, while my GPU seems to be handling it well? :P
The results are in:
https://www.steamgifts.com/discussion/Faj9Z/things-i-learned-benchmarking-my-pcs
Shadows always off. If that can't be done, then low. They are too heavy on the system and add no real quality to the game.
AA and AF go as far as they can without hurting performance too much. Low priority.
Texture quality = as high as I can run it. Highest priority.
VSync = Default
Triple Buffering = off
Whatever my PC can handle while still giving decent FPS. :D
Max across the board at 4K, until my framerate drops below 40; then I'll switch down to 1080p. After that I turn down occlusion, then AA. In the worst-case scenario I'll turn VSync off, but I hate doing that because I hate tearing. I really need a monitor with FreeSync, lol.
I recently upgraded my video card and did (and still do) lots of before/after benchmarks. I'm left with the feeling that some settings mostly just reduce framerate without providing much visible improvement. Anti-aliasing seems to be one such setting: at 1920x1080 I don't normally notice jaggies, so it's not clear what the benefit of enabling this option would be, let alone setting it really high. I'm curious which settings people use, what values they pick, and whether they see a benefit beyond bragging rights.
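For anyone curious how I compare runs, it's nothing fancier than averaging a few passes per setting and looking at the relative cost against a baseline. A sketch with made-up placeholder numbers, not my actual results:

```python
# Hypothetical before/after benchmark tabulation: average FPS per setting
# and the relative cost vs. a baseline run. Numbers are placeholders.
runs = {
    "baseline": [92.1, 90.4, 91.7],
    "8x MSAA":  [61.3, 60.8, 62.0],
    "HBAO+":    [83.5, 84.1, 82.9],
}

baseline = sum(runs["baseline"]) / len(runs["baseline"])
for setting, fps in runs.items():
    if setting == "baseline":
        continue
    avg = sum(fps) / len(fps)
    cost = (baseline - avg) / baseline * 100
    print(f"{setting}: {avg:.1f} FPS ({cost:.0f}% slower than baseline)")
```

If a setting costs 30% of your framerate and you can't see the difference in a screenshot comparison, that's a pretty good sign it can stay off.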
So, how do you usually configure the following:
I assume most people use their monitor's native resolution, so no need to discuss that one, but it would be nice if you gave some background about your rig for context.
Thanks!