Optimization


PolishRye


54 minutes ago, Caysle said:

I remember the GTX 1060 being the recommended GPU; from the benchmarks I've watched, a 1070 barely handles 1080p@60 fps. Bad.

The system requirements, both minimum and recommended, are out of line compared to other games. They seem to be targeting 30 FPS, which is stupid.


3 minutes ago, NativeX said:

The system requirements, both minimum and recommended, are out of line compared to other games. They seem to be targeting 30 FPS, which is stupid.

Generally, the specs target 1080p 30fps at medium settings, which in all honesty is perfectly playable.


51 minutes ago, NativeX said:

The system requirements, both minimum and recommended, are out of line compared to other games. They seem to be targeting 30 FPS, which is stupid.

But 30fps at what level!

A bunch of the backlash is about management of expectations: if they'd come out and said 1060 for 30fps at low-medium, equivalent to console settings, that would have been reasonable.

Non-communication causes so many issues. Rockstar's PR needs a-spankin'.

Edited by RogueMango

48 minutes ago, RogueMango said:

But 30fps at what level!

A bunch of the backlash is about management of expectations: if they'd come out and said 1060 for 30fps at low-medium, equivalent to console settings, that would have been reasonable.

Non-communication causes so many issues. Rockstar's PR needs a-spankin'.

That is what they never explain. PC and console users are being abandoned, all for the money.


1 hour ago, McGuinness45 said:

Generally, the specs target 1080p 30fps at medium settings, which in all honesty is perfectly playable.

Spending a lot of money on a gaming PC and 30 fps peasantry don't go very well together. Don't get me wrong, you're mostly right about 30 fps being playable. Console 30 fps and PC 30 fps feel different, though, and there's the online mode, which puts you at a disadvantage against players running at 60fps. And most importantly, there's the feeling of betrayal after spending your or your family's hard-earned dollars and not feeling satisfied with what you're getting.
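To put rough numbers on the 30 vs 60 fps point raised above, frame time is just the reciprocal of frame rate, so going from 30 to 60 fps halves how long each frame stays on screen. A minimal sketch (the function name is just for illustration):

```python
def frame_time_ms(fps: float) -> float:
    """Time each frame is displayed, in milliseconds."""
    return 1000.0 / fps

# 30 fps leaves ~33.3 ms per frame; 60 fps leaves ~16.7 ms.
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(60), 1))  # 16.7
```

That ~16.7 ms difference per frame is part of the responsiveness edge a 60fps player has in online modes.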


3 hours ago, ^Tears^ said:

Watch these two videos... You'll start to get a clearer picture (at least with the CPU). You'll find a lot of older CPUs run this game better than a few of the newer ones because they have higher thread counts and aren't hitting that 'invisible cap' (watch the video). I think the main offender from Intel was the i7 9700K.

There are also a lot of BIOS issues out there, which they go into at the start.

 

 

 

Use a cap to stop it hitting really high (180) fps and you're golden.

I really hope they fix it for those struggling, but I certainly hope they don't break it for others like me who are having the best time in RDR2! It's everything I hoped for and expected.

I have a mate who was running perfectly until last night; now it's showing an error saying he has files that aren't allowed and that he can be banned, and he has done nothing other than let RSL do its thing!

Edited by 2ndLastJedi

15 hours ago, E•MO•TION said:

i7-8700K

GTX 1070ti

16GB DDR4-3200

 

I ended up locking it down to 1080p at 30Hz with the highest preset. 60 FPS is out of the question at the moment. 😒

I have almost the same config and was wondering if in yours everything is on max, including water physics and such.

Edited by dextervanroo
Spelling

As someone who follows the PC hardware scene fairly closely, one thing I have noticed is that there's some kind of anomaly with the Nvidia drivers. Anyone who follows GPU performance charts will know that the 1080 Ti is actually on par with the RTX 2080 in rasterisation benchmarks.

The following graph clearly shows something is wrong when it's this far down the chart. Losing 15fps to the 2080 in DX12? Nvidia is gimping the older graphics cards here.

performance-directx-12-2560-1440.png

Vulkan is better, but still not where it should be.

performance-vulkan-2560-1440.png


From the video above, https://youtu.be/yD3xd1qGcfo?t=863

It is interesting to see how the 2070S performs at 1440p/High

I am wondering how it goes at 1080p/Ultra (or better, 1200p/Ultra), and which trade-off you have to make to keep it at 60 fps

 

Anybody with a 2070S playing at 1080p who can report how high they can go with a stable 60 fps?

 

 


3 minutes ago, seldo said:

From the video above, https://youtu.be/yD3xd1qGcfo?t=863

It is interesting to see how the 2070S performs at 1440p/High

I am wondering how it goes at 1080p/Ultra (or better, 1200p/Ultra), and which trade-off you have to make to keep it at 60 fps

 

Anybody with a 2070S playing at 1080p who can report how high they can go with a stable 60 fps?

 

 

I have an i7-4790K and an EVGA 1080 SC.

At 1080p I can play with almost everything on Ultra, with a few things on High, and maintain 60+fps. That's also with TAA on medium and FXAA enabled.


Officer Friendly
39 minutes ago, Gforce said:

As someone who follows the PC hardware scene fairly closely, one thing I have noticed is that there's some kind of anomaly with the Nvidia drivers. Anyone who follows GPU performance charts will know that the 1080 Ti is actually on par with the RTX 2080 in rasterisation benchmarks.

The following graph clearly shows something is wrong when it's this far down the chart. Losing 15fps to the 2080 in DX12? Nvidia is gimping the older graphics cards here.

performance-directx-12-2560-1440.png

Vulkan is better, but still not where it should be.

performance-vulkan-2560-1440.png

How the f*ck are a 2060 and a 2070 beating a 1080 Ti? This is some bullsh*t.


1 minute ago, Officer Friendly said:

How the f*ck are a 2060 and a 2070 beating a 1080 Ti? This is some bullsh*t.

If I had to guess, it's because they used quality presets, which actually scale to your hardware. If you set the High preset on two different GPUs, like a 970 and a 2080, and read the individual graphics options, you'll see different results. I highly doubt those results come from manually setting all of the options exactly the same across multiple GPUs of various performance tiers.


2 minutes ago, McGuinness45 said:

I have an i7-4790K and an EVGA 1080 SC.

At 1080p I can play with almost everything on Ultra, with a few things on High, and maintain 60+fps. That's also with TAA on medium and FXAA enabled.

Thanks! Looks like I'll get a 2070S then. It should hold up for the next 4-5 years without much trouble.


The blizzard snow at the beginning of the game seems to be the worst spot for performance. It's probably because of all those on-screen particles and the fog effects. So most likely you'll hit your minimum frame rate very quickly as the game starts.

Edited by ViperDG

1 hour ago, Gforce said:

As someone who follows the PC hardware scene fairly closely, one thing I have noticed is that there's some kind of anomaly with the Nvidia drivers. Anyone who follows GPU performance charts will know that the 1080 Ti is actually on par with the RTX 2080 in rasterisation benchmarks.

The following graph clearly shows something is wrong when it's this far down the chart. Losing 15fps to the 2080 in DX12? Nvidia is gimping the older graphics cards here.

performance-directx-12-2560-1440.png

Vulkan is better, but still not where it should be.

performance-vulkan-2560-1440.png

As a GTX 1080 Ti user, I feel screwed over. I bet they did this sh*t on purpose. The 1080 Ti was too good; they had to nerf it because of RTX.

 


1 hour ago, ViperDG said:

The blizzard snow at the beginning of the game seems to be the worst spot for performance. It's probably because of all those on-screen particles and the fog effects. So most likely you'll hit your minimum frame rate very quickly as the game starts.

Have to disagree. I had like 40-50 fps at the beginning of the game (with a few freezes, though that was mainly because of Vulkan), and it was still fairly playable. Now in chapter 2, when I reach the first real town, my fps drops really heavily and it's sometimes even hard to maintain 30 fps. I know my hardware isn't the strongest, but I think it should still run better than that (settings hardly matter, btw; even on the lowest it performs like that).

 

GTX 1060 SC 6GB

i5-4690K @ 4.2GHz

16GB RAM


 

People complaining about optimization on PC, and I'm running both side by side and laughing at those comments 🤣

 

3RCV41g.jpg

 

 

Let's not forget that the Xbox runs below 30fps at 864p, not even 1080p. It's so bad in graphics and resolution that I never even finished the game, or even enjoyed it.

 

Edited by HaRdSTyLe_83

52 minutes ago, VIPΣR said:

As a GTX 1080 Ti user, I feel screwed over. I bet they did this sh*t on purpose. The 1080 Ti was too good; they had to nerf it because of RTX.

 

Ah, Chris from the Good Ole Gamer, he's on my subs list on YT. He's right, the 1080 Ti has held up well; even the 2080 Ti is not much faster than the 1080 Ti in pretty much every other benchmark I've seen.

RTX: the redundant deep-learning AI industry leftovers dumped onto gamers to recoup shareholder investments after the smart money pulled out of Nvidia.


Yeah, I feel you there. I was looking to do a short-term upgrade to the 1080 Ti while I wait to see the next gen of GPUs, but held off when R* released the system specs needed to run it. It appears they set the bar too low for Ngreedia, who now seem to have actively gimped every GPU below the Turing lineup. A stock RX 5700 beating a 1080 Ti? They're taking the p!ss.


This is why I'm still on 430.86! RDR2 is running well at 1440p/60 using Joker Productions' 1080 Ti settings. Search Joker Productions on YT for the settings.

Much like SOTTR settings and performance. 


Couldn't believe how well this game runs. Here are my old ass specs:

 

 - Core i7 920 @ 3.6GHz

 - 8GB RAM

 - GTX 970 G1 Windforce

Shockingly, I was able to play through the Prologue section with a mix of Ultra (Textures, Lighting, SSAO), High and Medium with a mostly steady ~30fps at 1080p. No crashes. And wow is the lighting good in this game.

 

... but as soon as I got down out of the mountains into the Heartlands, that was that. I've tried turning some High stuff down to Medium, but I can barely achieve a stable 30 even in camp on the same config. I'll probably have to drop some of the Ultra lighting stuff down, sadly.


9 hours ago, Officer Friendly said:

How the f*ck are a 2060 and a 2070 beating a 1080 Ti? This is some bullsh*t.

I'm beating the 1080 Ti in all games with my 2070S... I'm not surprised.

 

Next spring the new cards come out, and the 2080 Ti will be beaten.

 

9 hours ago, seldo said:

From the video above, https://youtu.be/yD3xd1qGcfo?t=863

It is interesting to see how the 2070S performs at 1440p/High

I am wondering how it goes at 1080p/Ultra (or better, 1200p/Ultra), and which trade-off you have to make to keep it at 60 fps

 

Anybody with a 2070S playing at 1080p who can report how high they can go with a stable 60 fps?

 

 

1440p on High settings = 55-58fps. At 1080p you shouldn't have any problems with 60fps.

Edited by Ferio

I used the Game Debate article to tweak settings based on FPS hits:

 

https://www.google.co.uk/amp/s/www.game-debate.com/amp/news/27927/newsAmpPage.html

 

GTX 1060 6GB and a crusty i5-4440

and got a fantastic balance of frames: an average of 50 in the wilderness, 40 in camp, and 35 in Valentine at high/medium, with some killers at low (grass, no MSAA, water on medium, reflections on low) and ancillary effects at ultra (e.g. particles, textures, fur).

It looks stunning and performance is steady. It's not 60, but for me an average of 40/45 across the board is great with these visuals.
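As a quick sanity check, the per-area numbers above are consistent with the "average of 40/45" quoted, assuming (purely for illustration) equal time spent in each area:

```python
# Reported per-area averages (fps), taken from the post above.
area_fps = {"wilderness": 50, "camp": 40, "Valentine": 35}

# Unweighted mean across areas; real playtime per area will vary.
overall = sum(area_fps.values()) / len(area_fps)
print(round(overall, 1))  # 41.7
```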


34 minutes ago, RogueMango said:

I used the Game Debate article to tweak settings based on FPS hits:

 

https://www.google.co.uk/amp/s/www.game-debate.com/amp/news/27927/newsAmpPage.html

 

GTX 1060 6GB and a crusty i5-4440

and got a fantastic balance of frames: an average of 50 in the wilderness, 40 in camp, and 35 in Valentine at high/medium, with some killers at low (grass, no MSAA, water on medium, reflections on low) and ancillary effects at ultra (e.g. particles, textures, fur).

It looks stunning and performance is steady. It's not 60, but for me an average of 40/45 across the board is great with these visuals.

How is this acceptable though?

You get almost the same graphics as the base PS4.

The base PS4 is supposed to have a GPU roughly equivalent to a 750 Ti... We used to get the same performance from a 1050 Ti in games, but this game raises that bar to the 1060.

You should be getting 55-60 fps with those settings. This is outrageously bad optimization... and people are accepting it...

To get the same graphics as the base PS4 at 60 fps, you need an RTX 2060, which is roughly 3.5 times stronger than the PS4's GPU in TFLOPS.

 

 

 

 

Edited by yamaci17
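For what it's worth, the raw-TFLOPS comparison above is easy to check against commonly published peak FP32 figures (~1.84 TFLOPS for the base PS4, ~6.45 TFLOPS for a stock RTX 2060; both numbers come from public spec sheets, and TFLOPS is only a rough proxy for game performance):

```python
# Published peak FP32 throughput, in TFLOPS (rough proxy only).
PS4_TFLOPS = 1.84       # base PS4 GPU
RTX_2060_TFLOPS = 6.45  # stock RTX 2060

ratio = RTX_2060_TFLOPS / PS4_TFLOPS
print(round(ratio, 1))  # 3.5
```

So by raw TFLOPS the 2060 is closer to 3.5x the base PS4 than anything higher, before accounting for architectural differences.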

43 minutes ago, yamaci17 said:

How is this acceptable though?

You get almost the same graphics as the base PS4.

The base PS4 is supposed to have a GPU roughly equivalent to a 750 Ti... We used to get the same performance from a 1050 Ti in games, but this game raises that bar to the 1060.

You should be getting 55-60 fps with those settings. This is outrageously bad optimization... and people are accepting it...

To get the same graphics as the base PS4 at 60 fps, you need an RTX 2060, which is roughly 3.5 times stronger than the PS4's GPU in TFLOPS.

 

 

 

 

Is that correct though? There's no way of knowing yet (to my knowledge, so I could be wrong) how the console settings stand up; we need to wait for a DF analysis or similar.

 

I do agree though that optimisation seems really underbaked. I expect improvements through patches and drivers (AMD appears to fare far better).

Edited by RogueMango

2 hours ago, yamaci17 said:

How is this acceptable though?

 

You get almost the same graphics as the base PS4.

 

Not even on the lowest settings at 1080p does the game look as bad as the console version, nor does the PS4 or Xbox run at 50fps.

 

But yes, something is wrong, at least to me. I've seen better performance locking it to 60Hz than at 100Hz. Idk if it's something to do with vertical sync, but the image looks better when moving the camera around.

Edited by HaRdSTyLe_83

13 minutes ago, HaRdSTyLe_83 said:

 

Not even on the lowest settings at 1080p does the game look as bad as the console version, nor does the PS4 or Xbox run at 50fps.

 

But yes, something is wrong, at least to me. I've seen better performance locking it to 60Hz than at 100Hz. Idk if it's something to do with vertical sync, but the image looks better when moving the camera around.

How do you lock it? Is this based on your screen?

I only get 50 or 60 fps as options.


6 hours ago, dandroid said:

Couldn't believe how well this game runs. Here are my old ass specs:

 

 - Core i7 920 @ 3.6GHz

 - 8GB RAM

 - GTX 970 G1 Windforce

Shockingly, I was able to play through the Prologue section with a mix of Ultra (Textures, Lighting, SSAO), High and Medium with a mostly steady ~30fps at 1080p. No crashes. And wow is the lighting good in this game.

 

... but as soon as I got down out of the mountains into the Heartlands, that was that. I've tried turning some High stuff down to Medium, but I can barely achieve a stable 30 even in camp on the same config. I'll probably have to drop some of the Ultra lighting stuff down, sadly.

Given this is pretty much my spec and I haven't had a chance to play the game yet, this is interesting. If you have any further tips on tweaking settings to get it looking good from Chapter 2 onwards, that would be appreciated!


22 hours ago, Caysle said:

I remember the GTX 1060 being the recommended GPU; from the benchmarks I've watched, a 1070 barely handles 1080p@60 fps. Bad.

False. GTX 1070 8GB here: High settings, 1080p, 50-65 FPS.

Edited by Alabeo

3 hours ago, seldo said:

How do you lock it? Is this based on your screen?

I only get 50 or 60 fps as options.

Yes, if your screen is only 60Hz it will only show those two options. Mine is 120Hz, but if I set it to 100Hz or 120Hz the image isn't as smooth when turning the camera around. Maybe it's just me, but 60Hz feels like way more than that.

Also, the settings don't seem to save once in a while, and every time I log in it's set to windowed mode.

