ZachariasX (Author) · Posted October 12

If it is just gaming, the 5080 might be a solution, as 16 GB should be enough for VR; there are simply too many cards out there in the 8 to 12 GB VRAM range that any game has to work on as well. You essentially get a marginally faster 4090 with less VRAM. If you are a gamer running a 4090 at 1440p (like I do), there is absolutely no point in getting a 50-series card, as you gain exactly 0 FPS over what you had. What you probably can do is max out your CPU by capping a 5090 to 100 Watts instead of 200 Watts like a 4090… If you are a producer, getting another 8 GB of VRAM plus 50% more performance makes it a no-brainer to switch, even at that price. That is why I would assume most of the 4090s available once the 5090 is out will be heavily used cards.
Boo · Posted October 12

10 minutes ago, ZachariasX said: If it is just gaming, the 5080 might be a solution, as 16 GB should be enough for VR …

nah - I see the mistake you're making. You're applying logic with your 5080 argument. From experience I know that even without VR (and specifically because I won't be squirming about inside a snorkel), my 1080p gaming experience will be massively improved with a 5090 because I will be able to see it through the glass panel of my PC. Sound will also improve as I purr with smug satisfaction at the spec in my signature, knowing how envious others will be of me and how much they will want to be me. 🙂

Foolishness aside, the issue comes with knowing versus assuming (not you Zak, I mean in general). Especially with stuff like DCS. You see people reporting 19 GB of VRAM used whilst others point out it's just reserved. In either case no one can offer empirical evidence about what is reserved and what is used. Chuck in an F4 or a map that uses "latest terrain tech" (Pah!!) and all bets are off. For a long time 11 GB was touted around as the max but, as with everything on the interweb, I'm not sure if that was a fact or just a repeated story. What I do know is that if you run out of VRAM, the performance bogeymen come out and steal your FPS.
From where I'm standing, 16 GB of VRAM is awfully close to an old post about 11 GB, whilst 24 or 32 GB of the same is a more comfortable distance if I'm also considering spending £1-2K on a fancy snorkel. It's a proper pickle. In 2006, getting a new GPU every generation was pretty much a no-brainer. Now it's a minefield unless you go big.
ZachariasX (Author) · Posted October 12

3 hours ago, Boo said: as with everything on the interweb, Im not sure if that was a fact or just a repeated story.

When I listen to some game devs on YT, they seem to be screaming for a 32 GB framebuffer. Right now, Steam tells you how much of that is out there, and you would have to be insane to build tech that asks for more than is out there in order to run your game. But: the higher your resolution, the higher the amount of VRAM required. Meaning if you are about fine running your game with 8 GB of VRAM at 1080p, then going VR you need much more, and a 16 GB card might be just barely enough. If someone sees an XY amount of memory allocated, that will depend on the screen resolution. In this context I don't see 16 GB as a real upgrade over 8, if VR is your new thing.

What I see happening here is that Nvidia is segmenting the gamer away from the "prosumer", basically making the GeForce above 16 GB a prosumer card in a way they never really managed with the Quadro series, as that was always a bad gaming solution. While Nvidia gives you the power to run a game in VR with the 5080, it neuters the framebuffer to the very minimum so the card is not really useful for productivity tasks at this price point. And for those who do such tasks, $2500 is not that much if the card is fast enough. It also means better margins for Nvidia.

In short, the 5070 might be the only reasonable card to get for normal computer use plus gaming, the 5080 a very overpriced option for the very specific advantage you get, and the 5090 a reasonable offer for the one who needs that power. Which is not gaming.

As for the bling in the case, I myself turn off as much of that as I can. But I understand if it matters to people.
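Zak's resolution-scaling point can be put in rough numbers. A back-of-envelope sketch follows; the buffer count, bytes per pixel, and the VR panel resolution are pure assumptions (real engines are dominated by texture pools), so treat it only as an illustration of how render-target memory scales with resolution, not as a VRAM estimate:

```python
# Rough estimate of render-target memory vs. resolution.
# Illustrative only: buffer counts and per-pixel sizes below are
# assumptions, and real VRAM use is dominated by textures.

def render_target_mb(width, height, bytes_per_pixel=8, buffers=6):
    """Approximate size in MiB of the render targets (G-buffer,
    depth, post-processing chain) for one screen or eye."""
    return width * height * bytes_per_pixel * buffers / 2**20

# 1080p flat screen vs. a high-end VR headset (two eyes, assumed
# 2880x2880 per-eye panels), to show the scaling Zak describes.
flat = render_target_mb(1920, 1080)
vr = 2 * render_target_mb(2880, 2880)

print(f"1080p targets: ~{flat:.0f} MiB")
print(f"VR targets:   ~{vr:.0f} MiB ({vr / flat:.1f}x)")
```

Even with these made-up constants, the ratio (8x here) is the point: the resolution-dependent part of the budget balloons in VR, which is why 16 GB can feel like 8 GB once a headset is involved.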
Boo · Posted October 12

15 minutes ago, ZachariasX said: But I understand if it matters to people.

It's proven to make games run better. Proven. I saw it on the interweb. 🙂
Major Lee · Posted October 12

Less memory, but apparently much faster GDDR7, so still higher memory bandwidth than its predecessor. One gamer techie guy on YT was talking about what Nvidia "has to do", which I find hysterical. Gaming GPUs are the tiniest portion of Nvidia's business, and they are not beholden to retail consumer demands. AI processing seems to be much more lucrative for them for the foreseeable future. AMD is abandoning the top-tier retail GPU market, and Intel is dragging along in third place. Nvidia is the only game in town, especially if you do more or need more than mere FPS in a video game. Not that I like the off-the-chain prices, but my opinion isn't going to get them to drop the 4090 to under a grand...

https://www.msn.com/en-us/news/technology/nvidias-rtx-5070-is-rumoured-for-a-ces-2025-reveal-but-i-kinda-think-people-need-to-calm-down-over-its-unconfirmed-memory-specs/ar-AA1s6f5k?ocid=BingNewsSerp
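The arithmetic behind that point is simple: peak memory bandwidth is the bus width in bytes times the effective data rate per pin. The 4090 and 4080 figures below are the published specs; the 5080 bus width and GDDR7 speed were still rumors at the time of this thread and are used only to illustrate the trade-off:

```python
# Peak memory bandwidth = (bus width / 8) bytes * data rate (Gbps).
# 4090/4080 figures are published; the 5080 line is a rumor at the
# time of writing, shown only to illustrate faster-RAM-narrower-bus.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_4090 = bandwidth_gbs(384, 21.0)   # GDDR6X -> 1008 GB/s
rtx_4080 = bandwidth_gbs(256, 22.4)   # GDDR6X -> ~717 GB/s
rtx_5080 = bandwidth_gbs(256, 30.0)   # rumored GDDR7 -> 960 GB/s

print(f"4090: {rtx_4090:.0f} GB/s")
print(f"4080: {rtx_4080:.0f} GB/s")
print(f"5080 (rumored): {rtx_5080:.0f} GB/s")
```

On those rumored numbers the 5080 would indeed clear its predecessor, the 4080, on bandwidth despite the same 256-bit bus, while still trailing the 4090's wider bus.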
Gambit21 · Posted October 13

My sights are set on the 7070, since that's the soonest I'll be building a new rig. The 7080... too damn much - the 7070 is the bang-for-the-buck sweet spot. 😜
ZachariasX (Author) · Posted October 13

18 minutes ago, Gambit21 said: 7070

5070?
Boo · Posted October 13

1 hour ago, ZachariasX said: 5070?

Gambit is taking the long view... 🙂 I suspect many 40-series owners will also be taking a long view if those prices turn out to be anywhere near correct.
ZachariasX (Author) · Posted October 13

5 hours ago, Boo said: Gambit is taking the long view... I suspect many 40 series owners will also be taking a long view if those prices turn out to be anywhere near correct.

I think that would make sense. The German outlet Heise reports here (in German) what the expected specs are. The 4090 features 16384 shaders, and at 450 Watts it is pretty close to what it can do, even when run at higher power settings. Adding (maybe) higher frequencies and other optimizations plus the faster RAM to the 5080, I don't really expect the 5080 to give significantly more than the 4090 already provides. Essentially it is a significant downgrade due to the smaller framebuffer. In this light, I understand why Nvidia discontinues the 4090: it is the more versatile product of the two while having comparable gaming performance, so its existence cannibalises the price threshold Nvidia set for AI and productivity.

As a bottom line, I see just a price increase over the 40 generation. The 5070 seems utterly meh, hardly any step up from the 4070 and not even having more VRAM. The 5090 I see as the only card with a purpose. Which is not gaming. The 5080 is a giant middle finger towards gamers, and the 5070 is what we already had.
Aapje · Posted October 13

If you are thinking about the 5070, it may be wise to wait for a mid-gen refresh, which is likely to get the 3 GB RAM modules.
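For context on why 3 GB modules matter: each GDDR device sits on a 32-bit slice of the memory bus, so total VRAM is (bus width / 32) times the module density. A quick sketch, using the rumored 50-series bus widths (the bus figures are assumptions from leaks, not confirmed specs):

```python
# VRAM capacity = (bus width / 32) modules * density per module.
# Bus widths below are rumored 50-series figures, used as examples.

def vram_gb(bus_width_bits, module_gb):
    """Total VRAM in GB for one module per 32-bit channel."""
    return bus_width_bits // 32 * module_gb

print(vram_gb(192, 2))  # 5070-class, 2 GB GDDR7 modules -> 12 GB
print(vram_gb(192, 3))  # same die with 3 GB modules     -> 18 GB
print(vram_gb(256, 3))  # a 5080-class refresh           -> 24 GB
```

Which is why a mid-gen refresh can lift capacity substantially without a new die or a wider bus: swap the modules, keep everything else.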
Boo · Posted October 13 (edited)

29 minutes ago, ZachariasX said: The German outlet Heise says here (in German), what the expected specs are:

Christ - if they are saying it in German they must be serious....

Personally, I see my three choices in simpler terms than perhaps the calculations others will need to make (coming from a 3080 and going VR):

1) a used 4090 (cheapest option, but with risk)
2) a 5080 (middle ground, with a continued risk of DCS frustration)
3) a 5090 (drug-dependency expensive, and with a risk that I'll still end up frustrated by all the other nonsense that comes attached to flight sim code)

Oddly, having passed on the 4000 series, I almost feel blessed by this.

Edited October 13 by Boo
ZachariasX (Author) · Posted October 13

43 minutes ago, Aapje said: If you are thinking about the 5070, it may be wise to wait for a mid-gen refresh, which is likely to get the 3 GB RAM modules.

This might be an especially redeeming factor for the 5080.
Major Lee · Posted October 13

Nvidia, the video cards everyone loves to hate...
DD_Arthur · Posted October 14

On 10/12/2024 at 12:53 PM, ZachariasX said: On a flat screen, you get nothing by buying more than one of the better flavours of the 4070.

Amen to that. Just love my 4070 Ti Super. Best of all, by avoiding a 4090 I also have the budget for the next thirty months' worth of takeaway curry ring-fenced.
Boo · Posted October 15

It rumbles on: https://www.techradar.com/computing/gpu/nvidia-rtx-5090-gpu-price-hike-could-be-much-less-painful-than-a-previous-rumor-suggested

For the click-averse: basically, the source that Moore's Law is Dead quoted in his original headline-grabbing video is now saying such news is "fake" (ouch!) and that we may more likely see a $100-$200 increase for the 5090 over the 4090. The source makes no mention of the 5080 or 5070 price-wise. Which is great news for everyone who likes nothing more than looking at images of sold-out stock in stores 🙂

As an aside, being in the UK, I opted for an FE when I got my 3080, as I personally don't overclock anything anymore. A little patience and a handy Telegram channel that told me when Scan had stock allowed me to get the 3080 for RRP, at a time when everyone else was either paying £300-400 more or fuming at the empty shelves. I know the FE isn't for everyone, but for those that can live with vanilla, I imagine the same or similar will be around for the 5000 series. If I come across it, I'll post it up.
Chief_Mouser · Posted October 15

On 10/12/2024 at 11:34 AM, Boo said: If MSFS2024 introduces eye tracking (they likely won't, given TrackIR is still considered radical tech over a hat switch to most MSFS customers) then I think that combo would see me happy and £1200 richer for a good couple of years.

There is native eye tracking in MSFS2020 already, so it damn well better be there in 2024! The Tobii Eye Tracker 5 works just fine - in fact too well for most folks (myself included), as the eye tracking isn't really necessary and gets turned off, relying on good old head movement instead. TrackIR is well past its sell-by date; I only need it for IL-2 BoX, and that can be got around by using additional software with the Tobii. Where Tobii support is going to be a must-have is here, in Combat Pilot. 🍻
Boo · Posted October 15

4 minutes ago, Chief_Mouser said: There is native eye tracking in MSFS2020 already …

My bad - I meant the dynamic foveated rendering whatnot that DCS does with the Crystal.
CPT Crunch · Posted October 15

They need to change the whole arrangement with these cards. It's ridiculous that they consume the entire board's acreage and negate all your slots - why do they even bother making slots any more? You can forget about a multi-purpose machine running one of these monster cards. It also limits you on boards if you require lots of USB ports for your gear. You get a single-use machine at three times the cost; not sustainable in the long run for our sport.
kestrel79 · Posted October 16

Yeah, I was kinda hoping for some newer tech to make these things more powerful yet consume less energy. But it seems like we just need bigger cases and bigger CPUs the last couple of gens.
Mysticpuma · Posted October 16

11 minutes ago, kestrel79 said: Yeah I was kinda hoping for some newer tech to have these things be more powerful, yet consume less energy. …

It can't be long before GPUs have their own case and PSU?
ZachariasX (Author) · Posted October 16

1 hour ago, Mysticpuma said: It can't be long before GPUs have their own case and PSU?

The 110V folks can't have more than what is cutting edge today anyway. Nobody will go past 600W. Things will stay as they are.
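For what it's worth, the wall-socket arithmetic behind the 110V remark can be sketched out. The 80% continuous-load derating is the usual rule of thumb for North American circuits; every component figure below is a rough assumption, not a measurement:

```python
# Back-of-envelope: why wall power caps GPU growth on a typical
# North American 120 V / 15 A circuit. The 80% continuous-load
# derating is the usual rule of thumb; all component figures are
# rough assumptions for illustration.

CIRCUIT_W = 120 * 15 * 0.8          # ~1440 W usable

gpu = 600                            # 5090-class board power (assumed)
cpu_and_board = 350                  # high-end CPU + platform (assumed)
psu_loss = (gpu + cpu_and_board) * (1 / 0.90 - 1)  # ~90% efficient PSU
monitor_and_misc = 100               # display, peripherals (assumed)

total = gpu + cpu_and_board + psu_loss + monitor_and_misc
print(f"Draw: ~{total:.0f} W of ~{CIRCUIT_W:.0f} W available")
```

Under these assumptions a 600 W card still fits, but not by a wide margin once the rest of the rig and PSU losses are counted, which is one way to read Zak's claim that nobody will push much past 600 W for a consumer card.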
IckyAtlas · Posted Friday at 12:04 AM

I am scratching my head. Are these 5090s real? Does Nvidia really want to sell them to us gamers? Not sure. The GPU of the 5090 has a lot of problems in manufacturing, and yields seem very bad due to an organic substrate they use for cost reasons. Nvidia has reached a kind of limit here; new technology is maybe needed, like a glass substrate or so. Low yields will result in very high prices. For those that have deep pockets and plan to buy, I would recommend waiting to see all the kinds of issues that will pop up with the first series out.
Simfan1998 · Posted Friday at 07:27 AM

On 10/15/2024 at 4:46 PM, CPT Crunch said: They need to change the whole arraignment with these cards … You get a single use machine at three times the cost, not sustainable in the long run for our sport.

Totally agree - three slots taken. The only fix is to remove those monster heatsinks and replace them with watercooling; you win in height and length, but that induces more cost ($$$) and a need for a big case with plenty of cooling mounting points. The other possibility is that case manufacturers change their layout and set the graphics card on the side, like the Lian Li SUP01. Then we can reuse the PCI slots on the mobo.
Boo · Posted Friday at 07:58 AM

7 hours ago, IckyAtlas said: For those that have deep pockets and plan to buy I would recommend waiting to see all kind of issues that will pop up with the first series out.

That's almost a given anyhow, given the usually low availability in the first few months. I've seen storm clouds predicted for nearly every generation of RTX, yet in reality few catastrophes other than outliers. Whilst I have no doubt Nvidia wants every organ it can harvest from me, I do have some faith that, at least in their FE cards, there is sufficient QC and headroom to indemnify me somewhat. Perhaps that faith is misplaced but, over the past 30 years, only my own stupidity in not surge-protecting my PC has gotten me (and that I have sorted).

All this considered (including the tamed pricing predictions), I am likely to opt for a 5090, simply because its lifetime in my PC will see me into VR and possibly even the next generation of VR, with (pinch of salt) potentially DP2.1. I get the arguments for not needing more than 16 GB of VRAM for gaming, but I also play DCS, where logic simply doesn't apply. Additionally, given the logic of the next gen's mid-range matching the last gen's high end, it'll see me comfortably through perhaps 4-6 years, especially given the limits the tech appears to be approaching in power and form, and it may even work out cheaper than buying two 80-series cards over that time.
ZachariasX (Author) · Posted Friday at 09:42 AM (edited)

3 hours ago, Boo said: Additionally, the given logic of the next gen's mid range matching the last gens high end,

…the given reality of the next gen's mid-range matching the last gen's mid-range… (Corrected that for you.)

Edited Friday at 11:25 AM by ZachariasX