15 comments

  • lemoncookiechip 1 hour ago
    No worries, gamers.

    You can subscribe to our GeForce NOW service to rent a top of the line card through our cloud service for the low low price of 11€$£ or 22€$£ a month with *almost no restrictions.

    *Except for all the restrictions.

    • merpkz 1 hour ago
      I just bought a 5070 Ti a week ago and can attest I have used it for maybe 3-4 hours since then. It makes me wonder whether I should have rented the compute instead of paying 900 EUR on the spot - that's like 3 years' worth of rent.
      • observationist 1 hour ago
        If the compute is the unit of value under consideration, maybe. But there's more - you have access, freedom from supervision, the capability to modify, upgrade, tweak, adjust anything you want, resell compute under p2p cloud services when idle, etc. And then if the market for these gets hot, you can sell and recoup your initial costs and then some. The freedom and opportunity benefit - as opposed to the dependence and opportunity cost of renting - is where I personally think you come out on top.
      • throwaway2027 1 hour ago
        Because it'll set a precedent and eventually kill off the ability to own hardware and run things locally in the future.
      • Aurornis 1 hour ago
        The GeForce Now service is actually a decent deal for casual gamers.

        The hardcore and frequent gamers won’t like it but it was never really for them.

        • theodric 1 hour ago
          So is a Steam Deck, really
        • close04 51 minutes ago
          The problem is that they're always a great deal, the best even, while there are alternatives. The noose tightens only after everyone is onboard.

          And the competition on the GPU market is soft to say the least.

      • wongarsu 1 hour ago
        The correct calculation is not 900€/36 months but (900€-$resell_value)/36 months. If you sell your GPU for 450€ after three years you saved a good bit of money. If the AI bubble doesn't pop, your resale value might even be a good bit higher than that. I had a used 1080 Ti that I used for five years and then sold for nearly the same price, making it effectively free (minus electricity use and opportunity cost).
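
        Back-of-the-envelope, in Python (a rough sketch using the figures above; the 450€ resale value and the ~22€/month cloud price are assumptions, not quotes):

            # effective monthly cost of owning, net of resale
            buy_price = 900          # EUR, 5070 Ti street price
            resale = 450             # EUR, assumed resale value after 3 years
            months = 36
            own_monthly = (buy_price - resale) / months  # = 12.5 EUR/month
            # vs. ~22 EUR/month for a top-tier cloud plan
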
        • fluoridation 33 minutes ago
          Even if you don't resell it, at the end of the three years you still have a GPU that you can keep using, or gift, or whatever. After three years of renting, you have nothing.
      • Fire-Dragon-DoL 44 minutes ago
        Yes! Then you want to play one of the FromSoftware games and you are doomed.

        Damn nvidia

      • threetonesun 1 hour ago
        Currently after 3 years the price of the GPU if you decide to sell it might be a wash, much like it was after the crypto boom. Granted you have to pay for electricity to run it, but you also have full control over what it runs.
      • dymk 1 hour ago
        It costs 900 EUR because Nvidia is shafting you
      • dyauspitr 1 hour ago
        Where the hell is 900 eur 3 years of rent
        • diab0lic 1 hour ago
          Haha. I read it the same way the first time. The commenter means 3 years of renting a GPU from Nvidia via cloud services.
        • close04 1 hour ago
          I think that's comparing to 3 years of GeForce Now at ~22 EUR/month for the Ultimate plan, for a total of ~800 EUR. For someone using it 3h/week, you might as well go for the free plan and pay nothing. But while owning only has a financial cost, renting has a hidden cost on top of that: it leads to "atrophy" of the ownership right, and once you lose that option you'll never get it back. That will have incalculable costs.
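
          Rough numbers (a sketch; it assumes the ~22 EUR/month Ultimate price holds for all three years):

              # total rent over 3 years vs. buying outright
              rent_total = 22 * 36   # = 792 EUR, roughly the 900 EUR card price
              # renting only comes out ahead if the bought card would resell
              # for less than 900 - 792 = 108 EUR after three years
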
    • PunchyHamster 1 hour ago
      It's 33 eurodollar now. I'm sorry I meant 44
      • lemoncookiechip 1 hour ago
        China will save us, except no, we'll just ban their hardware sales, sucks to suck.
        • fc417fc802 1 hour ago
          Does China have any cutting edge fabs yet? I thought it was still just TSMC, Samsung, and maybe Intel.

          Maybe consumer electronics will move backwards by a process node or two?

          • lemoncookiechip 1 hour ago
            They've only recently been able to reverse engineer ASML's EUV machines. They're years and years behind, although the way things are moving, with hardware prices skyrocketing (RAM, SSDs, GPUs...), regular consumers won't have much choice in anything anyway for a while.
            • fc417fc802 26 minutes ago
              So at a rough guess would that be expected to put them on the equivalent of TSMC 7 nm a few years from now?

              I wonder if a bunch of consumer electronics will move back to something like 12 nm for a while? Seems like there's a lot of capacity in that range. Zen 2 wasn't so bad, right?

  • ecshafer 1 hour ago
    Crucial shut down, Nvidia not producing consumer cards. Even with AMD cards, if there's no memory available then we can't get them either.

    RAM is 4-5x the price it was a year ago.

    Is AI going to kill the consumer computer industry?

    • Aurornis 1 hour ago
      > Nvidia not producing consumer cards.

      This is a false statement. They're still producing consumer cards. You can go buy a 5070 FE in stock on their web store at MSRP right now. You can buy a discounted 5060 from Best Buy below MSRP.

      They’re changing production priorities for a little while if the rumors are accurate.

      RAM prices have always been cyclical and prone to highs and lows. This is an especially high peak but it will pass like everything else.

      These predictions that the sky is falling are way too dramatic.

    • baal80spam 1 hour ago
      > Is AI going to kill the consumer computer industry?

      Even if it does, the death of AAA gaming is nothing I will cry about. Most games don't require anything remotely as performant as a 5070.

      • ecshafer 1 hour ago
        I don't really play any AAA games. The only "AAA" games I've played in the past few years are Baldur's Gate 3 and Kingdom Come: Deliverance II. Mostly I play RPGs and strategy games that don't require much GPU power at all.
      • piva00 40 minutes ago
        There are niches like sim racing which require a high powered GPU if you want to run ultrawide or triple screens though.

        Just saying that your grudges with AAA games have a blast radius you might not be aware of.

      • emsign 1 hour ago
        I don't care about AAA gaming either, it's stale, but one day the AI bubble will kill something you will cry about.
    • fc417fc802 1 hour ago
      That's one possibility. Another is that it's temporary until production can be ramped up (but I doubt it because fabs). Pessimistic take is that the suppliers expect the bubble to pop soon (and very violently) and want to maximize their take while they still can.

      Or, assuming the trend holds in the longer term, it could mean that consumers move downstream of datacenters: anyone who wants a GPU will be rocking 3 to 5 year old recycled enterprise gear.

  • throwaway2027 1 hour ago
    Maybe game companies will be forced to optimize their games and focus on innovative gameplay elements rather than graphics.
    • wongarsu 1 hour ago
      AAA game companies won't care, they'll just continue targeting the latest console. For most of them releasing on PC is a nice bonus, not a core part of their strategy
  • voidfunc 2 hours ago
    Death of PC gaming incoming.

    Happy I just bought my 5080 before Christmas. They're all on borrowed time.

    • ndiddy 52 minutes ago
      Well, if you look at the SKUs they're discontinuing, they're taking out all the lowest-end models with more VRAM to save the allocation for the higher-end models with juicier margins. For example, the 5070 Ti costs $500 less than the 5080 but both have 16 GB of VRAM. I imagine that for the near future they'll have the 5060 8GB and 5070 12GB, and 16 GB will be limited to the 5080, for consumers willing to spend $1300 on a GPU.
    • legobmw99 1 hour ago
      I am a recent 5070ti purchaser so I'm also feeling lucky, though if they exit the gaming market entirely I suspect the drivers will all go to crap soon thereafter
  • patapong 1 hour ago
    Very curious about the second order effects of the hundreds of billions poured into LLMs. Perhaps even if LLMs do not pan out, we will have a massive increase in green energy production, grid enhancements and a leap in capacity for general-purpose computing over the next few years? Or maybe that is my naive side talking...
  • etempleton 1 hour ago
    I could see Nvidia completely stepping out of the low to mid range Desktop GPU space. The margins have to be peanuts compared to their other business lines.
  • Anonyneko 1 hour ago
    Bought a slightly overpriced (even by its own standards) 5090 in May; I hope it lasts me through the next five years of madness, and that the madness gets some kind of temporary respite (or I luck out on a higher-paying job, or figure out how to invest properly - it seems like a lottery these days).

    My only small regret is that I decided to build an SFF PC, otherwise I would've gone for 128 GB of RAM instead of just 64. Oh well, ̶6̶4̶0̶ ̶K̶B̶ 64 GB should be enough for most purposes.

  • emsign 1 hour ago
    Rumors have it they'll stop producing gaming GPUs altogether. :(
  • 827a 1 hour ago
    IMO, sadly: the DIY PC world is on life support, and by 2028 building a top-of-the-line machine yourself likely won't even be possible.

    I don't necessarily think everything is going doomer "subscription-based cloud streaming"; the economics of those services never made sense, especially for gaming, and there's little reason to believe the same incentives that led to Nvidia, Crucial, etc. wanting out of the consumer hardware business wouldn't hit that business too.

    Instead, the future is tightly integrated single-board computers (e.g. Framework Desktop, the new HP keyboard, Mac Mini, RPi, etc). They're easier for consumers to buy. Integrated memory, GPU, and cooling means we can drive higher performance. All of the components getting sourced by one supplier means the whole "X is leaving the consumer market" point is moot, and allows better bulk deals to be negotiated. They're smaller. And it allows one company (e.g. Framework) to capture more margin, instead of sharing with ten GPU or memory middlemen who just slap a sports-car-looking cooler on whatever they bought from Micron and call it a real business.

    My lingering hope is that we do see some company succeed who can direct-sell these high-end SBCs to consumers, so if you want to go the route of a custom case and such, you still can. And that we don't lose modular storage. But I've lost all hope that DIY PCs will survive this decade; to be frank, they haven't made economic sense for a while.

    • fc417fc802 1 hour ago
      > All of the components getting sourced by one supplier means the whole "X is leaving the consumer market" point is moot, and allows better bulk deals to be negotiated.

      I don't think that checks out. The fabs are booked out AFAIU. This is going to hit SoCs (and anything else you can come up with) sooner rather than later, because it all depends on the same fabs producing the same silicon at the end of the day. It's just packaged differently.

      They left the consumer market due to the price difference. It's not that there aren't middlemen willing to purchase in bulk right now. It's that the OEMs aren't willing to sell at any price because they've already sold their entire future inventory at absurd prices for the next however many months or years.

      I assume there will still be at least a few SoCs to choose from but the prices will likely be completely absurd because they will have to match the enterprise price for the components that go into them.

  • nerdjon 1 hour ago
    It will be interesting to see the long-term impact of this. The headline misses the biggest part: they (if the phrasing they use is correct) should be producing more of the lower-specced (and cheaper) 5060 8GB model.

    So while the news is not great, I think it is far from any doom and gloom if we are in fact going to be getting more 5060 cards.

    As it is, the value of the crazy higher-specced cards was questionable, with most developers targeting console specs anyway. But it does raise the question of how this might impact the next generation of consoles, and whether those will be scaled back.

    We will likely be seeing some stagnation of capability for a couple years. Maybe once the bubble pops all the work that went into AI chips can come back to gaming chips and we can have a big leap in capability.

  • zcw100 32 minutes ago
    Have some faith. If it really is an AI bubble and it pops, imagine the deals you're going to get, like when Ethereum went to PoS.
  • infecto 1 hour ago
    I don't subscribe to all of this doom and gloom. I consider myself a gamer, and to be frank I used the same computer setup from 2018 until I upgraded it in the past few months. Even with increased costs, the dollars spent per hour of entertainment are ridiculously cost-effective.
  • re-thc 1 hour ago
    Time for AMD to shine?
    • zvqcMMV6Zcr 1 hour ago
      AMD's approach to pricing was "comparable NVidia card minus $50". If price of remaining NVidia cards goes up then AMD will follow.
    • roboror 1 hour ago
      They already had worse margins so probably not unless they've been hoarding RAM. AMD also wants DC money.
      • PunchyHamster 1 hour ago
        You could have fooled me, looking at what a PITA it is to make stuff work compared to NVIDIA
        • pixl97 1 hour ago
          Wanting something, and executing on it properly are two different things.
    • whatevaa 1 hour ago
      Knowing AMD, shine by doing the same.
      • keyringlight 1 hour ago
        My impression for the past decade or so is that Radeon for PC is AMD's way of keeping their GPU division's engine running between contracts for consoles or compute products. At the very least it's a welcome byproduct that provides a competent iGPU and a test bed for future 'main' products. It's been a long while since AMD has shown future vision for PC GPUs or they've led with a feature instead of following what others do.
        • re-thc 6 minutes ago
          > My impression for the past decade or so is that Radeon for PC is AMD's way of keeping their GPU division's engine running

          During this time AMD was focused on CPUs. They've already said that they'll focus more on GPUs now (since CPUs are way ahead and AI is a thing) so this should change things.

    • MrBuddyCasino 1 hour ago
      AMD avoids a price war with Nvidia for the simple reason that Nvidia has much, much more cash and will win this war, easily.
  • newsclues 1 hour ago
    Gamers are going to burn DC to the ground.
  • xnx 1 hour ago
    From a resource allocation perspective it's crazy that millions of valuable GPUs (and memory!) are sitting in personal computers and game consoles unused.
    • nerdjon 1 hour ago
      Do you mean when the computer is not in use or “unused” in the sense that even when gaming it is just being used for gaming and not something “productive”.

      Those are 2 very different arguments, and it's not fully clear which you are trying to make.

      • xnx 56 minutes ago
        > Do you mean when the computer is not in use

        This. No judgement on any particular use. Just worth a reminder that the most advanced machines ever produced are these magic rocks that sit there idle most of the time.

        • fluoridation 37 minutes ago
          It'd probably be unwise to utilize every machine ever produced at 100%, in terms of waste heat, but also just simple wear and tear.
    • Anonyneko 1 hour ago
      If it was easy enough to rent my desktop while I'm not using it (such that I can get it back whenever I need it myself, within a few minutes at most), I would happily do it.
    • CivBase 1 hour ago
      You could say that about literally anything: food, housing, fuel, heat, water. There are always solutions for better optimizing global resource allocation - especially if you're willing to ignore the wants and rights of the people.