• snooggums@lemmy.world · 1 month ago

    Originally, the leaks had pinned the price of the RTX 5090 at $1999, leaving many of us largely unsurprised at Nvidia continuing to push the price of its flagship card even higher. That’s not only because it maintains a market-leading position – with AMD expected not to even try and compete at this extreme end of the GPU market this generation – but also because of the huge uptick in the specification of this new GPU.

    I remember back in the day when they made huge leaps every generation, but the prices remained fairly stable instead of increasing by 33%. This is all due to lack of competition and profit seeking, not technical improvements.

    • sunzu2@thebrainbin.org · 1 month ago

      Merchants will always charge the highest price they can extract; this is pricing 101. Fuck your economies of scale, etc.

    • DarkThoughts@fedia.io · 1 month ago

      This is all due to lack of competition and profit seeking, not technical improvements.

      And people buying it anyway instead of sticking to actually reasonably priced products.

    • Ptsf@lemmy.world · 30 days ago

      Not to defend Nvidia entirely, but there used to be real cost savings from die shrinks back in the day, since process node improvements allowed a substantial increase in transistor density. Improvements in recent years have been smaller, so now they have to use larger and larger dies to increase performance despite the process improvements. That leads to things like the 400 W 4090 (even though it is significantly more efficient per watt), and it means they get fewer GPUs per silicon wafer, since wafer sizes are standardized for the extremely specialized chip manufacturing equipment. Fewer dies per wafer means higher per-chip costs by a pretty big factor. That being said, they’re certainly… “proud of their work”.
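
      A rough sketch of that dies-per-wafer arithmetic, using the common gross-dies approximation; the die sizes below are hypothetical round numbers for illustration, not actual Nvidia figures:

      ```python
      import math

      def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
          """Rough gross dies per wafer, ignoring defects and yield.

          Common approximation: pi*r^2/A - pi*d/sqrt(2*A),
          where the second term accounts for partial dies lost at the wafer edge.
          """
          radius = wafer_diameter_mm / 2
          return int(math.pi * radius**2 / die_area_mm2
                     - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

      # Hypothetical die sizes, just to show the scaling on a standard 300 mm wafer:
      print(dies_per_wafer(300.0))  # ~197 dies
      print(dies_per_wafer(600.0))  # ~90 dies
      ```

      So a die twice as big yields well under half as many chips per wafer, before yield losses (which also get worse as dies grow) are even counted.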

    • Dnb@lemmy.dbzer0.com · 30 days ago

      Lack of competition, or people showing they’ll pay insane prices, so Nvidia is just meeting that demand?

  • Lucy :3@feddit.org · 1 month ago

    Ok, but fuck Nvidia. What about AMD? Will there be a card that at least matches the 7900 XTX? Will there be an 8900?

    • Fusty@lemmy.ml · 1 month ago

      No, there will not be. The RX 8000 series will top out at the midrange; there will be no successor to the 7900 XTX. Maybe the 9000 series in 2027 will have a 9900 XT. I wish there were an 8800 XT for $500 to put pressure on Nvidia, but with the Blackwell 50 series using GDDR7 and the RX 8000 using GDDR6, an RX 8800 XT at $500 might not make much difference.

      • bitwaba@lemmy.world · 1 month ago

        The problem is that “midrange” is defined relative to the high end, and the 5090 is so insane that it drags that line up. If AMD makes an 8800-level card, it’ll be “midrange” on that extremely wide spectrum of performance, but it’s still an upper-third card.

        • Fusty@lemmy.ml · 1 month ago

          I agree that an 8800 is higher than midrange. My standard is that any GPU that can do 1440p at 144 fps on ultra in a three-year-old game is above midrange.

          I believe for GPU selection, a lot of people would be happy with 1080p 120 fps path tracing on ultra. If graphical quality were the most important thing about gaming, like some PC gamers make it out to be, no consoles would sell.

          Since there is a very large market of people who haven’t bought a new release within the last 2 years, there are a lot of people who would enjoy ray tracing/path tracing at 120 fps at 1080p to 1440p, but not more or higher… The mass market is years away from 4K 144 Hz.

  • ditty@lemm.ee · 1 month ago

    Journalists […] suggest the upcoming GPU could cost around $1900. Manufacturers have allegedly been told anywhere from $1899 to $1999 will be the expected range, lining up with pricing rumors from last month.

    For reference, the RTX 4090 launched at $1599 for its Founder’s Edition but has since crept up to nearly $2000 or more for overclocked cards.

    Ouch. I was a sap who built my first gaming rig in 2015, and I thought I was dumb for buying a Titan X (Maxwell) for $999. Hard to fathom paying double that for one GPU.

  • Artyom@lemm.ee · 1 month ago

    Honestly, specs hardly matter for top-of-the-line GPUs. The flops are basically infinite, and gaming has visually stagnated to the point where you’d never know the difference. I just bought a new card and focused mostly on VRAM per dollar, which probably has a stronger correlation with hardware longevity.
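
    A minimal sketch of that VRAM-per-dollar metric; the cards and prices below are made up, purely for illustration:

    ```python
    # Hypothetical cards and prices, purely for illustration.
    cards = {
        "Card A": {"vram_gb": 16, "price_usd": 500},
        "Card B": {"vram_gb": 24, "price_usd": 1000},
        "Card C": {"vram_gb": 12, "price_usd": 300},
    }

    # Rank by GB of VRAM per dollar spent (higher is better value by this metric).
    for name, c in sorted(cards.items(),
                          key=lambda kv: kv[1]["vram_gb"] / kv[1]["price_usd"],
                          reverse=True):
        gb_per_100usd = 100 * c["vram_gb"] / c["price_usd"]
        print(f"{name}: {gb_per_100usd:.1f} GB of VRAM per $100")
    ```

    It’s a blunt metric that ignores raw performance, but it’s an easy tiebreaker between cards in the same tier.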

    • kyle@lemm.ee · 1 month ago

      I eventually upgraded from my 1080 to a 3070: joined a waitlist with EVGA and got it at MSRP like 2 years after release, which took forever. I kinda wanted to get a new card every other generation, but not at these prices.

  • resetbypeer@lemmy.world · 27 days ago

    AMD won’t do anything until they move to UDNA around 2026. I hope AMD can take enough midrange market share. Heck, I’m even rooting for Intel in the GPU market. We need competition.

  • b34k@lemmy.world · 1 month ago

    Sweet, guess my 4090 FE isn’t going to depreciate in value anytime soon.