- cross-posted to:
- hackernews@derp.foo
2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.
Given technological progress and efficiency improvements, I would argue that 2023 was the year the GPU ran backwards. We’ve been in a rut since 2020… and arguably since the 2018 crypto explosion.
Nah, 2022 was when it ran backwards far more. 2023 was a slight recovery, but still worse than 2021.
I feel the same way. I don’t have the data to prove it.
Anecdotal evidence is still data
No, it’s a datum - about how people feel
Performance numbers are easy to find. The prices have not been great, and the 4060 is held back by its reduced memory bandwidth, but it’s a performance increase nevertheless. The flagship product, the one that shows what is currently possible in terms of GPU power, did show a remarkable improvement in top performance.
I’m more salty about AMD not supporting AI workloads on their consumer GPUs. Yes, ROCm exists and it will work on quite a few cards, but officially it’s not supported. This is a major reason why Nvidia is still the only serious player in town.
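For what it’s worth, here’s a quick way to see whether your setup even picks up the card. This is a minimal sketch assuming a PyTorch build installed from the ROCm wheels; whether it actually works on a given consumer card is exactly the unofficial part being complained about.

```python
import torch

# ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda API,
# so this check works for both AMD and Nvidia installs.
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    # torch.version.hip is set on ROCm builds, torch.version.cuda on CUDA builds.
    print("HIP/ROCm version:", torch.version.hip)
    print("CUDA version:", torch.version.cuda)
else:
    print("No supported GPU found; falling back to CPU.")
```

On an unsupported consumer card this can report nothing at all (or crash) even though the hardware is perfectly capable, which is the frustrating part.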
Yeah, AMD just doesn’t seem to want to market AI on consumer hardware to devs. They have a Ryzen chip line with built-in dedicated "NPU"s now, but honestly the disconnect between AI on their GPUs and a focus on Windows, even for development, just makes it feel clunky.
♪♪ I say data, you say datum ♪♪
OK, I thought it was common knowledge, but maybe I should specify.
Datum is the singular form of data; data is a collection of many individual data points. If you have ten thousand anecdotes, they do in fact become statistically significant.
It is just my impression of things.