• DdCno1@beehaw.org · 13 days ago

    To illustrate your point, my old GPU, a GTX 1080 from 2016 (basically ancient history - Obama was still president back then), remains very useful for ML applications today - and it isn't even Nvidia's oldest card that is still relevant for AI. This card was never meant for this, but because Nvidia invested in CUDA, and CUDA proved useful for all sorts of non-gaming applications, the API became the natural first choice when ML tools that run on consumer hardware started being developed.

    My current GPU, an RTX 2080, is just two years younger, and yet it's powerful enough for everything I throw at it (including ML) that I won't have to upgrade it for years to come.