Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would be willing to pay extra for hardware with AI capabilities, TechPowerUp put the question to its forum users.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.
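
A quick back-of-the-envelope check of those rounded totals (not TechPowerUp’s exact counts) reproduces the reported split:

```python
# Sanity check of the poll split, using the rounded totals reported above.
no, unsure, yes = 22_000, 2_200, 2_000
total = no + unsure + yes

for label, count in [("no", no), ("don't know", unsure), ("yes", yes)]:
    print(f"{label}: {count / total:.1%}")
# no: 84.0%, don't know: 8.4%, yes: 7.6%
```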

  • MrAlternateTape@lemm.ee · 7 months ago

    I have no clue why anybody thought I would pay more for hardware just because it follows some stupid trend that will blow up in our faces sooner or later.

    I don’t get the AI hype. I see a lot of companies getting very excited, but I don’t believe it can deliver even 30% of what people seem to think.

    So no, definitely not paying extra. If I can, I will buy stuff without AI bullshit. And if I cannot, I will simply not upgrade for a couple of years since my current hardware is fine.

    In a couple of years either the bubble is going to burst, or they really have put in the work to make AI do the things they claim it will.

  • daniskarma@lemmy.dbzer0.com · 7 months ago

    I would pay for a power-efficient AI expansion card, so I could self-host AI services easily without needing a 3000€ GPU that consumes ten times more power than the rest of my PC.
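
    For what it’s worth, CPU-only self-hosting is already workable for small models. Here’s a minimal sketch using the llama-cpp-python bindings with a quantized GGUF model; the model path is a placeholder, not a recommendation:

    ```python
    # Minimal CPU-only local inference sketch (llama-cpp-python).
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/llama-7b-q4.gguf",  # placeholder: any quantized GGUF
        n_gpu_layers=0,   # 0 = run entirely on the CPU, no big GPU needed
        n_ctx=2048,       # modest context window to keep RAM use reasonable
    )

    out = llm("Q: Why self-host AI services? A:", max_tokens=64)
    print(out["choices"][0]["text"])
    ```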

    • AA5B@lemmy.world · 7 months ago

      I would consider it a reason to upgrade my phone a year earlier than otherwise. I don’t know which AI features will stick as useful, but most likely I’ll use them from my phone, and I want there to be at least a chance of on-device AI rather than “all your data are belong to us” AI.

  • Telorand@reddthat.com · 7 months ago

    …just under 2,000 voters said “yes.”

    And those people probably work in some area related to LLMs.

    It’s practically a meme at this point:

    Nobody:

    Chip makers: People want us to add AI to our chips!

    • ozymandias117@lemmy.world · 7 months ago

      The even crazier part to me is that some chip makers we were working with pulled out of guaranteed projects with reasonably decent revenue to chase AI instead.

      We had to redesign our boards, and they paid us the penalties in our contract for not delivering, so that they could put more of their fab time towards AI.

      • nickwitha_k (he/him)@lemmy.sdf.org · 7 months ago

        That’s absolutely crazy. That’s applying the Chicago School MBA philosophy to something as time-consuming and expensive to set up as silicon production.

  • ClamDrinker@lemmy.world · 7 months ago

    Depends on the kind of AI enhancement. If it’s just more features nobody needs that solve no problem, it’s an easy no. But take computer graphics: DLSS is a feature people do appreciate, because it makes sense to apply AI there. Who doesn’t want faster and perhaps better graphics from AI rather than brute force, which also saves on electricity costs?

    But that isn’t the kind of thing most survey respondents would even think of, since the benefit is readily apparent and doesn’t need to be explicitly sold as “AI”. They’re most likely thinking of products where the manufacturer put an “AI powered” sticker on the box because stakeholders said it would increase sales, or because it let them overstate the product’s value.

    Of course people are going to reject white-collar scams if they think that’s what “AI enhanced” means. If legitimate use cases with clear advantages appear, they will speak for themselves, and I don’t think people would be opposed. But obviously there are far more companies wanting to ride the AI wave than there are legitimate use cases, so quite a lot of snake oil will be sold.

    • AdrianTheFrog@lemmy.world · 7 months ago (edited)

      Well, I think a lot of these CPUs come with a dedicated NPU, though I don’t know whether it would be more efficient than, say, the tensor cores on an Nvidia GPU.

      edit: whatever NPU they put in does have the advantage of being able to access your full CPU RAM, though, so I could see it being kinda useful for things beyond custom Zoom background effects.
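
      Whether software can actually reach such an NPU is its own question. A minimal sketch of one way to check, assuming ONNX Runtime is installed (provider names vary by vendor and by onnxruntime build):

      ```python
      # List the execution providers ONNX Runtime can see on this machine.
      # NPU-style providers (e.g. QNNExecutionProvider on Qualcomm hardware,
      # or DmlExecutionProvider via DirectML) only appear in builds that
      # include them -- otherwise inference falls back to the CPU.
      import onnxruntime as ort

      available = ort.get_available_providers()
      print("Available providers:", available)

      accelerators = [p for p in available if p != "CPUExecutionProvider"]
      print("Beyond the CPU:", accelerators or "none found")
      ```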

      • yamanii@lemmy.world · 7 months ago

        But isn’t RAM slower than a GPU’s VRAM? Last year people were complaining that local models suddenly became very slow on the same GPU. It turned out that a new Nvidia driver had automatically enabled a setting that lets the GPU spill into system RAM when VRAM fills up. That annoyed people trying to run bigger models, since a crash (and a retry at lower settings) would have been preferable to the longer generation times the regular RAM added.

        • AdrianTheFrog@lemmy.world · 7 months ago

          RAM is slower than GPU VRAM, but that extreme slowdown is mostly due to the bottleneck of the PCIe bus the data has to cross to reach the GPU.
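
          Rough numbers make the gap plain. Token generation is close to memory-bandwidth-bound (each new token reads roughly all of the weights once), so bandwidth caps throughput. The figures below are ballpark values for illustration, not measurements:

          ```python
          # Back-of-the-envelope: why spilling model weights over PCIe hurts.
          model_gb = 8          # e.g. a 7B model quantized to ~8 GB
          vram_bw_gbs = 1000    # high-end GPU VRAM bandwidth, GB/s
          pcie4_x16_gbs = 32    # theoretical PCIe 4.0 x16 bandwidth, GB/s

          print(f"weights in VRAM:  ~{vram_bw_gbs / model_gb:.0f} tokens/s ceiling")
          print(f"weights over PCIe: ~{pcie4_x16_gbs / model_gb:.0f} tokens/s ceiling")
          # ~125 vs ~4 tokens/s -- a ~30x slowdown, matching those complaints
          ```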

  • exanime@lemmy.world · 7 months ago

    AI for IT companies is looking more and more like what 3D was for the movie industry.

    All fanfare and overhype: a small handful of examples that do seem like a solid step forward, and millions of others that are just a polished turd. Massive investment for something the market has not demanded.

  • nayminlwin@lemmy.ml · 7 months ago

    Can’t help but think of it as a scheme to steal the consumers’ compute time and offload AI training to their hardware…

  • cmrn@lemmy.world · 7 months ago

    I still don’t understand how the AI buzzword 10x’d all these valuations, when it’s always either: a) exactly what they were doing before, now with a fancy new name, or b) AI deliberately shoehorned in, in ways with no practical benefit.

  • OCATMBBL@lemmy.world · 7 months ago

    Why would I pay more for x company to have a robot half ass the work of all the employees they’re gonna cut?

    • Wogi@lemmy.world · 7 months ago

      So the trades have been unknowingly fucking with AI for decades, because of the time honored tradition of fucking with apprentices.

      A lot of forums are filled with absolutely unhinged advice, and sprinkled in there is some good advice. If you know what you’re doing, you can spot the bullshit.

      But if you don’t know anything about it, the advice seems perfectly reasonable. There’s a skill in giving unhinged advice. Literally you can’t get your master cert without convincing at least one apprentice to ask where the board stretcher is.

      Do I actually have a dedicated vise for Vaseline when I run a tap cycle or is that old timer bullshit? HOW WOULD YOU POSSIBLY KNOW??

  • bitwolf@lemmy.one · 7 months ago

    No, but I would pay good money for a freely programmable FPGA coprocessor.

    If the AI chip is implemented as one and is useful for other things, I’m sold.

    • profdc9@lemmy.world · 7 months ago

      I think manufacturers need to get a lot more creative about simplified computing. The RPi Pico’s GPIO engine is powerful yet simple, and a good example of what is possible with some good application analysis and forethought.

      • JackbyDev@programming.dev · 7 months ago

        Which part of the Pico are you referring to specifically? I’ve never heard the term “GPIO engine” before. Is it sort of like the USB stack, but for GPIO?

        • phlegmy@sh.itjust.works · 7 months ago (edited)

          I think they meant PIO (programmable IO). It’s like a small processor tied to some of the IO pins, with a very small instruction set and some state machines.
          It can be used to implement your own IO protocols without worrying about the issues that come with bit-banging from the CPU.
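
          For a flavor of it, here is a minimal MicroPython sketch, adapted from the standard rp2 blink example (pin 25 is the onboard LED on the original Pico). Once started, the state machine runs with no CPU involvement:

          ```python
          # PIO program: drive the pin high, idle, drive it low, idle.
          # Each instruction carries a 31-cycle delay, so one loop is
          # 4 * 32 = 128 PIO cycles -> roughly a 16 Hz blink at freq=2000.
          import rp2
          from machine import Pin

          @rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
          def blink():
              set(pins, 1) [31]
              nop()        [31]
              set(pins, 0) [31]
              nop()        [31]

          # Load the program into state machine 0 and start it;
          # the CPU is now completely free to do other work.
          sm = rp2.StateMachine(0, blink, freq=2000, set_base=Pin(25))
          sm.active(1)
          ```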

      • jj4211@lemmy.world · 7 months ago

        The problem for the big market is that it’s hardly profitable. Make things too easily multipurpose and you undercut your own specialized-device opportunities. Why would anyone buy a 500-dollar smart device with a monthly subscription when a 100-dollar device preloaded with a popular solution would do?

        Like when the WRT54G came out back in the day: OpenWRT basically drove Cisco to buy out Linksys and neuter the “home router” to stop it from displacing expensive products in the business sector. The WRT54G was the best product for the market, but not the best product for vendor profitability.

      • bruhduh@lemmy.world · 7 months ago

        I have a few Pi Picos, but I didn’t know about that. Can you please elaborate? I’ve been using them just like any other ESP32/STM32/ESP8266 I have.

  • t00l@lemmy.world · 7 months ago

    They want you to buy the hardware and pay the additional energy costs so they can deliver Clippy 2.0, the watching-you-wank edition.

  • T156@lemmy.world · 7 months ago (edited)

    It just doesn’t really do anything useful from a layman’s point of view, besides being a TurboCyberQuantum buzzword.

    I’ve apparently got AI hardware in my tablet, but as far as I’m aware I’ve never (or almost never) actually used it, nor had much of a use for it. Off the top of my head, I can’t think of much that would use that kind of hardware, aside from some relatively technical software that is almost as happy running on a generic CPU. Opting for AI capabilities would mean paying extra for something I’m not likely to ever use.

    And the stuff that might actually use AI is abstracted away so thoroughly as to be invisible. Maybe the autocorrect on my tablet keyboard is in fact powered by the AI hardware, but from the user’s perspective nothing has really changed from the old pre-AI keyboard, beyond additions that could just as easily have come from newer hardware or ordinary software updates rather than any specific AI magic.

  • Fedizen@lemmy.world · 7 months ago

    30% of people will believe literally anything. 16% means even half of the deranged people aren’t interested.

    • ipkpjersi@lemmy.ml · 7 months ago (edited)

      Interested or not, more hardware is going to be “AI-enhanced” and believe it or not, it’s going to cost more.

      This is our future.