I’ve seen reports and studies showing that products advertised as including or involving AI are off-putting to consumers. And this matches what almost everyone I hear IRL or online says. Regardless of whether they think that in the long term AI will be useful, problematic, or apocalyptic, nobody is impressed by Spotify offering an “AI DJ” or by “AI coffee machines”.
I understand that AI tech companies might want to promote their own AI products if they think there’s a market for them. And they might even try to create a market by hyping the possibilities of “AI”. But rebranding your existing service or algorithms as AI seems like a super dumb move: obviously stupid to tech-literate people and off-putting or scary to others. Have they just completely misjudged the world’s enthusiasm for this buzzword? Or is there some other reason?
AI has some useful applications, but most of them are a bit niche and/or have ethical issues, so while it’s worth having the tools and functionality, few people can actually do much with them.
For example, we pretty much have AIs that could generate really good audiobooks using your favourite actor’s voice likeness, but it’s a legal nightmare, and audiobooks are a niche market already.
In game development, using AI for texture generation, rigging, and animation is pretty useful and can save lots of time, but it comes at the cost of jobs.
Some useful applications for end users are things like noise removal and dynamic audio enhancement, which can stop your mic sounding like you’re talking from a tunnel under a motorway during meetings, plus basic voice activation of certain tools, and even spam filtering.
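Spam filtering is a good example of a quietly useful, decades-old application. Here’s a minimal sketch of the classic naive-Bayes approach, with made-up toy messages (real filters train on far larger corpora and do much more preprocessing):

```python
from collections import Counter
import math

# Toy training data (hypothetical): (message, is_spam)
messages = [
    ("win free money now", True),
    ("free prize claim now", True),
    ("meeting notes attached", False),
    ("lunch tomorrow at noon", False),
]

spam_words, ham_words = Counter(), Counter()
for text, is_spam in messages:
    (spam_words if is_spam else ham_words).update(text.split())

def spam_score(text):
    """Log-odds that a message is spam, with add-one smoothing."""
    vocab = set(spam_words) | set(ham_words)
    s_total, h_total = sum(spam_words.values()), sum(ham_words.values())
    score = 0.0
    for word in text.split():
        p_spam = (spam_words[word] + 1) / (s_total + len(vocab))
        p_ham = (ham_words[word] + 1) / (h_total + len(vocab))
        score += math.log(p_spam / p_ham)
    return score  # > 0 leans spam, < 0 leans ham

print(spam_score("claim your free money"))   # positive: spam-like
print(spam_score("notes from the meeting"))  # negative: ham-like
```

Unremarkable statistics, but it works well enough that nobody thinks of their spam folder as “AI” anymore.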
The whole “use AI to sidestep being creative” thing, or pretending it collates knowledge in any meaningful way, is a bit out of reach at the moment. Don’t get me wrong, it has a good go at it, but it’s not actually intelligent; it’s just throwing out lots of statistically plausible text and hoping for the best.
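The “plausible text without understanding” point can be shown in miniature. This is NOT how transformer LLMs work internally, but a toy bigram chain demonstrates the same underlying idea, predicting the next word from statistics alone (toy corpus invented for illustration):

```python
import random
from collections import defaultdict

# Toy corpus (hypothetical). Real models train on billions of words.
corpus = ("the model predicts the next word the next word is chosen "
          "by probability the model has no idea what the words mean").split()

# Build a bigram table: word -> list of words that followed it.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def babble(start, length=10, seed=0):
    """Generate text by repeatedly sampling a likely next word."""
    random.seed(seed)
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

print(babble("the"))
```

The output is locally fluent but globally meaningless, because nothing in the table knows what any word refers to. LLMs are vastly better at this game, but it’s still the same game.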
-
OpenAI struck gold, NVIDIA followed suit, and everyone else bought shovels hoping to attract investors even though they have no plan of striking gold (developing useful AI).
-
Would you like to buy a timeshare to the moon? If we all buy, you’ll be able to sell your spot for 10x the price! Don’t wait! Spots are limited!
Nvidia is the biggest shovel seller out there.
We kinda need to adapt the saying now. When someone finds gold, you need to sell wood and iron for all the shovel makers that will show up.
Nvidia sells the hardware (shovels), but also develops portions of the software to make it run more efficiently, as OpenAI does. Nobody else but Microsoft seems to be actually developing software, though AMD is slowly working towards comparable performance.
-
Attracts investors.
When people are evaluating companies and see one missing out on the current trend, how is that going to factor into their valuation of the stock?
Venture capitalists and shareholders want to make money. Company executives want to give them eternally increasing profit. Simple as that.
Money was already spent. The hype companies were backed by big capital in their early days. Now the people who provided that capital want to cash out and collect their winnings. So you will have AI shoved down your throat on every media channel those people also own. AI is a hype term that has appeared periodically since before the 2000s. This is nothing new. https://en.m.wikipedia.org/wiki/AI_winter
LLMs are toys that sparkle for a brief moment. Their value is laughable compared to their cost.
Because it attracts shareholders
They think they’ll get money.
Is why.
Unless a hyped-up investor gives it to them, they won’t.
It’s just a modern version of “Fuzzy Logic”
Think like a venture investor.
A small chance of huge growth via new technology can have a big payoff. They expect most companies to fail and are more worried about missing an opportunity than losing money in a single bad investment.
Nobody is quite sure where AI technology will be in ten years, but if it’s big, it’s going to make people who got in early very rich. It doesn’t matter that it sucks now; the web sucked in 1995, but it made people who got in (and out) at the right time very rich.
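That venture logic is just arithmetic. With purely illustrative numbers (not real fund data): a portfolio of 20 equal bets where 19 go to zero still returns a healthy multiple if a single one pays off big.

```python
# Illustrative venture-portfolio arithmetic (made-up numbers, not fund data).
# 20 equal bets: 19 go to zero, 1 returns 50x its stake.
stake = 1_000_000
n_bets = 20
returns = [0] * 19 + [50 * stake]

invested = n_bets * stake
recovered = sum(returns)
multiple = recovered / invested
print(multiple)  # 2.5x on the whole portfolio despite a 95% failure rate
```

Which is why "most AI products will flop" and "VCs keep funding AI products" are not contradictory positions.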
Because whilst technical people know it has limited applications (like blockchain), business people tend to fall for buzzwords easily, because they don’t realise a lot of the things it does were already solved in other ways.
No, they’re pretty much just dumb. In tech, this works along hype cycles where there’s gotta be some new thing all the fucking time, and it cures what ails ya and is perfect for every case. This mostly involves taking any actual merits of [new tech] and blowing them way out of proportion and context, making it the best thing since sliced bread. This invariably makes people invest because hype is more important than making sense. When the cycle for that particular tech winds down into the Trough of Disillusionment, a new one shows up.
A lot of business people also think that AI is a “force multiplier” meaning that if they use it they can get more done in less time. Anything that can do that is basically a money printer at the business level, which is why all these execs and companies are so excited about it.
The problem is it’s not, or at least it’s not reliably proven to be. All these companies are jumping on board thinking “shove some AI in there and get 20% growth” when in reality there’s no evidence it works like that. And that’s why a lot of customers are turned off: from the consumer side, AI is just sloppy unoriginal junk. But on the business side they just see “productivity is up”, never mind that the productivity is garbage quality.
It was super cool for like three weeks. Now it’s the gambler’s fallacy they’re hanging on to.
Ohhh, you fn nailed it. Nice comparison.
Money, really. Someone thinks AI is going to make money, so everyone is trying to make money by slapping “AI” on everything to see if something sticks.
Honestly, the LLM / generative AI tech is pretty cool, and it’s amazing that it even works, but it is still in its infancy. As one person put it, we’re watching a baby (AI) take its first steps, and the people with money are going “Get that baby a job!”
It’s nowhere near ready for daily driving / what it is advertised to do, but I believe the ones that are serious about making it work are hoping technological and programming innovation will come along that will make it more energy efficient and more accurate as it is used more.
They want to create some hype and look cool by using AI chatbots. And most normies don’t care about privacy and the dangers of AI in the future, they only care about “wow I can use AI for bla… bla…”
But they have no idea that one day AI could take over their jobs… and rich people like Sam Altman are getting richer, and he only pays you with ~~UBI money~~ some pieces of computing: https://x.com/tsarnick/status/1789107043825262706
Also, AI companies aim for government contracts and medium / big corpos.