Linus Torvalds reckons AI is ‘90% marketing and 10% reality’
The Linux creator is interested in AI, but the hype means he "basically ignores" it.
Linus Torvalds recently offered his opinion on the merits of artificial intelligence (AI) as we know it. The creator and lead developer of the Linux kernel didn't disappoint, showing his characteristic cynicism about the substance of the AI industry in 2024. Famous for his highly informed yet unvarnished opinions on all things tech, Torvalds grimaced as he summarized the state of the AI business as "90% marketing and 10% reality." Torvalds spoke at the Open Source Summit in Vienna earlier this month, where TFiR interviewed him; X user Tsathustra highlighted this interesting AI segment.
During the highlighted interview segment, Torvalds attempted to see the potential in AI, but relentless industry hype is taking its toll. “I think AI is really interesting, and I think it is going to change the world. And, at the same time, I hate the hype cycle so much that I really don’t want to go there,” said the tech icon.
The Linux pioneer outlined his AI hype coping mechanism: "So my approach to AI right now is I will basically ignore it because I think the whole tech industry around AI is in a very bad position (grimaces)..." However, it seems there is almost too much AI BS around for the Finn to tolerate, and it is currently "90% marketing and 10% reality." That's quite a ratio.
"Linus Torvalds says AI will change the world but it is currently 90% marketing and 10% reality, and it will take another 5 years for it to become clear what AI is really useful for" pic.twitter.com/6knFEfJbqf (October 21, 2024)
On a more positive note, Torvalds reckons there is change afoot. "In five years, things will change, and at that point we'll see what AI is getting used every day for real workloads." It seems fitting to remind readers that this isn't the first instance of an IT industry heavyweight questioning the validity of the AI industry. Just a week ago, we reported on the CEO of Baidu voicing an even more pessimistic opinion: that the AI bubble would burst and that just 1% of companies would be left to pick up the pieces after the predicted 'pop.'
The Linux godfather ended the highlighted video segment by mentioning what he believes to be the current strengths of AI. "ChatGPT makes great demonstrations (rubs forehead), and it's obviously being used in many areas, but especially in graphic design, things like that." However, Torvalds couldn't resist a last dig, reminding us: "But, I really hate the hype cycle."
Sadly, it is hard to be a tech enthusiast and ignore pervasive industry trends, which can frustrate when they seem driven by the marketing bandwagon. Still, individuals are probably best advised to follow in Torvalds' footsteps and "basically ignore" the things they don't like, concentrating instead on the many enthralling aspects and possibilities modern technology delivers.
Mark Tyson is a news editor at Tom's Hardware. He enjoys covering the full breadth of PC tech; from business and semiconductor design to products approaching the edge of reason.
chaz_music: I COMPLETELY agree with Linus. The market should first find the "must have" use, and that can drive market acceptance as well as the tech development. This does not mean making a cool picture or screwing up a term paper. It should be on the same level of importance to society as search engines or smartphones.
When I studied neural nets several decades ago, there was a common feeling in our class that this was considered esoteric tech to be used by techies only, for things like adaptive control and fuzzy logic. Fast forward to now and we finally have enough nodes to do some interesting things to images and sound. But the method that we are using to do these things is bulky, not cost effective, and certainly not optimized.
Using analog nodes takes far less power and will probably be the right final solution for many uses. Yes, that is a real thing: analog neural nets (like the ones inside your brain).
roba67: Yes, at first we were all impressed with ChatGPT, but then we found it was not very useful. The hype cycle is much larger than I expected. I think leading-edge semiconductor technology is a solution looking for a problem, and many of the large tech companies are easily fooled. Nvidia has done a good job of providing tools like CUDA, but the idea that AI can work without heuristics is nonsense.
Dantte: Agree and disagree.
It probably is mostly 90/10, but it's so new that we're in a spot where, as I like to say, "we don't know what we don't know." We need people to jump on that bandwagon, play with it, use it, break it, fix it, and figure it out. Those who do this early will waste a lot of time on the hype, but like any good pyramid scheme, those who get in early, as pointless as it may seem now, will have the most to gain when it becomes mainstream.
why_wolf: 90% sounds extremely optimistic. I agree with Li: 99% of them are pure bubble and will fail.
But even that sounds optimistic. I've yet to see AI do something that actually turns a profit (the only part that actually matters) and isn't just a relabeled service previously done by "big data" or "the algorithm."
King_V
why_wolf said: "90% sounds extremely optimistic. I agree with Li, 99% of them are pure bubble and will fail. But even that sounds optimistic. I've yet to see AI do something that actually turns a profit (the only part that actually matters) and isn't just a relabeled service previously done by 'big data' or 'the algorithm.'"
Yeah, I'm pretty much in line with this feeling. I thought Torvalds was being slightly on the optimistic side. But Torvalds and Li both have the right idea. Right now "AI" has become a marketing term, and they're applying the moniker to everything they can.
Suddenly parsing documents is "AI," and recording video or sound and transcribing the words is "AI." Even the commercials for new phones talk about how, with AI, your phone can do something new... except that the examples they give seem to be things phones could already do.
Anyone remember how we were hearing the phrase "big data" all the time? Yeah, it kinda feels like that.
mitch074
Kamen Rider Blade said: "I would argue the ratio should be closer to 95% Marketing, 5% Reality."
As cynical as he may be, Linus is basically an optimistic guy.
JamesJones44: This statement has been true for every tech bubble: the hardware bubble in the '80s, the .com bubble in the '90s, the cloud bubble in the '00s, the crypto bubble in the '10s, and the AI bubble in the '20s. Thousands of companies make every promise in the world; around 30 players, with 3 to 5 super-major players among them, will come out with real, usable products and survive when the bubble pops.
edzieba
Kamen Rider Blade said: "I would argue the ratio should be closer to 95% Marketing, 5% Reality."
5% is probably optimistic.
The big problem is that the <5% of non-nonsense applications are the ones that were already in active use before the 'deep learning' boom, mainly machine vision and data filtering.
Large Language Models, where the vast majority of investment is currently going, have basically no utility beyond toys. Image generation is in the Photoshop era (or, if you're older, the digital drum-machine era) of artistic panic: the stage where everyone thinks it will take their jobs by letting anyone push a button and receive art, before the stage where everyone figures out that you still need to be an artist to get anything of value out of it. At that point it just becomes another digital tool that some artists will use productively and others will eschew for various reasons, to various effect (e.g. the auteur directors who still demand celluloid film vs. the many directors who cannot afford its massive costs). But even then, image generation is a pretty niche use, and will be relegated to mundane non-consumer-facing applications like creating tiling non-repeating rock textures from a small number of seed textures in game engines, or mapping user-generated-avatar expressions to face models without having to make sure the individual composited face elements can tolerate all possible poses.
JRStern: Sounds about right to me!
In the next five years, it should get 50% to 80% cheaper to field a system at about the current level; it will be better understood by both vendors and customers, and it might even turn a small profit, LOL.
Probably more like ten years to see anything massively better than today, and longer than that, say 20 years, to reach something more or less human-level.
FWIW I have very little fear of any "super-intelligence", for reasons I can go on about at very great length.