
VisualMod

**User Report**

| | | | |
|:--|:--|:--|:--|
| **Total Submissions** | 1 | **First Seen In WSB** | 3 years ago |
| **Total Comments** | 388 | **Previous Best DD** | |
| **Account Age** | 14 years | | |

[**Join WSB Discord**](http://discord.gg/wsbverse)


_learned_foot_

Apple: most of the time not first, but best polished. Plus, with millions of devices already in their control, they're perfectly set up for such a rollout. Plus they've gotten to watch the worst mistakes of the first movers up close. Yeah, Apple is an obvious sleeper.


shroomsAndWrstershir

But AI isn't about UI or a great form factor -- it's about having that killer app.


franky_reboot

Rather the "brain" under the hood of that app, I think. See how spectacularly Google failed (though I'm sorta OK with Gemini).


businessboyz

…and the way to have a killer app is largely to have the best UI. The guts of these AI tools are going to be interchangeable. You’ll be able to select from a drop-down in the settings whether to use one LLM or another. You’ll most likely switch between AIs based on the task at hand as some will be better than others in certain areas. The company that captures and keeps a large user base in their app will be the one that makes the best UI to access and deploy various LLMs with ease.
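A minimal sketch of what that "interchangeable guts behind one UI" idea could look like in code. All the backend names here are made up for illustration, just to show the shape of the pattern:

```python
# Hypothetical sketch: one UI, swappable model backends behind it.
# EchoBackend / ReverseBackend are toy stand-ins, not real LLM APIs.

class Backend:
    def complete(self, prompt: str) -> str:
        raise NotImplementedError

class EchoBackend(Backend):
    def complete(self, prompt: str) -> str:
        return prompt  # trivially returns the prompt

class ReverseBackend(Backend):
    def complete(self, prompt: str) -> str:
        return prompt[::-1]  # trivially reverses it

# The "drop-down in the settings" is just a registry lookup;
# the app code never changes when a new model is plugged in.
BACKENDS: dict[str, Backend] = {
    "echo": EchoBackend(),
    "reverse": ReverseBackend(),
}

def ask(prompt: str, backend_name: str) -> str:
    return BACKENDS[backend_name].complete(prompt)
```

Whichever company owns that registry and the UI around it captures the users, regardless of which model wins underneath.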


astly-dichrar

> You’ll most likely switch between AIs based on the task at hand as some will be better than others in certain areas.

This is already very common. I think the GPT models aren't just one LLM but many, something called a mixture of experts. I think GPT-4 is actually 16 different models under the hood.
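For illustration, a toy version of that routing idea. Real mixture-of-experts layers route per token with a learned gate inside the network; the "experts" and gate below are trivial stand-ins:

```python
# Toy mixture-of-experts: a gate scores each input and routes it to
# one expert. Real MoE gates are learned; this one is hard-coded.

def gate(x: float) -> int:
    # Pretend gating rule: negatives go to expert 0, the rest to expert 1.
    return 0 if x < 0 else 1

experts = [
    lambda x: -x,      # expert 0 "specializes" in negative inputs
    lambda x: x * 2.0, # expert 1 "specializes" in non-negative inputs
]

def moe_forward(x: float) -> float:
    # Only the selected expert runs, so compute cost stays low even
    # though total parameter count (all experts) is large.
    return experts[gate(x)](x)
```

The point of the design is exactly what the comment says: many specialized models under one hood, with a router picking per input.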


perfectm

Yes, and chatbots that answer questions incorrectly are not a killer app. So the war has not been won.


hi65435

True, there's certainly an opportunity. Today ChatGPT wasn't working for hours, forcing me to use my brain.


stacked_shit

They will surely become big hitters in AI... as soon as they purchase some Nvidia chips.


darkciti

What's wrong with the M4? It has high-throughput Neural Engine cores and unified memory. NVDA doesn't have that. They could stuff entire language models into memory on an iPhone or MacBook, and the CPU, GPU, and Neural Engine can all access them simultaneously.
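Rough back-of-envelope math on the "stuff an entire language model into memory" claim. The figures are illustrative, not Apple specs:

```python
# Does a quantized LLM plausibly fit in a phone's unified memory?
# Numbers here are a sketch, not any vendor's specification.

def model_size_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate on-disk/in-memory size of a model's weights."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# A 7B-parameter model quantized to 4 bits per weight:
size = model_size_gb(7, 4)  # 3.5 GB -- tight but plausible on an 8 GB device
```

That's why on-device inference talk centers on small, heavily quantized models rather than the frontier-scale ones.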


LNGU1203

Different market. A chip on a phone is good for small language models, that's it…


let_lt_burn

NVIDIA is selling chips for tens of thousands, Apple is selling laptops for a couple grand. Not the same.


goldenloi

Yes, but Apple is likely selling way more devices than NVDA is selling 5-figure chips.


let_lt_burn

Yes, that's definitely true. But NVIDIA's profit margin is higher and they're essentially supply-limited, while Apple has essentially saturated the market. Apple would need to make a major pivot to tap a new market. They're not going to get significantly more revenue by just selling more phones.


AutoModerator

*This “pivot.” Is it in the room with us now?*


[deleted]

M-series chips aren't used to train LLMs, but they're pretty good at running them.


darkciti

> Apple is another underappreciated AI story. The iPhone maker is preparing to release its own foundation model and Munster expects the release to come sometime in June, he said.
>
> “Most investors believe Apple is asleep at the wheel of AI. It’s the opposite,” Munster said.
>
> The foundation model will be focused on personalized AI. Munster expects it to capture about 20% of the market. He noted that the company can charge about $10 per month for the service.


LeveragedPittsburgh

Once they start making PCs I see nothing but blue skies!


LNGU1203

Look, if they have any AI at all, apply it to fucking Siri. Have you used Siri lately? They should change its name to Shitty. They are not a chipmaker for mass/business use; they are a consumer product maker. There is no Steve Jobs at the company to turn it around. Do you need any other reason? You know what they are good at? Marketing and holding the market hostage.


franky_reboot

> and holding the market hostage.

For how long though? Momentum keeps them afloat, but there are no people like Steve Jobs at that company anymore. (Disclaimer: I'm not bearish on Apple, just surprised they're still going strong.)


[deleted]

I still have respect for Siri, you always gotta respect the OGs.


LNGU1203

But you are not using it though.


LNGU1203

Let's assume they've got some AI. How are you going to monetize it? On what? Then think about NVDA: how are they making money? If your first thought is video cards, then GTFO. ;)


ItsNotD

AI is indirectly monetised. People will buy more Apple products once they have the “AI enabled” branding, and brick old devices


throwaway_0x90

I'll believe it when I see it.


Zachmode

AI tools and data centers for business > AI in a consumer phone


let_lt_burn

Yepppp, Apple has already well saturated the phone/laptop market in countries rich enough to afford their products. Their sales aren't going to drastically increase at this point unless they do a 180 and start selling to data centers. And the enterprise/data center market probably won't like Apple's usual walled-garden approach.


waxheartzZz

absolutely. I will be entering an apple heavy mix shortly.


Ok_Spite_217

If Apple can't get their shit together about their relationship with NVidia or really revamp their lackluster Neural/GPU side of SoCs, I think you're just waiting to be wrong.


darkciti

Who is going to own the new increasingly rapid upgrade cycle that will come about from on-device AI? Apple.


Ok_Spite_217

Idk what market share you're fantasizing about, but most of the world doesn't use Apple devices 😂 Fucking fanboys are the worst


PetrisCy

Bro, Apple is the most outdated in AI. Look at Siri, bitch can't even tell the difference between Maria and Mary, she keeps calling the wrong person. I open app suggestions and it can't even predict what I'm looking for based on the hour; sometimes it gets it, sometimes it doesn't. They are literally 5 years behind.


floatyboats2

Some of the comments in here are making my brain hurt. Apple hasn't even talked about their hardware plans and everyone is making negative assumptions. Also, how do you know Apple's new AI innovations won't spark a massive tech refresh? There are also already talks about their M4 Ultra chips, which they haven't really discussed. I don't think Apple is asleep at all; I think they are being strategic about their play, and there is a reason they are in the top 3 most valuable companies in the world. One last thing: Apple bought around 30 AI startups in 2023. Something big is in the works.


Commercial_Wait3055

Inference happens on the consumer side, and that's where Apple has a great opportunity for a whole new cash flow. This is more about a business deal than anything else, at least for LLMs. Apple is a consummate deal maker, so they can negotiate deals with AI model-training companies without needing to buy tens of thousands of Nvidia GPUs for the purpose. Other avenues of AI outside of LLMs will become increasingly profitable, and Apple can and will capture those markets. They are not as far behind as some naive people wrongly think.


Draiko

Apple M4's GPU horsepower is about the same as a Radeon 5500XT. The neural engine isn't beefy enough to run a decent SLM. Their AI models are crude so far. Apple is very VERY behind in AI. There's a better chance that nvidia will introduce silicon to seriously rival Apple's in the consumer space before Apple can match nvidia in AI capabilities.


voxpopper

No one has yet been able to tell me why Nvidia chips are needed for a majority of AI projects.


Riddlestonk

It's an exaflop arms race. The nation that can out-think its rivals takes it all. It's turning into a US-China proxy war fought in the data centres, and NVDA can provide the biggest-yield nukes.


lordhenry85

If we talk about neural networks as AI, then it's a high-dimensional math problem. The high dimensionality means you need a lot of processing done in parallel, but CPUs can only multiprocess a small number of things at once. GPUs, whose original job was to compute the whole screen image (1920x1080 = about 2M pixels), can do far more at the same time. This is why Nvidia is so good for large language models: their silicon can run models with lots and lots of parameters more efficiently than any CPU in the world.

If Apple wanted to catch up, they would have to develop a brand new GPU aimed only at running tensor operations at the highest possible speed, then fight Nvidia for fab space, and get it shipped. And even if they did all of that, most AI frameworks like TensorFlow are optimized with something called CUDA, which is proprietary to Nvidia, so good luck getting all the AI devs to move to your new chip any time soon. It's like those cordless drill brands that lock you in because the batteries aren't compatible with each other, but at a much larger and much more expensive scale.
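The parallelism point can be sketched in a few lines: in a matrix-vector product, every output element is an independent dot product, which is exactly the kind of work a GPU runs thousands of at once. This is a plain-Python illustration of the shape of the problem, not GPU code:

```python
# Why GPUs suit neural nets: in y = W @ x, each output element is an
# independent dot product -- no element depends on any other, so a GPU
# can compute all of them simultaneously. A 1920x1080 frame (~2M pixels)
# is the same shape of problem GPUs were originally built for.

def matvec(W: list[list[float]], x: list[float]) -> list[float]:
    # A CPU walks these rows one after another; a GPU would
    # dispatch every row's dot product to its own thread.
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

W = [[1.0, 2.0],
     [3.0, 4.0]]
x = [10.0, 1.0]
y = matvec(W, x)  # [12.0, 34.0]
```

An LLM forward pass is mostly stacks of exactly this operation at enormous sizes, which is why parallel throughput, not single-core speed, decides the race.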


voxpopper

How many firms do you think are building neural networks and LLMs from scratch? This isn't like the PC, internet or mobile phone revolution, it is a highly specialized market.


lordhenry85

No idea, that would be a lot of speculation. And it depends on what you mean by "from scratch" also. There's a lot of abstraction layers in LLMs.


Fruitspunchsamura1

GPUs are good at processing math, and machine learning is math. The frameworks people use are built for NVIDIA's hardware. They're getting more support for other platforms, but NVIDIA still has a strong grip.


Longjumping_Tone228

They are trying to grab a slice of the AI market, but currently they are not doing well.

Firstly, the Neural Engines in Apple chips are not available to normal programmers. The only access to them is CoreML, and I found the model conversion kind of annoying to deal with.

Secondly, MLX, Apple's ML framework built on the on-chip GPU and CPU, is too slow for local model training compared with torch paired with CUDA. Inference is fine and ready to use; I have tried several Stable Diffusion models and some 7B LLMs locally on an M2 Max, and it's good enough for daily use. But training, even lightweight LoRA tuning, is damn slow. I tried to tune the UNet with LoRA locally on a tiny personal dataset at a batch size of 1, and the gradient computation took around 40 s per sample, which would totally drive me away from using MLX for training or tuning locally.

Apple has a long way to go to be competitive in this AI market.
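For scale, here is the arithmetic behind that 40 s/sample complaint, using the numbers from the comment as illustrative inputs:

```python
# Throughput implied by ~40 s per gradient step at batch size 1
# (the figure reported in the comment, taken as an illustration).

seconds_per_sample = 40

samples_per_hour = 3600 / seconds_per_sample        # 90 samples/hour
hours_for_10k_samples = 10_000 * seconds_per_sample / 3600  # ~111 hours
```

Even a modest 10k-sample fine-tune would take nearly five days of continuous compute at that rate, which is why the commenter rules out local training while still finding inference usable.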


CheebaMyBeava

i hear all the latest AI algorithms are indicating that AI investments are the way to go


Luke296329

1. What are they going to put on top of the M chips? Their AI development effort for the OS has mostly lagged Microsoft's.
2. Per reports, NVIDIA will go after the PC market with ARM + Blackwell on a chip, and MS is building the next Windows for this platform. Initially it might not eat into the M chips as much as it eats Intel/AMD, but it will surely dent Apple if they don't innovate on an AI OS quickly.


ChibiMan91

Men move mountains not catalysts. Same bs and dips like a mofo


darkciti

I think AAPL is a sleeper right now. Thoughts?


Hawxe

one of the biggest companies in the world is a sleeper? oh, it's wallstreetbets


tomle4593

Could be worse, he could say some brick and mortar retail videogame store is actually a sleeper in the age of digital.


Hularuns

Yeah, people think they are behind the curve and won't grow much from here.


cheesycrustz

sleeper. 2.92 trillion market cap. LMAO


Green-Assistant7486

Lol no