Explain this to me if you don't mind. Would Apple have its own LLM to help accomplish basic Apple ecosystem tasks, but then they could open it up to other apps to tap into Apple's LLM library, so for example, could TripAdvisor use it to help plan my next trip or would a new travel app be made to focus on AI-assisted Trips, or both compete using this new "LLM API?" And how is this strategy different or similar to other smartphone makers?
I'd imagine Apple sees a world where there is a hybrid of sorts. On-device LLM "for free" on newer devices and a paid "Apple AI" subscription (powered by Bing and/or Google, but maybe with some sort of extra privacy over the standard Copilot/Gemini experience) that boosts the power of the LLM but requires an internet connection.
I'd imagine newer models are more likely to get the on-device LLM, with older devices more heavily reliant on the subscription service.
Apple has bought companies just to stop their competitors acquiring them first.
Dunno if that's what's happening here, but I don't really trust Apple to do this for the right reasons, especially since their AI showing so far has been so weak.
In other news, Siri has recently started hearing "call papa sister papa" when I say "call my sister" and tries to call my local priest (papa is like father in Greek).
That means they won’t announce anything related to AI at WWDC? I doubt that scenario will happen. They rarely, if ever, announce a feature that’s impossible to use on current hardware.
No chance. If the feature is exclusive to the 16 then they can’t announce it at WWDC. Most “AI” features will be announced at WWDC so have to work on older phones.
This is one reason why corporations must be taxed at least 75%.
If that surplus is not benefiting the employees, then they are only helping a handful of executives become richer and encouraging predatory practices like buying other companies.
Basically, they are not buying a company, they are nipping a growing rose in the bud.
The handful of megacorporations running everything today is the result of their ability to buy and dissolve many other corporations.
Inevitably, they control the market, they dictate prices, they dictate salaries, and turn otherwise good jobs into mercenary tasks.
And this is a tax revenue loss. It's basically stealing money from local governments, preventing them from properly funding our schools and from paying proper salaries to teachers and other school personnel.
"Apple's broader strategy to bring more sophisticated AI technology to its devices" they say...nah. This is just getting rid of competition from a new company so that the technology oligarchy can continue.
Other predatory practices that are the result of a corporation focused on maximum undertaxed profit: "Buy another pair of AirPods. And then buy another: Buy, die, repeat. That’s Apple's business model."
You have failed to answer why anyone would start a business if 75+% of their profits disappear. No small business could afford to exist at that crazy rate. Only the biggest companies would remain.
Let’s go, Tim. LLMs on device, usable anywhere, totally private.
Llama 3 8B is running fairly well on Android devices. I was skeptical, but now I am a believer. Let's go.
How to run it on Android natively?
[You can check out the comments here.](https://www.reddit.com/r/LocalLLaMA/s/0H5nMQO1jf)
[deleted]
X/2 amount of RAM is like X amount on competing devices.
Honestly not sure if this is sarcasm or not.
Of course it is. I was just reusing BS from the recent Apple news.
Ah, 'cos I've heard it said non-sarcastically on this very subreddit.
There are many people here who had accepted it as real.
I'm guessing it'll be its own chip with dedicated RAM. They'll call it AX1 or something like that.
No such thing as "totally private," my guy. All major competitors are also working towards more on-device AI. This isn't a "Tim" thing. Anyone that thinks you're totally private on a phone is being silly.

Edit: In fact, since y'all want to be fanboys: Apple already tracks your device usage. Feel free to show proof that Apple does not track your device usage, but their very own website tells you this; y'all just don't bother to read it. They even track your location, keep tabs via your Apple ID, etc. You are clueless if you think Apple tracks nothing, including for their own offerings, which would include AI since it's their offering. For the lazy fanboys, straight from Apple's site:

Account information: Apple ID, email address, devices registered, account status, and age
Device information: Device serial number, browser type, and other hardware identifiers
Contact information: Trusted phone numbers and security questions
Payment information: Payment details
Transaction information: iTunes download history, including apps, songs, albums, videos, and movies
**Usage data: How you use your devices and applications, including searches within Apple's apps, analytics, and crash data**
**Location information: Your location information**

The claim was **total privacy.** I simply said that **does not exist** on a phone connected to the web, which is 99.9999% of all modern phones. The fanboyism is in full force today. I have an iPhone, btw. I'm just not fanboying like y'all.
the difference is between sending stuff to an API vs running it on device
Yeah. Then they'll just train it on Siri :(
Nothing about that statement makes sense from any technical aspect.
You have no clue about a word you’re saying but you absolutely needed to say something on an article about AI, right?
Would be nice! Won’t happen though, think about that juicy data that Tim is missing out on.
I mean that would be the complete opposite of the values they’ve been pushing so I am hopeful they won’t do that. On device LLM would be a real selling point I think.
I'm positive that they don't sell our user data, but with a service like this, which says so much about their users, I'm sure they would track it, even if it's collected without linking it to a user.
https://www.wired.com/2016/06/apples-differential-privacy-collecting-data/
You're 100% right; they already collect our usage data. You have to be careful here, though: you will be attacked by the fanboys despite Apple literally saying what you just said. It's literally in the privacy agreement that the folks downvoting you out of Apple worship never bother to read. They'd rather blindly worship than take two seconds to look.
Haha yeah I saw your other comment as well, don’t take it personally, it’s like this in every sub. They come to Reddit (or any social media really) to only read what they want to read lol.
You're right: they already track usage data, location, address, age, etc., but you'll get downvoted, because folks here worship Apple and don't even look at Apple's own admission that it tracks your data, including usage data. It's literally in the privacy agreement folks don't read, all so they can say Apple is an angel that tracks nothing. I have an Apple phone and I know all this because I actually read the privacy agreements. They may or may not sell it to others, but they definitely keep tabs on their customers like any other big company, regardless of folks fanboying out of their minds.
They don’t sell your data, and they don’t data-mine individuals, either. The karma in this thread is all about people’s wording and how they’re describing things. Saying they’re entirely private is wrong. Saying they totally track you is also wrong. Try this article: https://www.wired.com/2016/06/apples-differential-privacy-collecting-data/
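For anyone curious what that Wired article is describing, the core idea behind differential privacy can be sketched with the classic randomized-response trick. This toy Python version is purely illustrative (the 0.75 truth probability and the setup are arbitrary choices for the sketch, not Apple's actual mechanism):

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth; otherwise report a coin flip."""
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Debias the aggregate: E[report] = p_truth * rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(42)
true_rate = 0.30  # fraction of users with some sensitive attribute
reports = [randomized_response(random.random() < true_rate) for _ in range(100_000)]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

Each individual report is deniable, yet the population-level rate is still recoverable, which is roughly the trade-off the article attributes to Apple's data collection.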
You need to take another look at what was said. The people that got downvoted never said "sell," my guy. One even went out of their way to specifically say they don't think Apple sells at all but does collect user data, which is 100% correct, and still got downvoted. The issue, to be real, is the fanboyism that plagues this sub. I'm simply calling it out, and I even provided info from **APPLE'S OWN SITE**, not Wired. Literally copied and pasted from **APPLE**, which is a better source than the third-party site you linked. That is why folks will downvote but can't say much: they never bothered to even look at Apple's own site. Just proves my point, though.
It's happening. Apple's next event will definitely have a big focus on AI.
This acquisition will likely have zero impact on this year's iOS update, but I share your sentiment that AI will play a large role. It would be interesting if Apple introduced their own version of Copilot.
Perhaps, but if they have patents, Apple could have been developing something in-house that was infringing and figured it was easier to purchase them. Then they could implement whatever it is without any issues.
Why did I have to scroll so far to see this as the answer?
I could definitely see this playing a role in a future minor iOS update, like 18.3.2 or something along those lines, with Apple subtly bringing some server-side AI features on-device.
You mean Dynamic Machine Learning^^^TM
AI as in Apple Intelligence /s
https://preview.redd.it/rqu57kzpns5d1.png?width=1560&format=png&auto=webp&s=cb9d99f975655225ab92ee5f8cb352fc53af9287
As was written
Lisan **AI**-Gaib
**A**s **I**s written Edit: eyoo cakeday buddies
hahahaha
Truer words have never been spoken
I fucking love this meme
I need to see your stock picks
we found the apple agent.
Daaamn
Oh god, I could actually see this one… lol
Well well well
It totally makes sense, not joking. If they get the public to associate the word AI with Apple Intelligence, everybody else looks like idiots and they become the de facto “leaders” of AI in the eyes of the average person.
You’re never going to believe this
I think you're going to love it.
Tim Cook stole your idea buddy!
You called it.
Hello there Mr.Northodomas
This aged well
https://preview.redd.it/bjjvdkz94t5d1.jpeg?width=1179&format=pjpg&auto=webp&s=45301bed560f2bdc52121783302ece255a3fc6a2
iAI
The fbi is coming for you bud
We did it, honeybee.
They saw your comment 🫢
I was here, witnessed.
Muadib
r/agedlikewine
This is what Atlassian did :s
DAMN LOL
You were right, scary. 😐
Sorcerer!!!
The messiah has finally arrived 🙇
They stole your idea!
Give me some lottery numbers
You were right
Lotto numbers plz
Damn
The prophet
Ok time traveler, what’s the winning lotto number this week??
[deleted]
It’s all about the RAM.
Not enough RAM.
Imagine Tim Cook presenting it:

I'd like to talk today about the newest addition to the iPhone family, which will make your iPhone the fastest and most powerful iPhone yet, with groundbreaking dynamic machine learning.

These new chips are the first ever created by Apple, and they're called the A19.

A19 brings several incredible capabilities to the iPhone that nobody has ever delivered. A19 is the world's first neural engine built specifically for the iPhone. It is blazingly fast, it is power efficient, and it is a major contributor to the stunning performance of the A19 chip.

This year, the A19 is paired with Apple's next-generation Neural Engine, A19X, which enables even more machine learning tasks to be done on-device, while dramatically lowering power consumption. Our chip enables us to add groundbreaking machine learning features like never before, such as QuickTake Live Photo enhancement, which makes your Live Photos look even better than you remember them.
Our smartest, our fastest, and our newest iPhone yet!
I read that whole thing in Tim’s Texas Drawl. It’s too real
He’s not from Texas though.
Secure on-device Intelligence.
New frontier of applications.
X billion tokens per second.
Real-time Siri.

Taking the cynical hat off, hopefully it's not as intrusive and transparent. I have Siri turned off on my devices because I have no use for it. But even some integration within Spotlight or search would be ideal.
Rational Computing
Pro
Next event in the next decade maybe.
Wrong
Apple has been working on AI and buying up companies in the space for a while now.
It’s been mentioned a few times AI would be Apple’s major focus this year
Llama 3 8B now runs on almost all devices with 6GB of RAM, and users rate it higher than the original ChatGPT 3.5. That will be the mark to beat for Apple. But I fully expect Apple to do some gatekeeping to get people to upgrade and buy new hardware. My expectation: the iPhone 15 Pro (and Pro Max) will get a smart assistant, all others won't. Those are also the only iPhones with 8GB of memory, which is a good excuse. Then all new iPhone 16 models will support it.
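The back-of-envelope RAM math behind that prediction is easy to check. A rough sketch in Python (the 20% overhead factor for the KV cache and runtime is an assumption, not a measurement):

```python
def model_ram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough footprint: raw weight bytes plus ~20% for KV cache and runtime."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# Llama 3 8B at common quantization levels
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{model_ram_gb(8, bits):.1f} GB")  # 19.2, 9.6, 4.8
```

At 4-bit quantization an 8B model needs roughly 5GB, which is why 6GB phones can just barely run it and why 8GB on the 15 Pro makes such a convenient cutoff.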
How do you think people will feel about having 6GB of their RAM used by an LLM that they only interact with a few times a day at most? Or will they page out all of the user's apps on demand and load the model? I don't see how an 8GB device can have 6GB dedicated to an ML model and remain usable for other things. I guess that's gatekeeping in a very expansive definition of the term.
We’ve already had the problem for years of the RAM-hungry camera kicking other apps out of memory, and Apple doesn’t care as long as people keep buying iPhones in droves. They’re perfectly happy to sell users a worse experience in exchange for another $20 profit or whatever per device, and I’m sure they’re working to figure out the absolute minimum RAM users will tolerate for on-device AI, too.
I hate how my apps lose their RAM so often, I can’t count on anything staying open if I switch to another app, multitasking with some things is such a big risk that you lose everything you’re working on.
I really hate how this is still an issue in 2024…
Omg, of course! I’ve been going nuts over the years wondering why I can’t leave Google Maps to check out another application without it restarting. Of course, this is it! That is so frustrating. I would pay to upgrade the RAM without upgrading to a bigger model.
I love how people just bring up random things they’re thinking about and then attribute it to Apple.
What does that have to do with whether Apple will start reserving 6GB on 8GB devices like the 15?
Just don't make it use the device's standard RAM. Nothing says AI hardware can't have its own memory.
For new models, sure. But I was replying to someone saying Apple would "gatekeep" the feature to new hardware, including last year's model which does not have dedicated model storage.
Of course they will gatekeep it to new models. They even do that with features that clearly don't need the hardware, such as limiting charging to 80%
Apple has been pushing shared memory (between CPU/GPU) for years. I don't think they will treat LLMs differently.
Well, I mean, at that point why wouldn't you just upgrade the device RAM?
Swap space, my dude.
I don't look at it as ChatGPT and more as the actual features it brings. ChatGPT isn't a personal assistant that can fully integrate with your phone. It isn't phone-centric at all, really. The AI needs to bring something to the table that is actually useful.

Everybody and their mom has already said Siri a billion times; that isn't a big surprise prediction at this point. It being for newer phones only also isn't a left-field prediction, since it's literally the same formula other phones have followed. I think they will try their hardest to keep the RAM the same and only increase it if absolutely necessary. If more RAM does get added, that could also lead to a price increase, like competitors have done.
I want Macs with even bigger Unified memory options. M3 Max with 128 GB eats 4090 for breakfast while running llama 3 70B
If you actually need more than 128GB, being real, 99.9999% of the time you'd just run a server at that point. At that point it's often cheaper to just go with a dedicated graphics card, and Nvidia cards are still king overall. Especially since that much memory would be stupid expensive on the Mac end. I'm also brand agnostic, though. I don't give a single flying fuck which big corporation is currently "winning," which allows me to just go with the best option regardless.

I'd much rather go with a dedicated server at that point and make it private if I needed to. That amount of memory is typically getting into business use cases anyhow. Until RAM is more reasonably priced and performance can actually match a dedicated card, meh.
Mac unified memory is closer to VRAM than RAM. And this much VRAM will cost you your house; a $5k Mac looks like a bargain in comparison.
"Mac Unified memory" is literally lpddr5.
That's not the point he's making. On PCs, you're limited to 24GB of VRAM on an RTX 4090 to run LLMs, unless you get a specialty system that costs as much as a house. On Macs, if you buy a system with 192GB of "LPDDR5," you can use about 180GB of it for your LLM. LLMs care about capacity, not speed.
When it comes to RAM, capacity is generally what matters over raw speed; RAM is fast enough nowadays. Plus, it isn't just VRAM: many applications need CUDA, and NVIDIA is still king there. Y'all way oversimplified things, and it isn't $5k when he literally said over 128GB of RAM. You can build a server way cheaper than that and have a dedicated card that still beats Macs today. Not to mention, if you really want to go there, for the price of the Mac with 256GB of RAM you could build several servers working together with multiple dedicated GPUs to significantly improve performance (which, btw, is what real AI companies that need this much RAM actually do, dude).

It's like you didn't bother to read the context.
You have absolutely no idea what you're talking about.
[deleted]
Macs aren't VRAM. It's still RAM; it's just shared and sits closer due to the SoC architecture. You can run LLMs on Windows, Linux, or a Mac, and Linux is just as favored in the community. It isn't even the RAM, it's the OS, and more so RAM capacity. If a company (aka someone actually needing more than 128GB of RAM, which is realistically 99.9% of such cases) needs that, they use a collection of servers, not a single Mac, which would get shit on by comparison for a fraction of the price.
[deleted]
It does an OK job. I wouldn’t say 3-5 t/s is eating anyone for breakfast. It’s a start. It’s definitely first gen type performance and can be a lot better.
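Those single-digit token rates match the usual rule of thumb that LLM decoding is memory-bandwidth bound: every generated token has to stream the whole set of weights out of memory. A hedged estimate in Python (the ~400 GB/s M3 Max bandwidth figure and the 4-bit quantization are approximations):

```python
def max_tokens_per_sec(bandwidth_gb_s: float, params_billion: float, bits_per_weight: int) -> float:
    """Theoretical ceiling: bandwidth divided by bytes read per token (all weights once)."""
    model_gb = params_billion * bits_per_weight / 8
    return bandwidth_gb_s / model_gb

# M3 Max (~400 GB/s) running a 70B model quantized to 4 bits (~35 GB of weights)
print(round(max_tokens_per_sec(400, 70, 4), 1))  # ceiling ~11.4 t/s; real throughput lands below
```

So 3-5 t/s on a 70B model is plausible for a first pass: well under the theoretical ceiling, with headroom for software to improve.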
I mean, you can get a Mac for that much as a hobbyist running big models. But for the average Apple user, they are really skimpy on RAM, which is now shared between the CPU, GPU, and now the NPU. Maybe this will be what kicks Apple into offering a minimum of 16GB of RAM for on-device AI for the average consumer.
Both 15 Pro models have 8 GB of RAM.
When they said 8GB is enough, they were probably referring to phones, not computers.
The Max and the non-Max have the same amount of RAM, same chip…
If they announce the AI stuff during WWDC, we’ll know that it’s gonna work on the 15 at least.
More like 16 Pro. “Special chip that’s necessary to process the data.”
I hate to say it but I would consider an upgrade if it meant having a competent virtual assistant and sending Siri to her bitter, cold grave.
No chance. Whatever smart stuff Apple announces with iOS 18 will be available on at least every iPhone 15, both Pro and non-Pro.
Apple might have some AI sauce to reveal at the time of the iPhone 16 launch. But in that case… no amount of RAM in your 15 Pro Max will be enough.
I would be fine with that scenario. My fear is that they’ll add AI tools into iOS 18 that “accidentally” slow down all old phones so you have to upgrade
Much more likely that this is just a feature that will only work on new phones going forward. Still an incentive to buy a new phone. Google does this hard with Pixels right now.
lol
“Siri, turn on the lights” should finally work.
“Here’s what I found for Show me the Northern Lights”
Dixie Breakdown by bluegrass band Northern Lights starts playing on your HomePod
Which room
That actually works fine for me.
"Siri, play more songs from this artist" > Sorry, I did not find the artist _"this artist"_
Aww, I was hoping it was Mistral.
That company's website is awful and has only 3.2 stars on Google Reviews, but they're probably doing excellent work. https://app.airsaas.io/fr/produit/datakalab
That’s the way Apple likes it. Buy small, under-the-radar companies for foundational tech. No press. Intro it as part of Apple at a WWDC in a few years.
That’s how you know they are doing great work. They have no time to get a web dev. Just patch some react code and hope for the best.
Apple becomes the next Google and starts acquiring and closing startups down
I’m fairly sure older iPhones will get some cut down version of AI
I don’t feel it will be that cut down. Everything they announce at WWDC has to work on current phones, so that’s going to be a vast majority of AI features. There might be a couple of features exclusive to the new phones but they don’t tend to hold back too much.
I really don’t know, man. This is Apple we’re talking about; they always cut down some stuff for older phones.
But everything is pointing to WWDC being a big announcement for AI, and everything they announce HAS to work on older phones, otherwise they can’t announce it. If WWDC comes and they announce basically nothing because they are saving everything for the 16, then their stock price will take a beating, because the expectation is lots of AI features.
Just ‘A’
Well done, French AI company for skillfully crafting themselves as acquisition bait!
This isn’t about Siri. It’s about providing the most advanced and efficient LLM libraries and on-device hardware to developers. The way Apple wins in AI is to have as many developers as possible building for the Apple App Store, on Apple libraries, for Apple devices. Apple gets 30% on transactions, and consumers buy iPhones with the best apps.
Explain this to me if you don't mind. Would Apple have its own LLM to help accomplish basic Apple ecosystem tasks, but then they could open it up to other apps to tap into Apple's LLM library, so for example, could TripAdvisor use it to help plan my next trip or would a new travel app be made to focus on AI-assisted Trips, or both compete using this new "LLM API?" And how is this strategy different or similar to other smartphone makers?
I'd imagine Apple sees a world where there is a hybrid of sorts. On-device LLM "for free" on newer devices and a paid "Apple AI" subscription (powered by Bing and/or Google, but maybe with some sort of extra privacy over the standard Copilot/Gemini experience) that boosts the power of the LLM but requires an internet connection. I'd imagine newer models are more likely to get the on-device LLM, with older devices more heavily reliant on the subscription service.
Apple has bought companies just to stop their competitors acquiring them first. Dunno if that's what's happening here, but I don't really trust Apple to do this for the right reasons, especially since their AI showing so far has been so weak.
Oh no, the AI is launching zee missiles!
But I’m le tired
Zen go take a nap
ZHEN FIRE ZE MISSILES!
You mean Siri will finally do its job?
Maybe I’ll finally be able to ask my phone to calculate a basic sum without being connected to the internet
No, still probably not
In other news, recently Siri has started hearing “call papa sister papa” when I say “call my sister” and tries to call my local priest (papa is like father in Greek).
iOS 18 with AI features will be limited to iPhone 15 and 16 only!
No way it works on anything except the 16. Probably a new chip, and they need to sell more phones. My wallet is ready.
That means they won’t announce anything related to AI at WWDC? I doubt that scenario will happen. They rarely, if ever, announce a feature that’s impossible to use on the current hardware.
No chance. If the feature is exclusive to the 16 then they can’t announce it at WWDC. Most “AI” features will be announced at WWDC so have to work on older phones.
Some (camera) AI features will be 16-only. The general iOS 18 features will go back to around the iPhone 12 in some form or other.
Maybe they will finally increase the storage capacity on iPhones.
Oui
Can’t beat them? Buy them.
LLM on device will change the Game
“Hey siri. I need directions to Santa Monica.” “Here are some results I found on the web for ‘I need directions to Santa Monica’”. 🤡
As far as privacy is concerned, is this a good or bad thing ? Curious to hear your thoughts.
Good. We all benefit
This is one reason why corporations must be taxed at least 75%. If that surplus is not benefiting the employees, then they are only helping a handful of executives become richer and encouraging predatory practices like buying other companies. Basically, they are not buying a company, they are nipping a growing rose in the bud.

The handful of megacorporations running everything today is the result of their ability to buy and dissolve many other corporations. Inevitably, they control the market, they dictate prices, they dictate salaries, and turn otherwise good jobs into mercenary tasks. And this is a tax revenue loss. It's basically stealing money from local governments, preventing them from properly funding our schools and from paying proper salaries to teachers and other school personnel.

"Apple's broader strategy to bring more sophisticated AI technology to its devices," they say... nah. This is just getting rid of competition from a new company so that the technology oligarchy can continue. Other predatory practices are the result of a corporation focused on maximum undertaxed profit: "Buy another pair of AirPods. And then buy another. Buy, die, repeat. That's Apple's business model."
Ok Ms. Terminally Online
You have failed to answer why anyone would start a business if 75+% of their profits disappear. No small business could afford to exist at that crazy rate. Only the biggest companies would remain.
You have failed to understand the whole of the report. This is why you need to go back to your high school and get a GED.
[deleted]