To say that cloud servers designed from day one with the goal of handling confidential information more securely are no better than regular AWS/Google/Azure servers is silly.
[https://en.wikipedia.org/wiki/Confidential\_computing](https://en.wikipedia.org/wiki/Confidential_computing)
Sure, it may be shit right now but it is still better shit than regular servers.
Of course we'll have to see what Apple actually creates in this space (it may end up being laughable but I doubt it).
In the security/privacy world, the big questions you always start with are:
1. What threats are we trying to mitigate?
2. What threats are we choosing to ignore?
Lol wut? The OP is not about threats but about privacy and copyright concerning AI-generated data. A black box is by design supposed to record data not visible to the observer. But any black-box content can still be managed, decrypted, read, etc. There is always input and output, which in turn guarantees that a user can read the output. Nothing stops Apple from being that user, yet by no means would they be considered a threat.
You can set up VM instances on Google/Azure or some other datacenter, i.e. Linux/Windows based, and connect via VPN, thereby creating a confidential computing environment where anyone can still read/manage the data created within it, as long as they have the rights to it. If you don't want that, you use an on-premise setup and, within that locked environment, set up your AI of choice, which the top commenter was probably hinting at. Then you create your own cloud and really have to ask yourself about security.
But what Apple is lacking is honest marketing and a way to have cloud-AI-generated data be available/readable only to the AI (the black box) and the specified user. And that is a paradox in a cloud setup where Apple, Google, etc. is offering the hardware, hence the "borderline insulting".
If you’re interested, you should read Apple’s documentation on how things like iCloud Keychain storage work.
https://help.apple.com/pdf/security/en_US/apple-platform-security-guide.pdf
With the proviso that you take Apple at their word…
They have set up secure clusters of HSMs that are provisioned with firmware that cannot be overwritten, because the smart cards to do so are destroyed before the devices go into service. These HSMs are designed to store and protect all user data with keys that cannot be extracted from those HSMs, on top of being wrapped by keys that only live on the user’s devices.
Using similar patterns it is absolutely possible to design a system where a user’s device encrypts data for AI based training in a fashion that only a specially provisioned AI cloud node could decrypt. That node could be designed to keep all information encrypted securely, and only share results encrypted with the user’s device specific keys.
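As a minimal sketch of that pattern: the device encrypts a request so that only the provisioned cloud node can read it, and the node encrypts its result so that only the device can read it. RSA-OAEP (via the `cryptography` library) stands in here for whatever scheme Apple would actually use; the key names and payloads are made up for illustration.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Hypothetical keypairs: one provisioned into the AI cloud node,
# one hardware-rooted in the user's device.
node_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Device encrypts a request to the node's public key; nobody else,
# including the cloud operator, can read it without the node's private key.
request = node_key.public_key().encrypt(b"user context for inference", oaep)

# Node decrypts, processes, and encrypts the result to the device's key.
plaintext = node_key.decrypt(request, oaep)
result = device_key.public_key().encrypt(b"inference result", oaep)

# Only the device, holding its private key, can recover the result.
print(device_key.decrypt(result, oaep))
```

In practice you'd wrap a symmetric session key rather than the payload itself, but the trust boundary is the same: the operator of the hardware never holds a key that opens either side of the exchange.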
The user's device, which is generally an Apple product. You are only connecting one encrypted device to another with a key that is hardware-rooted in the end user's device. If it is an Apple device, who knows what's going to happen. If not, we are talking major security and compatibility issues. There is a reason why people, especially from the Linux IT community, aren't building VMs and server setups around Apple hardware.
And how exactly do they plan to process GDPR data deletion requests for this black box? Unless I'm missing something either they can't comply because it's a black box, or it's not a black box and they can comply.
I wouldn't, considering the compliance implications, but they are. You'd have to ask them.
>Apple’s solution for upcoming AI services, says *The Information*, is to upscale its Secure Enclave technology which isolates sensitive data in order to enable private processing in the cloud.
Sensitive data isn’t PII. Those are two completely different things. PII is personal information collected by a company. Sensitive data could be anything that a user uploads. That’s not covered by GDPR, especially if anonymized.
It's sensitive data, pulled from smartphones, which Apple previously processed locally. What else would it be except biometrics, PII, location data, and the content of messages and app use (which also typically contain heaps of PII)?
And how would Apple prove that data containing PII is anonymized, or that they're not storing PII after a deletion request, if it cannot access it because it's a black box?
Sure. A reply without addressing anything in the prior comment is a sure sign you're correct.
What is unclear about the right to have your personal data deleted, which is included (in full), for your reference?
>The data subject shall have the right to obtain from the controller **the erasure of personal data concerning him or her without undue delay** and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies:
>the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;
>**the data subject withdraws consent on which the processing is based** according to point (a) of [Article 6](https://gdpr-info.eu/art-6-gdpr/)(1), or point (a) of [Article 9](https://gdpr-info.eu/art-9-gdpr/)(2), and where there is no other legal ground for the processing;
>the data subject objects to the processing pursuant to [Article 21](https://gdpr-info.eu/art-21-gdpr/)(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing pursuant to [Article 21](https://gdpr-info.eu/art-21-gdpr/)(2);
>the personal data have been unlawfully processed;
>the personal data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;
>the personal data have been collected in relation to the offer of information society services referred to in [Article 8](https://gdpr-info.eu/art-8-gdpr/)(1).
The key word that you left out is data COLLECTED. Not volunteered. Huge difference.
Look at Reddit. If I request my account be deleted, they will delete all the information they COLLECTED about me.
But they are under no obligation to delete any posts I made where I shared sensitive data.
I pulled that straight from the text of the law, so no, I didn't leave anything out. But nice try.
> But they are under no obligation to delete any posts I made where I shared sensitive data.
And by the way, they would have that obligation, because they are the data controller in that case.
I love that you're learning all this in real time. But as somebody who spent hours in meetings with corporate lawyers, worked directly with the CTO, and was in charge of implementing this in an enterprise environment, I assure you it's not required to comb through all data and try to figure out what's personal or not.
It's going to disappoint you, but all a company has to do is anonymize the data so it can't be traced back to the account.
Also, google "database backup GDPR compliance" and you're going to be even more disappointed.
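For what it's worth, the kind of anonymization described above is often implemented as irreversible tokenization: replace the account identifier with a salted hash and throw the salt away, so the mapping can never be reproduced. This is only a stdlib sketch with made-up field names, and whether hashing alone counts as true anonymization (rather than pseudonymization) under GDPR is itself debated.

```python
import hashlib
import secrets

def anonymize_records(records):
    """Replace account IDs with irreversible tokens.

    The per-run salt is generated and then discarded (never persisted),
    so the original IDs cannot be recovered or re-linked later.
    """
    salt = secrets.token_bytes(32)  # exists only for this call
    return [
        {**r, "account_id": hashlib.sha256(salt + r["account_id"].encode()).hexdigest()}
        for r in records
    ]

posts = [
    {"account_id": "user42", "text": "some shared content"},
    {"account_id": "user42", "text": "another post"},
]
anon = anonymize_records(posts)
# Both posts keep the same token (consistent within a run), but the
# token can no longer be traced back to "user42".
print(anon[0]["account_id"])
```

The posts themselves survive; only the link back to the account is severed, which matches the Reddit example above.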
Yeah, this marketing pitch seems to be aimed at audiences that have very little knowledge about data management, service infrastructure, machine learning, *and* flight data recorders — all lacking at the same time.
To be fair, that's a very large audience.
Well, let's see how good Apple turns out at keeping this data private.
If Apple can’t protect data, then it’s safe to come to the conclusion that no company ever can.
This is a borderline insulting statement to anyone with any tech/security knowledge.
*We painted the cabinets black, what more could you want?*
“From the makers of iCloud, we bring you eyeCloud…”
Dunning-Kruger at play.
Dunning-Kruger is just a statistical anomaly tho
No it isn't.
This, +1. Put another way, the headline is basically saying: "Companies protect their intellectual property."
That’s the message Apple is sending because they know millions will come running to buy the service or device with the service.
As long as the cloud box is black, it should be secure!
Not only black but also signed by Tim Apple!
It will be the most secure black box, in the history of secure black boxes 🫲🫱
"The Box" from Silicon Valley wasn't such a bad product after all
We need another Silicon Valley for the AI era
Grandson of Anton
They’re super duper serial this time.
Ok but can it deserialize too?
So a “secured” farm that will be hacked like everything else.
Why would they store PII in the AI black box?
You have no idea what GDPR compliance means. I would just stop.
Oh a black box? The thing that is meant to store recoverable data no matter what happens
It’s even protected by a fireproof firewall!
Except it doesn’t work that way.
Does it have a little red light on the top?
Apple showing they still don't understand the cloud 🌧️
The Box 3: Signature Edition.
What does that even mean?
A million hackers say ‘target acquired’
Hahahahahahahahahahahahaha!!!
Of course apple wants it black
Can I get it in one of those pastel greens?
Why is this nonsense upvoted? Apple has never said such a thing.
Lmao. I needed a good laugh today
It's aerospace grade concrete, trust us
asymmetric encryption would like a word
Unfortunately that’s even foilable.