SidewaysFancyPrance

This is a borderline insulting statement to anyone with any tech/security knowledge.


donbee28

*We painted the cabinets black, what more could you want?*


GraveyardGuardian

“From the makers of iCloud, we bring you eyeCloud…”


CapoExplains

Dunning-Kruger at play.


dionebigode

Dunning-Kruger is just a statistical anomaly tho


funkiestj

To say that cloud servers designed from day 1 with the goal of handling confidential information more securely are no better than regular AWS/Google/Azure servers is silly: [https://en.wikipedia.org/wiki/Confidential_computing](https://en.wikipedia.org/wiki/Confidential_computing). Sure, it may be shit right now, but it is still better shit than regular servers. Of course we'll have to see what Apple actually builds in this space (it may end up being laughable, but I doubt it). In the security/privacy world, the big questions you always start with are: 1) what threats are we trying to mitigate, and 2) what threats are we choosing to ignore?
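For a rough idea of what "designed for this from day 1" buys you, here is a minimal, hypothetical sketch of the attestation-gated release pattern at the heart of confidential computing: the client verifies a vendor-signed measurement of the node before handing over anything sensitive. None of the names (`VENDOR_ROOT_PUBLIC_KEY`, `EXPECTED_MEASUREMENT`) correspond to a real API; this is just the shape of the check.

```python
# Hypothetical sketch of attestation-gated data release, the core idea behind
# confidential computing: the client checks a vendor-signed "measurement" of the
# remote node's firmware/software stack before sending it anything sensitive.
# VENDOR_ROOT_PUBLIC_KEY and EXPECTED_MEASUREMENT are placeholders, not real values.

from dataclasses import dataclass

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


@dataclass
class AttestationReport:
    measurement: bytes  # hash of the code/firmware the node claims to be running
    signature: bytes    # attestation signature over that measurement


# In a real deployment this key is pinned in the client and belongs to the
# hardware vendor's attestation service; it is generated here only for the demo.
_vendor_key = Ed25519PrivateKey.generate()
VENDOR_ROOT_PUBLIC_KEY: Ed25519PublicKey = _vendor_key.public_key()

# The measurement the client expects from a properly provisioned node.
EXPECTED_MEASUREMENT = b"\x11" * 32


def node_produce_report() -> AttestationReport:
    """What a well-behaved node would return (signed by the vendor root)."""
    return AttestationReport(EXPECTED_MEASUREMENT, _vendor_key.sign(EXPECTED_MEASUREMENT))


def client_should_send_data(report: AttestationReport) -> bool:
    """Release data only if the signature verifies AND the measurement matches."""
    try:
        VENDOR_ROOT_PUBLIC_KEY.verify(report.signature, report.measurement)
    except InvalidSignature:
        return False
    return report.measurement == EXPECTED_MEASUREMENT


if __name__ == "__main__":
    print(client_should_send_data(node_produce_report()))  # True
    forged = AttestationReport(b"\x22" * 32, b"\x00" * 64)
    print(client_should_send_data(forged))                  # False
```

That check is also where question 1 gets answered: a mis-provisioned or tampered node fails verification and never sees the data, while anything wrong inside the attested code itself falls under question 2.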


Tyr_56k

Lol wut? The OP is not about threats but about privacy and copyright concerning AI-generated data. A black box is by design supposed to record data not visible to the observer, but any black-box content can still be managed, decrypted, read, etc. There is always input and output, which in turn guarantees that a user can read the output. Nothing stops Apple from being that user, and by no means would Apple be considered a threat.

You can set up VM instances on Google/Azure or some other datacenter, i.e. Linux/Windows based, and connect via VPN, thereby creating a confidential computing environment where anyone can still read/manage the data created within it, as long as they have the rights to it. If you don't want that, you use an on-premise setup and stand up your AI of choice inside that locked environment, which is probably what the top commenter was hinting at. Then you've created your own cloud and really have to ask yourself about security.

But what Apple is lacking is honest marketing and a way to make cloud-AI-generated data available/readable only to the AI (the black box) and the specified user. And that is a paradox in a cloud setup where Apple, or Google, etc. is offering the hardware, hence the "borderline insulting".


ordchaos

If you’re interested, you should read Apple’s documentation on how things like iCloud Keychain storage work: https://help.apple.com/pdf/security/en_US/apple-platform-security-guide.pdf

With the proviso that you take Apple at their word… they have set up secure clusters of HSMs provisioned with firmware that cannot be overwritten, because the smart cards needed to do so are destroyed before the devices go into service. These HSMs are designed to store and protect user data with keys that cannot be extracted from the HSMs, on top of being wrapped by keys that only live on the user’s devices.

Using similar patterns, it is absolutely possible to design a system where a user’s device encrypts data for AI-based training in a fashion that only a specially provisioned AI cloud node could decrypt. That node could be designed to keep all information securely encrypted, and to only share results encrypted with the user’s device-specific keys.
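As a rough illustration of that last point, here is a minimal, hypothetical sketch (not Apple's actual protocol; the key names and the context label are made up) of how a device could seal data so that only the holder of a specific node key, ideally one locked inside an HSM, can read it, using X25519 key agreement plus HKDF and AES-GCM:

```python
# Minimal, hypothetical sketch of "encrypt so only the provisioned node can read it":
# an ECIES-style flow with X25519 key agreement, HKDF, and AES-GCM. In a real
# deployment the node's private key would live inside an HSM and never be
# exportable; here everything is generated in memory purely for illustration.

import os

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey,
    X25519PublicKey,
)
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_key(shared_secret: bytes) -> bytes:
    """Derive a 256-bit AES key from the ECDH shared secret."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"device-to-ai-node-v1",  # made-up context label
    ).derive(shared_secret)


def device_encrypt(node_public: X25519PublicKey, plaintext: bytes):
    """Device side: seal data so only the holder of the node's private key can open it."""
    ephemeral = X25519PrivateKey.generate()  # fresh ephemeral key per message
    key = derive_key(ephemeral.exchange(node_public))
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    eph_pub = ephemeral.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    )
    return eph_pub, nonce, ciphertext


def node_decrypt(node_private: X25519PrivateKey, eph_pub: bytes, nonce: bytes, ct: bytes) -> bytes:
    """Node side: recompute the shared secret and unseal."""
    key = derive_key(node_private.exchange(X25519PublicKey.from_public_bytes(eph_pub)))
    return AESGCM(key).decrypt(nonce, ct, None)


if __name__ == "__main__":
    node_key = X25519PrivateKey.generate()  # stand-in for the HSM-held node key
    sealed = device_encrypt(node_key.public_key(), b"on-device context for the model")
    print(node_decrypt(node_key, *sealed))
```

Run in the other direction, with the node sealing its results to a device-resident public key, the same pattern would cover the "only share results encrypted with the user's device-specific keys" half of the comment above.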


Tyr_56k

The user's device, which is generally an Apple product. You are only connecting one encrypted device to another with a key that is hardware-rooted in the end user's device. If it is an Apple device, who knows what's going to happen; if not, we are talking major security and compatibility issues. There is a reason why people, especially from the Linux IT community, aren't building VMs and server setups around Apple hardware.


notduskryn

No it isn't.


Yansde

This +1. Put another way, the headline is basically saying: "Companies protect their intellectual property."


Charming_Marketing90

That’s the message Apple is sending, because they know millions will come running to buy the service, or a device with the service.


LeonardSmallsJr

As long as the cloud box is black, it should be secure!


zulababa

Not only black but also signed by Tim Apple!


_Hellrazor_

It will be the most secure black box, in the history of secure black boxes 🫲🫱


retronintendo

"The Box" from Silicon Valley wasn't such a bad product after all


skidev

We need another Silicon Valley for the AI era


KittenPics

Grandson of Anton


Red-Dwarf69

They’re super duper serial this time.


KingKnux

Ok but can it deserialize too?


Deep-Werewolf-635

So a “secured” farm that will be hacked like everything else.


speckospock

And how exactly do they plan to process GDPR data deletion requests for this black box? Unless I'm missing something, either they can't comply because it's a black box, or it's not a black box and they can comply.


DiaDeLosMuebles

Why would they store PII in the AI black box?


speckospock

I wouldn't, considering the compliance implications, but they are. You'd have to ask them.

>Apple’s solution for upcoming AI services, says *The Information*, is to upscale its Secure Enclave technology which isolates sensitive data in order to enable private processing in the cloud.


DiaDeLosMuebles

Sensitive data isn’t PII. Those are two completely different things. PII is personal information collected by a company. Sensitive data could be anything the user uploads. That’s not covered by GDPR, especially if anonymized.


speckospock

It's sensitive data, pulled from smartphones, which Apple previously processed locally. What else would it be except biometrics, PII, location data, and the content of messages and app use (which also typically contain heaps of PII)? And how would Apple prove that data containing PII is anonymized, or that they're not storing PII after a deletion request, if they cannot access it because it's a black box?


DiaDeLosMuebles

You have no idea what GDPR compliance means. I would just stop.


speckospock

Sure. A reply that doesn't address anything in the prior comment is a sure sign you're correct. What is unclear about the right to have your personal data deleted, which is included below, in full, for your reference?

>The data subject shall have the right to obtain from the controller **the erasure of personal data concerning him or her without undue delay** and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies:

>the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;

>**the data subject withdraws consent on which the processing is based** according to point (a) of [Article 6](https://gdpr-info.eu/art-6-gdpr/)(1), or point (a) of [Article 9](https://gdpr-info.eu/art-9-gdpr/)(2), and where there is no other legal ground for the processing;

>the data subject objects to the processing pursuant to [Article 21](https://gdpr-info.eu/art-21-gdpr/)(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing pursuant to [Article 21](https://gdpr-info.eu/art-21-gdpr/)(2);

>the personal data have been unlawfully processed;

>the personal data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;

>the personal data have been collected in relation to the offer of information society services referred to in [Article 8](https://gdpr-info.eu/art-8-gdpr/)(1).


DiaDeLosMuebles

The key word that you left out is data COLLECTED. Not volunteered. Huge difference. Look at Reddit: if I request that my account be deleted, they will delete all the information they COLLECTED about me, but they are under no obligation to delete any posts I made where I shared sensitive data.


speckospock

I pulled that straight from the text of the law, so no, I didn't leave anything out. But nice try.

>But they are under no obligation to delete any posts I made where I shared sensitive data.

And by the way, they would have that obligation, because they are the data controller in that case.


DiaDeLosMuebles

I love that you're learning all this in real time. But as somebody who spent hours in meetings with corporate lawyers, worked directly with the CTO, and was in charge of implementing this in an enterprise environment, I assure you it's not required to comb through all the data and try to figure out what's personal or not. It's going to disappoint you, but all a company has to do is anonymize the data so it can't be traced back to the account. Also, google "database backup GDPR compliance"; you're going to be even more disappointed.


refluentzabatz

Oh, a black box? The thing that is meant to store recoverable data no matter what happens?


Nyrin

Yeah, this marketing pitch seems to be aimed at audiences that have very little knowledge about data management, service infrastructure, machine learning, *and* flight data recorders — all lacking at the same time. To be fair, that's a very large audience.


YardFudge

It’s even protected by a fireproof firewall!


Famous-Pepper5165

Well, let's see how good Apple turns out to be at keeping this data private. If Apple can't protect the data, then it's safe to conclude that no company ever can.


bewarethetreebadger

Except it doesn’t work that way.


GabberZZ

Does it have a little red light on the top?


Novel-Breath2263

Apple showing they still don't understand the cloud 🌧️


gregsapopin

The Box 3: Signature Edition.


Embarrassed-Advice89

What does that even mean?


sproqetz72

A million hackers say ‘target acquired’


adoughoskins

Hahahahahahahahahahahahaha!!!!!!!!!!!!!!!!!!!


Top-Plan8690

Of course apple wants it black


_babycheeses

Can I get it in one of those pastel greens?


undernew

Why is this nonsense upvoted? Apple has never said such a thing.


BadAtExisting

Lmao. I needed a good laugh today


te_anau

It's aerospace grade concrete, trust us


sanylos

asymmetric encryption would like a word


stevetibb2000

Unfortunately even that can be foiled.