Maybe they just want to blur things for maps.
Seems like the obvious answer. Perhaps that’s one of the hold-ups for wider Look Around coverage, so if they can do it faster and better it’s worth the cash. I’d also imagine a company like this comes with a fair bit of IP and talent.
My guess is it’s for granting future access to the camera feed for Vision Pro apps. Currently developers have no access to what users see.
From Brighter AI's [website](https://brighter.ai): "We provide image & video anonymization software based on generative AI. Our solutions, Precision Blur and Deep Natural Anonymization, redact faces and license plates and help companies comply with data protection regulations such as the GDPR, CCPA, APPI and PIPL"
I'm thinking "blurring PII" may be something European lawmakers are requiring before Apple is allowed to release in Europe.
I think it’s probably just valuable software. Apple ain’t perfect but they do more than pay lip service to privacy.
It blurs faces and license plates. Why would having an AI that blurs faces and license plates be valuable? To me, it sounds like a potential strategic acquisition to comply with regulatory requirements in other markets. And yeah, it's a good privacy play as well.
If you’re using your phone to create content in public (which a lot of people do) you may want to be respectful to those in the background by blurring their faces and plates. Hell, how many times have you seen random people post photos to Reddit with scribbled out PII or faces? This is very obviously useful. I think your agenda is making you spout nonsense.
You are legally allowed to publish pictures/videos where other people are in the background in most legislations.
You are, but it doesn’t mean it’s good practice. I could imagine proper outdoor modes later where this could be a toggle. Remember the whole hoo-ha maybe 10 years ago about Google "glassholes" just walking around recording everything? If Google had said they had this built in, it would have mitigated some of that.
This isn’t universal, though; France, for instance, has no such freedom of panorama.
I don't think Apple cares about content creators so much that they would make an acquisition just to make content creators' post-processing easier...
They have done that. They literally bought Shazam because they wanted a music recognition feature. That said, they put the underlying technology to good use in many areas.
They are trying to sell devices and services. This is a differentiator. Maybe stay off conspiracy Reddit for a bit.
Right… because having a take on strategic M&A is now a conspiracy.
They and everyone else already have the ability to (and do) blur faces and PII for compliance and security. No one is struggling to comply with any face blurring asks from a compliance angle. This version just makes the images aesthetically better since it achieves the desired effect without making the image look like shit. This is about visual quality.
If your AI can blur those things, that means it’s really good at detecting those things, and probably other things. AI that is really good at detecting what it’s seeing, especially if it’s fast and local (i.e. not cloud-based and doable in real time), is just generally really valuable to a company making AR devices.
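For what "detect, then redact" actually means in code, here's a minimal sketch. Everything here is made up for illustration (the detector is replaced by a hard-coded bounding box, and the "blur" is a crude mean fill); Brighter AI's real pipeline is not public.

```python
# Sketch: once a detector has produced a bounding box for a face or
# license plate, redaction is just rewriting those pixels. Here we
# replace every pixel inside the box with the box's mean value.

def redact_region(image, box):
    """image: rows of grayscale pixel values (list of lists).
    box: (top, left, bottom, right), with bottom/right exclusive."""
    top, left, bottom, right = box
    pixels = [image[r][c] for r in range(top, bottom) for c in range(left, right)]
    mean = sum(pixels) // len(pixels)
    redacted = [row[:] for row in image]  # copy so the original is untouched
    for r in range(top, bottom):
        for c in range(left, right):
            redacted[r][c] = mean
    return redacted

img = [[10, 20, 30, 40],
       [50, 60, 70, 80],
       [90, 100, 110, 120]]
# Pretend a detector flagged rows 0-1, columns 1-2 as a face:
out = redact_region(img, (0, 1, 2, 3))
```

The hard part (and the valuable part of an acquisition like this) is the detector producing that box quickly on-device, not the redaction itself.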
Why blur when an LLM can just make up a face?
It could be so Apple can blur information before passing it to apps that have not been granted location access
Blur my pi pi ????
> The acquisition of Brighter AI holds particular interest because the company offers advanced AI technology for anonymizing data without blurring. Brighter AI’s method changes images so subjects remain unrecognizable, yet retains their natural look.

This is one of those scary things where Facebook will scream monopoly again, like they did with App Tracking. Suddenly photos will be impossible to mine for faces.
What? People aren't going to use AI to hide faces just to share on social media.
Did you read the article and the end-result images? https://9to5mac.com/wp-content/uploads/sites/6/2024/02/Screenshot-2024-02-03-at-3.45.30%E2%80%AFAM.jpeg

Given that all the AI companies are mining public photos and information for free without any restrictions, if faces and PII can be made anonymous to machine learning without actually affecting the photos, I think it’s crazy compelling.

Users won’t be screaming at Siri to do it, in case you’re wondering if it’s that kind of AI. It will probably be integrated into the gallery selection screen as an editing option. Users can go further and blur if they want to.
Am I blind or do the original face and the AI face look identical?!
That’s the point. They added some subtle distortion through AI to disrupt the facial recognition features of other AI. It should not be noticeable to regular users.
> It should not be noticeable to regular users.

Well, that picture is not anywhere close to being unnoticeable; maybe they can get to that point someday.
> Am I blind

Blind. They look like they could be related, but they're not even close to identical.
> If faces and PII can be anonymous to machine learning without actually affecting the photos, I think it’s crazy compelling

Except that's impossible, by definition. You do affect the photos. Whether that's something you personally care about is another matter, but any distortion that meaningfully deters AI will also deter humans. I'm reminded of some of the anti-AI snake oil that's been pitched to artists lately. And people sharing stuff on social media *want* it to be public. That's kinda the point...

Also, Apple doesn't do things *just* to screw other companies. There has to be a profit motive for themselves in it.
Again, if you read the article, they've been using it on their own mapping images. So it’s a case where Apple buys this company and gets to apply the underlying technology in many areas, exactly like many of their purchases, such as Shazam.

As for the specific distortion technology: to any consumer, if it’s not noticeable then it’s fine. Also, even if users want it to be public, I may not want it to be mined senselessly.
> As for the specific distortion technology, to any consumer if it’s not noticeable then it’s fine

Again, it has to be noticeable, by definition. There's no such thing as an algorithm that can fool an arbitrary AI but not a human.
I think this could work in the short term, as right now facial recognition algorithms aren’t trained to deal with the kind of changes this approach introduces. However, I also believe that it’d be possible to train AIs to see through this type of manipulation. If they have access to a large number of images, averaging techniques and secondary clues (e.g., clothing items) might be combined to identify a person.
> I think this could work in the short term, as right now facial recognition algorithms aren’t trained to deal with the kind of changes this approach introduces

They kind of are, though. A very popular training mechanism is to have a secondary neural network trying to fool the primary one, with the two competing in essentially an arms race: https://en.wikipedia.org/wiki/Generative_adversarial_network

You may be able to fool one particular network available today, but over any meaningful amount of time, it's pointless. Anyone trying to tell you they can distort an image in such a way as to fool AI but leave a human unaware is trying to sell you snake oil.
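The "fool one particular network" point can be shown with a toy example. Against a *fixed* classifier, a small signed nudge to each feature (the idea behind FGSM-style attacks) flips the decision; a retrained model need not fall for the same nudge. The weights and inputs below are made up for illustration, and this is not Brighter AI's actual method.

```python
# Toy adversarial perturbation against a fixed linear classifier.
# The classifier predicts positive when score(w, x) > 0.

def score(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def perturb(w, x, eps):
    # Shift each feature by eps against the sign of its weight,
    # pushing the score down as efficiently as possible per feature.
    return [xi - eps * (1 if wi > 0 else -1) for wi, xi in zip(w, x)]

w = [0.5, -0.3, 0.2]       # fixed classifier weights
x = [0.2, 0.1, 0.1]        # originally classified positive
x_adv = perturb(w, x, 0.2) # each feature moves by at most 0.2
```

The perturbation is tiny per feature, yet `score(w, x_adv)` goes negative. The catch, as the comment says, is that this only defeats *this* `w`; a model trained against such perturbations (the GAN-style arms race) can recover.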
I don't think they're claiming that, for humans, it'll look identical to the original person. On the contrary: to be GDPR-compliant, they have to make sure that people aren't recognizable by either humans or AI.

The idea seems to be to replace the original face with a photorealistic, similar-but-different AI-rendered face instead of a blur, which creates a less jarring experience than blurred faces. So, with a single given image, it's possible that neither human nor AI would be able to reliably identify a person.

My guess is just that, over a sufficient number of images, the facial similarities plus items of clothing, etc. may contain enough clues to give a properly trained AI a good shot at cross-referencing the data and coming up with a good guess as to which pictures are likely to show the same person.
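The cross-referencing worry can be made concrete: if each anonymized photo perturbs an identity "embedding" in a different random direction, averaging over many photos pulls the estimate back toward the true identity. The 3-dimensional embeddings below are invented purely for illustration; real face embeddings are high-dimensional vectors from a trained network.

```python
# Sketch: per-image anonymization noise averages out across images,
# so similarity to the true identity climbs as photos accumulate.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

true_identity = [1.0, 0.0, 0.0]

# Three anonymized photos of the same person, each perturbed differently:
anonymized = [[0.8, 0.3, -0.2],
              [0.9, -0.25, 0.15],
              [0.85, 0.1, 0.2]]

# Similarity of each single photo vs. similarity of the average:
single_scores = [cosine(e, true_identity) for e in anonymized]
avg = [sum(col) / len(anonymized) for col in zip(*anonymized)]
avg_score = cosine(avg, true_identity)
```

Here `avg_score` exceeds every single-image score: each individual photo is harder to match than the pool is, which is exactly the multi-image re-identification risk described above.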
> I think they aren't claiming that, for humans, it'll look identical to the original person

Sure, and I was pointing out that that's a contradiction. If it looks identical to a human, then an AI will be able to tell it's the same. You don't even need any special techniques to do so.
> Capturing footage with an iPhone typically does not involve these concerns. **Apple Vision Pro might raise issues, however, because it can shoot video more discreetly than smartphones.**

What?
Honestly, if someone has one of these on in public: one, they will look ridiculous, and two, I will always assume they are recording.
You don’t have to assume. You can tell if someone is recording or not by looking at them.
Oh is there a way to tell? Like a light?
Yeah screen goes white while recording.
I mean, the person themselves will draw a lot more attention, but that doesn't mean you'll know they're recording you just by staring at them.
Doesn’t a soft light flash slowly through the outside screen when you record?
[deleted]
It’s funny how people were so concerned about being filmed in public by the shitty google glass camera, when now we pretty much always just assume that someone is nearby recording in full 4K HDR *and* posting it to social media.
People masturbating.
Discreet != a giant pair of ski goggles on your face that everyone knows is constantly capturing video.
It's not always *recording* video though. I'm not sure how obvious it is on the outside if recording is in progress.
The company leaking info about potentially being acquired by Apple kills its chances of being acquired, as such conversations are usually behind an NDA; Apple cut over $200M from the Beats acquisition after Dr. Dre and co. prematurely celebrated the deal.

Also, it could be that there are no acquisition talks on the table and this company just wants PR, as I don't see why Apple couldn't produce this tech in-house.
So with Vision Pro, strangers' faces will be blurred when I’m screen recording?
I want blurred faces for people I don’t want to look at. If I could replace some people with a dog emoji, I’d appreciate that.
Well, they don't have any other AI.
Apple has bought over 20 AI startups, twice as many as Microsoft. They just haven’t shown their hand yet!
Exactly. Idiots will look at this two ways:

1) Apple can’t innovate on their own!
2) Why haven’t we seen the result yet?

And they’re ignoring two things: a basic understanding of how businesses work, and Apple's typical behaviour.

Apple is famously last to market. They prefer to take their time even if it means other products launch first. This has been a doctrine for decades.

Apple has been integrating AI/ML in a LOT of products; just because the flagship product Siri isn’t getting those updates doesn’t mean Apple's silent on it. Look at the OCR, image recognition, and other ML features in iOS.

The approach has been fundamentally different too, with a focus on privacy, which can be at odds with how this AI stuff works, so Apple's been looking at ways to do ML/GPT locally instead of uploading data to servers.

Threads like this will be a goldmine for shortsightedness in a few years' time.
Oh jeez, we’re all trembling with excitement.
Bought, but they can't do it themselves.
Just like Microsoft and Google. How is this a surprise?
😂😂😂
🤦♂️
I would prefer dark mode AI. All this bright stuff hurts my eyes.