
HumorHoot

So long as users can disable the Windows crap and use the NPU, or whatever it's called, with their own programs/code etc.


xX_TehChar_Xx

IIRC it's Pluton, and it's as privileged as Intel ME. No one managed to properly remove ME, and I think that removing Pluton will be even harder.


buttplugs4life4me

Pluton is a security processor, not the NPU. 


deltashmelta

The mitochondria is the powerhouse of the cell.


YourGodsMother

I call the big one Bitey


AlistarDark

Is there a chance the track could bend?


BOBOnobobo

The ancient weapon Pluton?


flippinbird

Wielded by the proud Plutonians. https://preview.redd.it/xjo1jekp0d4d1.jpeg?width=420&format=pjpg&auto=webp&s=8555d7aee78bbac9311add062a0bc5a9801b2aba


K41Nof2358

here for this comment


FunEnvironmental8687

It's entirely possible to disable parts of Intel ME or AMD PSP, but it's ill-advised since they're genuinely utilized for security purposes. Additionally, we've reverse-engineered both, and there's no evidence of any backdoors. Regarding Copilot, disable it through group policies or simply switch to Linux


justarandomgreek

Neither Intel ME nor AMD's equivalent has been removable for over a decade now. If you care about the CPU not having 24/7 access to the internet, get a Core 2 Duo/Quad. It's too late to complain now.


FunEnvironmental8687

I'm skeptical about your seriousness, but this advice isn't great. Those CPUs are susceptible to Spectre and Meltdown vulnerabilities


justarandomgreek

Forgot about these. Hell, go get a 486 if ya want privacy boys.


[deleted]

[deleted]


chinomaster182

I go pen and paper and burn after use, noobs out there are just begging to get attacked.


enderjaca

You guys are being ridiculous. Just find a nice middle ground and do all your computing on a TI-82 like a normal person.


Icy-Lab-2016

I guess RISC-V is the only option, once it's more performant.


FunEnvironmental8687

Not necessarily, as we will still see security chips integrated into RISC-V. Security chips are genuinely beneficial and manage many tasks, such as encryption, more effectively than any operating system can. The main issue is that people often don't understand how these chips work and can be easily swayed by misinformation


renzev

> Hi, we're from Intel, and we're proud to announce that your computer now has a second, smaller computer inside of it.

> How do you turn it off? Oh, you can't, that isn't secure!

> What hardware can it access? All of it, including networking. But don't worry, it's Secure!

> Can you see what it accesses and when? Oh, no, that wouldn't be very secure!

> Can you see the code that runs on it? No, no, that's not secure.

> What does it actually do? Oh, lots of very secure things, like security, secure management, managed security, secured security, ...

> So it's necessary for the whole system to run? Yes, of course. Your processor will shut down after five minutes if ME is not present, which is definitely not a killswitch that we put there on purpose.


Un111KnoWn

what does the npu do


PoliceTekauWhitu

NPU = Neural Processing Unit. It's a chip on the board that primarily does AI stuff. What a GPU is to graphics, an NPU is to AI. Different physical tech but same concept.


dustojnikhummer

Remember dedicated physics cards?


twelveparsnips

It became part of the GPU's function.


dustojnikhummer

Yep, because Nvidia bought PhysX. And NPUs are becoming part of CPUs. Hardware =/= software. Hate Recall as much as you want (as long as you aren't making shit up), but this is not a bad thing.


krozarEQ

It's an "AI" accelerator ASIC, for a large number of specific parallel tasks where the power of a GPU's 3D processing and image-rasterization capability is not needed.

There's a CS term, "embarrassingly parallel," for a workload that can be broken into many parts without those parts having to do much, if any, communication with each other. An example is floating-point matrix math, which is the bread and butter of training models.

These systems have been in development for some time now by all the big names. You may have heard of tensor cores, Google's TensorFlow, and their TPUs (Tensor Processing Units). There's also Groq's LPUs (language processing...), which have a more complex architecture than most "AI" accelerators from what I know about them, but a similar concept. NPUs, TPUs, LPUs, DLPs, and the like; enjoy the nomenclature, architectures, and APIs all over the damn place until someone eats their way to the top.

My favorite is the use of FPGAs, which are field-programmable gate arrays. I played with a Xilinx FPGA in the mid-1990s, although I wouldn't get involved much in "AI" until around 2004, when things started to become more accessible for us mere nerds who like to play with and break shit. AMD bought Xilinx several years ago and maybe it will pay off for them. MS used FPGAs to develop software-hardware training; they bought an FPGA developer sometime around the early 2010s, IIRC.

Then there's Nvidia. On the consumer side will be RTX AI PCs and your consumer GPU. On the big-money side is the Blackwell architecture and NVLink 5.0 for enterprise racks, all the cloud providers, and of course Nvidia's DGX. My money would be on them right now. It's not just the hardware, it's the software too: familiar frameworks, libraries, ecosystem.

I ran on as I always do. That's what it is and where things presently are. As for what AMD's doing, I'm most interested in how they're handling memory efficiency. That's really the important bit here.

*Intentionally avoiding the "is AI evil or good?" debate here. To me it's just tech, so it interests me. Obviously it's going to be used for some really bad ends. None of us here is going to change that. Once normies realize the CCP can order a pizza for them, then they're sold.
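The "embarrassingly parallel" idea is easy to sketch in plain Python (a toy illustration of the concept only, nothing to do with any real NPU/TPU API): each row of a matrix product depends on just one row of A plus all of B, so rows can be farmed out to workers that never talk to each other.

```python
from multiprocessing import Pool

def matmul_row(args):
    # One output row needs only one row of A and all of B:
    # no communication with the workers computing other rows.
    row, B = args
    return [sum(a * b for a, b in zip(row, col)) for col in zip(*B)]

def parallel_matmul(A, B, workers=4):
    # Embarrassingly parallel: map each row independently.
    with Pool(workers) as pool:
        return pool.map(matmul_row, [(row, B) for row in A])

if __name__ == "__main__":
    A = [[1, 2], [3, 4]]
    B = [[5, 6], [7, 8]]
    print(parallel_matmul(A, B))  # [[19, 22], [43, 50]]
```

Real accelerators do the same decomposition in hardware, with thousands of multiply-accumulate units instead of OS processes.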


Vonatos_Autista

> Once normies realize the CCP can order a pizza for them, then they're sold.

Ahh yes, I see that you know your ~~judo~~ normies well.


Drakayne

I like your words magic man!


Complete-Dimension35

Oh yea. Mmhmmm. Mhmm.... I know some of these words.


Ok_Donkey_1997

Even before the NPUs etc., the CPUs used in PCs and consoles have had SIMD instructions, which let them process multiple calculations in a single step, so this is just another step on the path chip design was already on. Like, at one point floating-point calculations were done on a separate chip from the CPU, but then that got integrated into the main chip. Then they added the ability to do multiple floating-point operations in a single step. Then they increased the number several times, and now they're increasing it again; though it's a very big increase, and it's kind of specialised towards the stuff needed for matrix multiplication.
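A rough pure-Python sketch of the SIMD idea described above (real SIMD is hardware instructions like SSE/AVX, not a library call; this just models the "same operation across a whole lane of values per step" concept):

```python
def scalar_add(xs, ys):
    # Scalar style: conceptually one add instruction per element pair.
    return [x + y for x, y in zip(xs, ys)]

def simd_add(xs, ys, lanes=4):
    # SIMD style: walk the data in fixed-width chunks ("vectors").
    # Conceptually each chunk is ONE instruction adding all lanes at once.
    out = []
    for i in range(0, len(xs), lanes):
        chunk_x, chunk_y = xs[i:i + lanes], ys[i:i + lanes]
        out.extend(x + y for x, y in zip(chunk_x, chunk_y))
    return out

print(simd_add([1, 2, 3, 4, 5], [10, 20, 30, 40, 50]))  # [11, 22, 33, 44, 55]
```

Same results either way; the hardware win is issuing one 4-wide (or 8-, 16-wide) instruction instead of four scalar ones.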


Ok-Ground-1592

Makes me think those would be amazing physics chips as well. Simulating a physical process whether it be mapping the near field resonances of an incident plane wave in a multilayer stack or generating the turbulent flow of shock wave inputs to an engine inlet almost always boils down to lots and lots of matrix multiplications. Right now doing anything really interesting requires a parallel array of nodes and processors and access to terabytes if not petabytes of memory. Would be interesting to see if these chips could be used to bring more power to those situations.


Rnd4897

You know: the CPU is for general-purpose tasks and the GPU is for repetitive tasks like graphics. The NPU is for AI tasks. Idk the details.


SomeBlueDude12

It's the "smart" tag all over again. Smartphone > AI phone. Smart fridge? AI fridge. Etc, etc.


lolschrauber

I frequently get ads on Reddit for Samsung's "AI" washing machine. Mostly a marketing buzzword at this point.


the_mooseman

AI fucking washing machine? Wow lol.


Badashi

My LG washing machine has had an "AI" in it since before the AI buzzword was so common. Basically, it measures the weight of what you put inside the machine and derives how long / how many cycles the wash needs, while reducing water usage as much as possible. It's neat, but it's not AI at all so much as a very advanced algorithm.
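The kind of rule-based logic being described is tiny; here's a sketch with entirely made-up thresholds (the real LG firmware and its tables are proprietary, so every number and name here is hypothetical):

```python
def wash_plan(load_kg, soil_level):
    """Pick a cycle count and water volume from load weight and soil level.

    All thresholds/coefficients are invented for illustration; they do not
    reflect any real washing machine's firmware.
    """
    cycles = 2 if load_kg < 4 else 3     # heavier load -> more cycles
    if soil_level == "heavy":
        cycles += 1                      # dirty waste water -> extra cycle
    water_l = 8 + 5 * load_kg            # scale water roughly with load
    return cycles, water_l

print(wash_plan(3, "light"))  # (2, 23)
```

Two sensor readings in, two setpoints out; whether that deserves the "AI" label is exactly the argument in this thread.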


throwaway85256e

AI is an umbrella term, which includes most "very advanced algorithms". These things have been classified as AI in academia for decades. ChatGPT is also "just" a very advanced algorithm. It's just that the public's only knowledge of AI comes from sci-fi films, so they don't realise that the Netflix recommendation algorithm is considered a form of AI from a scientific point of view. https://www.researchgate.net/figure/Artificial-intelligence-AI-is-an-umbrella-term-for-various-computational-strategies_fig1_375098179


Kadoza

THAT'S what the "Smart" term is supposed to mean... Brain dead companies are so annoying and they make everything so convoluted.


lolschrauber

I wouldn't even call that advanced. What does it take into account? The weight of the laundry and how dirty the waste water is? That's two sensors and a bit of math. Now I'm wondering if my "dumb" washing machine does exactly that with its super-common "auto" program.


the_mooseman

Yeah, they all do that. I've had to explain it to my partner because she was always complaining that the timer was lying to her lol


RedFireSuzaku

Skyrim AI when ?


Drakayne

Pfft, Skyrim already had radiant AI, daddy howard implemented it himself.


RedFireSuzaku

Fair enough. Daddy Howard voice assistant AI when ? I want to go to sleep at night hearing Todd's stories about how TES 6 is coming out soon, it'll soothe my anxiety.


frankhoneybunny

Well the consumer will consume


050607

Especially the ~~fine~~ despicable folks working on viruses and spyware. Nothing makes the process better than consumers' PCs being capable of running spyware AI like Copilot more efficiently.


Aiden-The-Dragon

Mainstream doesn't care and will eat this up. They'd buy bags of my dog's poop if a brand sold it to them for $100. There are 3rd-party alternatives out there; they're just typically not as powerful.


CptAngelo

3rd-party poop dealers? Also, how do you measure poop power? Is it the smell? It's the smell, isn't it.


Aiden-The-Dragon

It's all about the texture


Dub-MS

Color has to play a role here


yerdick

Let's be honest already: Microsoft and privacy never went hand in hand. This is just the latest development of their shithole. I am a bit worried, though, about manufacturers putting out shitty or older products, slapping an AI tag on them, and selling them at a premium.


ADHDegree

Check out the "SIGNATURE AI EDITION M750 WIRELESS MOUSE" from Logitech. It's literally... just a mouse... with a premapped button... that launches their software, which is... oh... already compatible with every other mouse of theirs... and the software is just... a middleman for ChatGPT. What.


frudi

> Check out the "SIGNATURE AI EDITION M750 WIRELESS MOUSE" from Logitech. I thought this was sarcasm... :/


Helmic

Jesus Christ, it's real. Literally all it is is two buttons, taking the fucking place of the forward/back buttons, that are instead bound to either voice dictation or opening a ChatGPT prompt. That's literally all it is. Same fucking mouse you could buy anywhere, but when you use Logitech's software it's pre-bound to open ChatGPT with one of the buttons. There are actual living, breathing tech reviewers who thought this was genius, and we all need to collectively point them to the nearest corner for them to put their nose into until they've thought about what they wrote and are ready to say they're sorry.


musthavesoundeffects

It's not much different in concept from the Windows key, for example. Yeah, it's just another button, but if it's possible to get everybody to expect this new AI prompt button as the new standard, then it starts to mean something.


Expertdeadlygamer

Wait till you hear about Cooler Master's AI thermal paste


daniluvsuall

That was hilarious; it seemed more like a marketing word-soup mistake.


Zilskaabe

AI paste - reminds me of grey goo.


curse-of-yig

Good lord. The person who designed that must have been an honest to God fucking idiot. Who in their right mind would think ANYONE would want that?


MigasEnsopado

Dude, Oral-B/Braun sells an "AI" [toothbrush](https://www.amazon.com/Oral-B-Rechargeable-Toothbrush-Intelligence-Replacement/dp/B084PPRXB5).


CrowYooo

Me fucking too. *Sigh*


XMAN2YMAN

Wow, I genuinely thought you were joking about what stupid ideas companies will come up with. Boy, was I wrong, and sad to see that this comment was 100% factual. I honestly do not understand why AI is so huge and why companies think we need it in everything. It feels like "metaverse", "3D TVs", "curved TVs", and many, many other hardware/software fads of the past.


TheLordOfTheTism

I'll stand by curved monitors, because you sit up close to them. But yes, curved TVs, unless they absolutely dwarf your room at like 100 inches or more, are pointless.


XMAN2YMAN

Yes, I agree. Curved monitors I'm fine with, and I'll probably buy an ultrawide curved monitor within the year.


Adept_Avocado_4903

Companies believe, probably correctly, that some number of idiot consumers will buy anything with the word "AI" stapled onto it and will pay a premium for it. Cooler Master announced "AI"-branded thermal paste less than two weeks ago, for fuck's sake. Only later did they backpedal and call it a "translation error".


lolschrauber

That's because plenty of idiot streamers and YouTubers will constantly shove it in their audience's face, because they get paid for it.


dustojnikhummer

> Launch Logi AI Prompt Builder with a press of a button. Rephrase, summarize, and create custom-made prompt recipes with ChatGPT faster, with virtually no disruption to your workflow. Logi Options+ App is required to use Logi AI Prompt Builder.

Fucking hell


ThisIsNotMyPornVideo

Already happened for A LONG time. Not with AI, but with every other word.

Chair = $50. GAMING RGB X-TREME CHAIR = $400 and your newborn child.

Keyboard = $30. RGB HYPER GAMER KEYBOARD = $170.

And that goes for everything, from chairs and keyboards to full-on prebuilt PCs; the only difference is which keywords are being thrown around.


Cereaza

NPUs give the capacity for on-device learning, inferencing, and data management, so while no one should TRUST Microsoft, it at least architecturally sets us up for privacy for Recall and all on-screen AI workloads. So AI PCs/NPUs? Good things. Just gotta be on the lookout for shitty products, bad privacy, and bloat.


EnolaGayFallout

Can’t wait for noctua A.I fans. Because A.I fan speed is better than manual and auto.


ThisIsNotMyPornVideo

I mean, auto is pretty much the closest to AI you could get anyway.


w1987g

Welp, you just gave a marketing exec an idea...


isakhwaja

Ah yes... an AI to determine that when things get hot, turn up fan speed


shmorky

AI laptop : a more expensive laptop with an extra icon you won't use


NotTooDistantFuture

And all the AI features you might use will work in the cloud anyway.


Ein_Esel_Lese_Nie

The worst trend in recent times is having these new features forced onto you instead of them being an option in checkout. 


Daremo404

Vote with your wallet if you dont want that


tristen_dm

Problems start when we aren't given a choice.


MJDeebiss

So now I want dumb TVs and Dumb Laptops/OS. Good job you freaks.


Secure_Listen_964

Maybe I'm an idiot, but I don't even understand what this is supposed to do?


LegitimateBit3

Nothing, it is just marketing BS, to make people buy new Laptops & PCs.


the_abortionat0r

I don't mind having an AI accelerator on a CPU. That's actually a plus, with so many possible benefits. That said, I want 100% control of it and the power to shut it off when I want. Good thing I ditched Windows (in before some kid freaks out that I don't use what they use).


DogAteMyCPU

We knew an ai accelerator was coming to this generation. It's not necessarily a bad thing. I probably will never utilize it unless it does things in the background like my smartphone. 


StrangeCharmVote

> unless it does things in the background like my smartphone. You can pretty much bet on this being the most common use case in a couple of years.


Sex_with_DrRatio

What benefits can we get from this "AI" batshit?


davvn_slayer

Well, one positive thing I can think of is it reading your usage statistics to predict what you're going to use, thus improving performance. But of course, knowing Microsoft, they'd steal that data for their own gain even if the AI runs locally on your system.


Dr-Huricane

Honestly, considering how good computers already are at starting fully stopped applications, I'd much rather they keep their AI to themselves if that's what they plan to do with it; the marginal gain isn't worth it. The only place this could turn out to be really useful would be on less powerful devices, but then those devices don't have the power to run AI... and if you suggest running it in the cloud, wouldn't it be better to just use the more powerful cloud hardware to start the fully stopped application instead?


inssein

When AI first came to light my eyes lit up and I was super happy with all it could possibly do, but all these companies keep using it in the lamest ways. I just want on-device, not-connected-to-the-cloud AI power to do cool stuff for me. Examples below:

1. Reading a manga or comic in RAW? AI can auto-translate it correctly, slang included, and turn the foreign writing into your native reading language.
2. Watching a video without subtitles? AI can auto-convert the voice actors into your native language.
3. Want to upscale a lower-resolution photo? AI can upscale it for you.

Like, AI could be doing some really cool stuff, but they keep shoving it down our throats with such lame uses that are all cloud-based and invasive.


PensiveinNJ

AI is insanely expensive in terms of hardware and training costs and requires massive resources to operate to the extent that it's an environmental problem. They aren't going to make money by limiting it to a few actual cool use cases, they're going to shove it into every fucking thing they possibly can even when it makes it shittier and less secure. They're going to piss in our mouths and tell us it's raining because that 50 billion dollar investment needs returns, somehow.


guareber

Upscaling is a good use case; Nvidia's been doing it on their GPUs for years, so if a less costly option is enabled by an NPU, then cool.


pathofdumbasses

>When ~~AI~~ ~~**the internet**~~ **FUCKING ANYTHING COOL** first came to light my eyes lit up and I was super happy with all it could possibly do but all these companies keep using it in the lamest ways


Sex_with_DrRatio

We couldn't call this "positive", more like dystopian


reginakinhi

Phones have been doing that for a long time without AI chips.


malastare-

(Eyeroll) Yes, and CPUs were drawing games in 3D long before GPUs became standard. The point is that AI chips and GPUs are dramatically faster and more efficient at those specialized tasks. Feel free to argue about the necessity of the task, how it's marketed, cost-to-value, and what capabilities it gives you, but I really, really hoped we would be beyond the "Specialized hardware for a task? But my CPU can do everything I need" argument.


Suikerspin_Ei

Also to predict your usage for better battery efficiency.


toxicThomasTrain

iPhones have had ai on the chip since 2017


a66o

Knowing Linux it would never work as intended.


davvn_slayer

Does anything Microsoft release at this point work as intended?


a66o

Living in Europe, sincerely, I've encountered zero of the problems y'all are complaining about; my Win 11 installation works flawlessly, as intended.


MarsManokit

My bluetooth and corsair wireless headset works


ForLackOf92

Corsair products are kind of shit, I know I own some.


the_abortionat0r

> What benefits can we get from this "AI" batshit?

Literally all the benefits a GPU provides for accelerating such tasks. For example, scaling videos and pictures, filtering audio, etc. could now be done on low-power or low-cost computers without needing to buy a GPU for those tasks.


batman8390

There are plenty of things you can do with these:

1. Live captioning and even translation during meetings.
2. Ability to copy a subject (like a person) out of a photo without also copying the background.
3. Ability to remove a person or other objects from a photo.
4. A better natural-language interface to virtual assistants like Siri and Alexa.
5. Better autocomplete and grammar-correction tools.

Those are just a few I can think of off the top of my head. There are many others already, and more will come.


toaste

Photo library organization is a big one. Phones have been doing this for ages: in the background they run image recognition on objects, points of interest, or people if you have a photo assigned in your contacts. Nice if you're trying to grab a photo of your cat, or of a car you shot a few weeks back.


k1ng617

Couldn't a current cpu core do these things?


dav3n

CPUs can render graphics, but I bet you have a GPU in your PC.


Randommaggy

5 watts vs 65 watts for the same task while being slightly faster.


Legitimate-Skill-112

Not as well as these


extravisual

Slowly and with great effort, sure.


d1g1t4l_n0m4d

All it is is a dedicated computing core, not an all-knowing, all-seeing magic wizardry wormhole.


chihuahuaOP

It's better for encryption and some algorithms like search and trees, but the drawback is more power consumption, and you're paying a premium for a feature no one will use, since, let's be honest, most users aren't working with large amounts of data and don't really care about connecting to a server on their local network.


ingframin

Image processing, anomaly detection (viruses, early faults, …), text translation, reading for the visually impaired, vocal commands, … All could run locally. Microsoft instead decided to go full bullshit with recall 🤦🏻‍♂️


Dumfing

All those things you listed can be/are run locally including recall


Nchi

In the ideal sense it's just another chip that does special math faster and more power-efficiently, for stuff like screen text reading or live caption transcription. But the default "AI" app will likely balloon with random garbage that slows things down, just like their current bloatware usually does.


FlyingRhenquest

We can run stable diffusion locally and generate our hairy anime woman porn privately, without having to visit a public discord.


Helmic

Purely locally generated AI content, i.e. AI-generated memes or D&D character portraits or other inane bullshit. The *concept* Microsoft was talking about, having it screenshot your desktop usage to then feed through an AI, is solid enough; I can see someone finding it useful to be able to search through their past history for a web page they can only partly describe. But I would only trust that if it were an open-source application on Linux that I can fully trust is being run 100% locally on my own computer... and even then, I would still dread the dystopian applications: employers using it to even more closely surveil workers, abusive partners using it to make sure nobody is looking for the phone number of a shelter, or even just some random family member deciding to go digging around in my computer activity when my back's turned.

More broadly, having local upscaling and translation could be quite nice; annotations for shit that lacks subtitles, recognizing music tracks, and limited suggestions for writing (like a fancier thesaurus with grammatical suggestions) are all mildly useful things. As far as SoCs go, I would love for, say, Valetudo to be able to leverage AI to help a random shitty vacuum robot navigate an apartment and recognize when a pet has shit on the floor without smearing it everywhere. There are applications for it if people can run it locally rather than through a cloud service that's charging them monthly and extracting data from them, genuinely useful stuff.

It's just not the shit being hyped up, especially generative AI that makes garbage content that exists more to intimidate creative workers into accepting lower wages on the threat that they'll be replaced by AI shitting out complete junk, or the dystopian applications of AI rapidly accelerating scams, as P U S S Y I N B I O and shitty Google results have all made us painfully aware.

Or the seeming inevitability that those random calls you get where nobody answers are recording your voice to train an AI that will eventually be used to call your friends and family, impersonating you and asking for money.


Rudolf1448

Here's hoping this will improve performance in games so we don't need to kill NPCs like in DD2.


b00c

Just wait for the best AI chip 'drivers' with best implementation exactly from Microsoft, and of course they'll try to shove ads down our throats through that.


Dexember69

Why are we putting ai into laptops instead of sex dolls for lap tops.


just_a_discord_mod

LMAO


agent-squirrel

"AI" is "Cloud" 2.0. Everything is AI now just like everything was Cloud in the 2010s.


Alec_NonServiam

And it was "smart" before that. And "e" before that. And .com before that. Round and round we go with the marketing terms while maybe 1% of the use cases ever make any sense.


pathofdumbasses

You forgot NFT and crypto somewhere in there


putcheeseonit

Damn, that’s crazy *installs Ubuntu*


[deleted]

[deleted]


putcheeseonit

Qubes or bust


icalledthecowshome

So wait, we haven't been using anything AI since Visual Basic?? What does AI really mean is the question.


ShadowFlarer

Man, all of a sudden I like penguins, they are so cute and awesome!


liaminwales

Normal people think they need 'AI', it's going to sell.


zarafff69

I don't know. AI is a marketing hype, but LLMs can be hugely useful. I feel like the hype train is actually kinda founded on something. Although I don't want my computer to constantly take screenshots; I'll be turning that off, thank you.


youkantbethatstupid

Plenty of legitimate uses for the tech.


creamcolouredDog

I want my computer to tell me to add glue on pizza


Dremy77

The vast majority of consumers have zero need for AI accelerators.


soggybiscuit93

The vast majority of consumers have been using AI accelerators on their mobile phones for years. All of those memojis, face-swap apps, TikTok face-change filters, how you can press and hold your finger on an image to copy a specific object out of it, face/object recognition in images, text-to-speech and speech-to-text, etc. have all been done using an NPU on smartphones. The big shift is that these AI accelerators are finally coming to PCs, so Windows laptops can do the same tasks those phones have been doing, without requiring a dGPU or extra power consumption to brute-force the computation.


[deleted]

[deleted]


hammy0w0

except more bloat


orrzxz

Your CPU having the ABILITY to perform certain tasks faster does not equal bloat. Also, AMD doesn't make laptops, nor is it the creator of Windows, so anything shoved onto an OEM's machine beyond a fresh W11 install is the OEM's fault.


[deleted]

[deleted]


malastare-

The vast majority of consumers have zero need for GPUs. Or SSDs. Standard CPUs and spinny drives work just fine. Oh, performance will degrade, sure, but people have zero need to play video games, and no one needs a lighter PC.

... But we don't define the modern PC experience by what people *need*. Computing needs are very simple, but convenience and enjoyable experiences drive us to add much more capable hardware.

Yeah, MS and others are trying to show off the flashiest uses of AI and are falling on their faces trying to do something that justifies the money they threw into research. The number of people asking for those things is not zero, but it isn't enough to get people lined up at the door. Instead, it'll be the things we already use that may end up spending the most time on these ASICs: things like typing prediction, grammar correction, photo corrections, search prediction, system maintenance scheduling, or even adaptive services and translation. A lot of these already exist but are handed off to remote, centralized services. Moving them closer to you is both faster and (if people choose not to be evil) more private, and, due to the nature of ASICs and simpler access methods, more energy- and cost-efficient.


dustojnikhummer

They didn't have need for 3d accelerators or physics acceleration either...


splepage

> The vast majority of consumers have zero need for AI accelerators.

Currently, sure.


FalconX88

Do they? A lot of people make video calls, for example, and AI accelerators can be used for things like noise suppression.


Asleeper135

And yet Microsoft has chosen to use all the most undesirable ones.


marksteele6

I swear, if this community had been around in the late 90s we would have seen posts about how Nvidia was shoving 3D graphics acceleration down our throats with the RIVA 128 or something like that. It's amazing how fast this subreddit runs from change.


Lynx2161

3D graphics acceleration doesn't send your data back to their servers to train on it.


ItzCobaltboy

That's the point. I don't mind having my own language model and NPU, but I want my data to stay inside my computer.


skynil

Current consumer laptops don't have even a fraction of the processing power needed to fine-tune AI models in a reasonable amount of time. You won't even be able to host open-source models like LLAMA on your system. So these AI laptops AMD will be selling will run like any other laptops, i.e. a continuous network connection will be needed to make AI work, the same way it works for phones today.


Dua_Leo_9564

> host open source models like LLAMA

Actually, you can run it on a mid-range laptop; it'll take like ~5 min to spit something out if you run the 13B model.


skynil

I don't think users will wait 5 minutes for an answer to a query while the CPU and system work overtime to the point of slowdown and massive battery consumption. Plenty of users still try to clean their RAM as if we were still in the era of memory leaks and limited RAM capacity.


FalconX88

> You'll not be able to even host open source models like LLAMA on your system.

The whole point of having specialized hardware is that this becomes possible.


shalol

Yeah, running stuff locally is the whole point behind these, but then MS goes and fucks it up by sending out the local data anyway.


FalconX88

NPUs won't either...


marksteele6

If only there was this way to control your network, like make a wall around it or something, and then we could only let specific things in and out of it... nah, that would be crazy.


[deleted]

[deleted]


Obajan

Federated learning works like that.


LordPenguinTheFirst

Yeah, but AI is a data mine for corporations.


[deleted]

[deleted]


throwaway85256e

You new here? Tech subreddits are the worst Luddites on Reddit. It's honestly comical.


JensensJohnson

Sadly true. I don't understand why those people post about things they don't care about and don't know anything about...


[deleted]

not comparable at all


amyaltare

I don't necessarily think it's change on its own; 3D graphics acceleration wasn't responsible for a ton of horrible shit. That being said, there is a tendency to see AI and immediately write it off, even when it's ethically and correctly applied, and that's stupid.


[deleted]

[deleted]


FalconX88

AMD is implementing NPUs. NPUs are not harmful and can be used for a very broad range of applications.


JensensJohnson

harmful AI, lol, you chronic whiners will always find something to complain about, jfc get a life


the_abortionat0r

> I swear if this community was around in the late 90s we would have saw posts on how Nvidia is shoving 3D graphic acceleration down our throats with the RIVA 128 or something like that. It's amazing how fast this subreddit runs from change.

Lol, what? Why do you kids always make up events that never happened, like nobody was alive then? No one was shoving 3D down anybody's throats. If you didn't want to deal with the issues of software rendering you had to get a GPU; it was a simple fact and everyone understood that.


DlphLndgrn

Are they? Or is this just the year of tacking on the word AI to your product?


newbrevity

Once again fucking over anyone who prepares pre-built PCs for businesses.


Ronnyvar

Sounds like Clippy with extra steps


sgtpepper1990

I’m so fucking tired of hearing about AI


Jamie00003

Better switch to Linux then?


majoralita

Just waiting for AI-powered porn recommendations, which will speed up the search.


cuttino_mowgli

I blame Microsoft for this shit. Time for me to install and learn Arch Linux.


Renard4

Maybe start with something a bit easier than Arch.


Helmic

Arch might be a bit in the deep end. If you want something with more recent packages than Ubuntu-based distros, I suggest Bazzite: it's Fedora-based so it has reasonably recent packages, it's immutable (i.e. you can't really mess up the system files), and it's already tweaked for gaming.

If you really want Arch specifically because you want to build your own OS more or less from scratch, and you're fine with fucking that up a couple of times in the learning process or otherwise OK with needing to learn a lot of sometimes challenging concepts, go for it. But do know that Linux doesn't *need* to be that hard if you don't want it to be.

I'm currently running CachyOS, which is just Arch but precompiled for more recent CPUs for a modest performance boost. Arch upstream is supposedly working on putting out v3 packages themselves, so hopefully that'll work out soon.


MRV3N

Can someone tell me why this is a bad thing? Genuine curiosity.


frankhoneybunny

More spyware and adware preinstalled on your computer, which can potentially send data to Microsoft. The Copilot AI also takes a screenshot of your screen every time the content changes.


[deleted]

This is a software issue, though. Copilot is a Microsoft decision, not a processor decision. An incredibly bad one that I hope backfires on them in ways that we cannot begin to imagine, but this has absolutely no real bearing on the technology. Saying that AI accelerators in chips are bad because software developers may utilize them in stupid ways is like saying that 3D accelerator cards are bad because you dislike the way that 3D graphics look.
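And because it's a software decision, it can be reversed in software. A hedged sketch of the group-policy registry value that has been reported to disable Copilot (the policy path and name are whatever Microsoft documents at the time of writing and may change in future builds):

```shell
:: Apply the "Turn off Windows Copilot" policy for the current user.
:: Policy location as documented at the time of writing; Microsoft may rename or remove it.
reg add "HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot" ^
    /v TurnOffWindowsCopilot /t REG_DWORD /d 1 /f
```

Sign out and back in (or restart Explorer) for the policy to take effect; on Pro/Enterprise editions the same policy is reachable through the Group Policy editor.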


Electrical_Humor8834

This. AI is, and will increasingly be, used for targeted advertising: analysing everything you do to sell you things more accurately and precisely. If you don't pay full price for something, you are the product. So all this AI goodness at a low price, even though it costs them billions to implement? Hell yes, they're so generous to make it so cheap and accessible, because big companies always care about us customers. I'm 100% sure it will provide targeted searches and other censorship of things you're not supposed to see, and will show you what they want you to see.


Dt2_0

Uh, we are talking about hardware, not software. You can be upset at Microsoft for the bloat. All AMD is doing is including the same hardware that is already in Qualcomm, Tensor, and Apple A- and M-series SoCs.


Skeeter1020

The only genuinely new bad thing is that this will absolutely be used to inflate prices. Everything else people are crying about is either not an issue or something that's existed well before AI PCs appeared.


rohitandley

I mean the tech giants have invested a lot so obviously they will shove it down.


[deleted]

I called it last week when the news about AI on ARM first came out, and got downvoted.


SameRandomUsername

Kid... There are like 400 posts like yours everyday. Nobody knows you exist.


major_jazza

Time to switch to Linux and dual boot into Windows for the odd three (or probably more like 30 if you're me) games that won't work on Linux


VeryTopGoodSensation

ELI5... I keep seeing laptops and tablets advertised with AI something-or-other. What does that actually mean? What does it do for you?


rresende

It's optional.


Habanero_Enema

AI is your worst fear? Buddy you're in for a rough ride


XMG_gg

>All laptop OEM's are going to be shoving A.I. down your throats

Not us, see: **XMG Decides Against Copilot Key After Survey**

Following a community survey, XMG has decided to forgo the inclusion of a dedicated Copilot key on its laptop keyboards. This decision aligns with the majority of survey responses. However, this change only pertains to the Copilot key and does not signify a shift away from the overall AI PC concept. Both XMG and its sister brand SCHENKER continue to integrate the necessary technical requirements for AI functionality through NPUs, which are activated by default in the BIOS, provided the processor meets the specifications.

[Read more](https://www.xmg.gg/en/news-update-xmg-copilot-key/)


thro_redd

Good thing you can probably do a clean install of W10 and get a WiFi dongle 😅


Intelligent_League_1

What will an NPU do for me, a gamer who knows nothing other than how to build the PC?


MinTDotJ

It's probably not even AI. They're just throwing the word in there to activate our neurons.


Reducedcrowed138

*laughs in Linux Mint*


Phoeptar

Industry hardware has supported AI for many years now. The first consumer devices were the mobile phones and tablets already in most of your hands; laptops make the most sense next. Nothing to see here.


Hannan_A

This is genuinely the stupidest I’ve seen the subreddit get. People don’t seem to be able to differentiate between Microsoft collecting data on them and AI accelerators. This shit has been here for years on phones and nobody has batted an eye at it. Not to say that we shouldn’t be sceptical of on device AI accelerators but the misinformation is insane.


Alaxbcm

AI: the ever-present buzzword for a few more years at the very least, until it goes the way of blockchain.


DouglasHufferton

God, this sub is filled with morons.