The astonishing facts I learned from buying a new PC.

I recently bought a new PC, and the new “in thing” is having an NPU, a Neural Processing Unit. I had no idea what that was, so I looked it up… It turns out AMD and Intel had been asked to include a separate NPU on all their chips for “local LLMs and AI.” I guess some people run those locally. Well, AMD and Intel said no thanks, the GPU handles AI compute just fine. Then a year goes by, and suddenly every chip coming out this year has an NPU. I thought that was odd, since they’d publicly said it wasn’t needed. Well, OK, I guess I’ll factor NPU specs into my new PC. Microsoft’s Copilot AI says it needs 40 TOPS to run, and TOPS (trillions of operations per second) is the new NPU spec buzzword. I got a PC with 16 TOPS; I hate Copilot anyway, so they can suck it.
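To give a feel for what those TOPS numbers mean, here’s a rough back-of-envelope sketch. Everything in it is an illustrative assumption (the 2-ops-per-parameter rule of thumb, the 30% sustained utilization, the 7B model size), not a vendor spec, and it ignores memory bandwidth, which is often the real bottleneck for local LLMs:

```python
# Back-of-envelope: what an NPU's TOPS rating might mean for local LLM speed.
# All numbers are illustrative assumptions, not measured or vendor figures.

def tokens_per_second(tops, params_billion, utilization=0.3):
    """Rough compute-bound tokens/sec for decoding a dense LLM on an NPU.

    Assumes ~2 operations per parameter per generated token (one multiply
    plus one add), and that the NPU sustains only a fraction of its peak
    TOPS in practice (utilization). Ignores memory bandwidth entirely.
    """
    ops_per_token = 2 * params_billion * 1e9   # multiply-accumulate per weight
    effective_ops = tops * 1e12 * utilization  # sustained operations per second
    return effective_ops / ops_per_token

# My 16-TOPS machine vs Copilot's 40-TOPS floor, on a hypothetical 7B model:
print(f"16 TOPS: ~{tokens_per_second(16, 7):.0f} tokens/sec (compute ceiling)")
print(f"40 TOPS: ~{tokens_per_second(40, 7):.0f} tokens/sec (compute ceiling)")
```

These are compute-only ceilings; real on-device throughput would be lower, but the ratio shows why 40 TOPS became the marketing cutoff.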

I set up my new PC, and a week later all four of my PCs forced me to upgrade and reinstall Dropbox. Annoying, but OK, I guess. It took four days to reinstall, and every single file was re-uploaded and then re-downloaded. So I wondered why. Well, Microsoft now has new policies on encryption (OK, cool) and also on the future architecture of file indexing, etc. Wait, what was that last part…

Now on to the “astonishing” part. Dropbox’s future architecture will be AI-driven: your computer will do all the legwork compute, and their servers will just hold the files. OK, I guess. I wondered whether the others, like OneDrive, will do the same. Oh, they will? Hmmm. Then I found out about the “AI edge revolution”: in the background, all the software and hardware companies have been getting our PCs AND phones ready for THEM to do all the compute. Phones are actually ahead of PCs in TOPS power. So, you know how we’ve all been discussing whether OpenAI and the other AI companies are going to go bankrupt in x number of years? Well, that’s part of it. Every question you ask costs them a fraction of a cent in raw electricity and compute. If WE do that compute instead, it costs “us” a tiny fraction of battery power, “THEY” save billions in electricity costs, and the environmentalists can rejoice.
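The “fractions of a cent add up to billions” claim is easy to sanity-check with some quick arithmetic. Every number below is a made-up illustrative assumption (the query volume and per-query cost are guesses, not anyone’s published figures):

```python
# Back-of-envelope math on the cost offload described above.
# Both inputs are made-up illustrative assumptions, not real company data.

queries_per_day = 1_000_000_000   # assumed total daily queries across all users
cost_per_query = 0.003            # assumed server-side cost per query, in dollars

# What a provider would pay per year if all inference stays in data centers:
server_cost_per_year = queries_per_day * cost_per_query * 365
print(f"Provider's hypothetical annual inference bill: "
      f"${server_cost_per_year / 1e9:.2f} billion")

# If those same queries ran on users' NPUs instead, that bill would drop
# toward zero for the provider; each user pays a sliver of battery instead.
```

Even with these guessed inputs, a third of a cent per query at billion-query scale lands in the billions per year, which is the whole incentive for pushing compute to the edge.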

The AI revolution “IS” coming, and it includes the shift to “our” devices doing the legwork. The switchover has already begun, and over the next 12 to 24 months it will keep happening one update at a time, quietly in the background, until WE are the server farm that offsets billions in costs for each AI company. Once Skynet goes online, there is no turning back.
OK, well, maybe not that last part. :)