Opposites attract, so that's probably why I stumbled upon a review of a monster Linux workstation made by System76 (https://www.jeffgeerling.com/blog/2025/ ... ows-arm-pc) while in the process of getting rid of Windows and having to transfer some PowerShell scripts to an ancient Shuttle XS25 with 4 GB RAM, running Debian 12 x86-64 on a mobile Intel Atom D525 CPU (2 cores, 4 threads, 1.795 GHz) released in 2010.
This Shuttle XS25 is the real opposite of a monster machine: it has a TDP of only 13 W, with two threads on each of its two cores. It still performs as it did in 2010 though, and it is completely silent because it does not need a fan. And yes, the PowerShell of 2025 runs on Debian, and on any Linux for that matter; I even have it running on Alpine Linux.
The workstation I mentioned contains an ARM CPU with 128 cores at 3.00 GHz and is equipped with 512 GB of ECC DDR4 RAM. I know, that is at the extreme other end of the spec range compared to my Shuttle, and not a workstation for the general public. This beast must have its uses, otherwise no one would buy it, but it did make me think about the massive over-specification of other modern consumer computers, the ones that do get manufactured for regular use.
Most of the computers in my network are far less powerful, and I have no complaints about performance or reliability. Are regular consumers really so demanding, or am I the one user that's "off"? As for the rest of my computer stuff: besides a set of HP EliteDesk 35 W Minis running a 2-node Proxmox cluster (no shared storage) and another set for backups, I use four RPi Zero 2 Ws as Pi-holes and nginx web servers, and one RPi Zero 1 with TVHeadend. They do their respective jobs admirably and I'm quite satisfied with them.
And the scripts I mentioned before all now run on the Shuttle without any problem. One of them runs an hourly rsync backup, 24/7, of all the computers in my network to the aforementioned backup set of HP EliteDesk 35 W Minis. Another script checks some financials every 5 minutes while the stock exchange is open, and some other scripts handle minor daily administration tasks.
The Shuttle also functions as a NAS for the LXCs in my Proxmox cluster, and as backup storage for the cluster as a whole. None of this prevents it from also running an fbi command that shows a random picture (out of a local directory of 1500 pictures) on a 12-year-old monitor, changing it every 15 minutes.
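The picture rotation boils down to picking one random file and handing it to fbi. A hedged sketch of that idea follows; the directory and filenames are stand-ins, and the fbi line is commented out because it needs a real framebuffer console.

```shell
#!/bin/sh
# Sketch of the random-picture rotation (stand-in directory and files,
# not the author's actual paths).
PICDIR="${PICDIR:-/tmp/pics-demo}"
mkdir -p "$PICDIR"
touch "$PICDIR/a.jpg" "$PICDIR/b.jpg"   # demo stand-in files

# Pick one random picture from the directory
PIC=$(find "$PICDIR" -type f -name '*.jpg' | shuf -n 1)
echo "selected: $PIC"

# On a real framebuffer console, run every 15 minutes from cron:
# fbi -a -T 1 --noverbose "$PIC"
```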
I find it astonishing that I can maintain all of this with some old hardware and the minimal CPU power and RAM of a few RPi Zeros. Even a modern but modest Intel N100 or the Raspberry Pi 5's Broadcom CPU would be overkill for my usage, so where is the need for all this spec overkill in consumer hardware coming from?
I guess it may have something to do with desktops getting heavier and browsers using ever more resources over the years, and with consumers wanting their machines to play YouTube and a fair number of games in good quality. Those were probably the main reasons in the past, even the recent past. But now there may be another main reason on top: a performance enhancement that's specific to AI-based operations. These operations need a particular kind of processor, the NPU (Neural Processing Unit), and while an NPU is not yet a default addition to the CPU-plus-GPU combo that is common nowadays, it will become one in the near future.
It is also the next serious enhancement in computational efficiency and overall performance in consumer computers. This particular push for extra performance is an industry wish and not, in the first place, consumer-driven. Its aim is to enable the deployment of the AI agents of the major companies. Local NPUs are necessary to let Google Gemini, Microsoft Copilot or Apple Intelligence analyze locally assembled data (e.g. browser history or timeline screenshots). Keep in mind that the analysis of local data may be instigated and administered by an AI agent, but it is the local NPU that does all the analyzing work, not the AI agent itself.
Though these local NPUs are not (yet) capable of running a local ChatGPT, they are powerful enough to analyze browser history, screenshots or any other set of images. The benefits could be significant. The result of an AI-chip analysis may be just a textual summary, and the raw data underlying that summary no longer has to be uploaded to company servers. The analysis summary could be enough, heavily cutting down on upload size and Internet traffic. Those servers can then also be scaled down, because smaller payloads need fewer processing resources.
So this is where I think the need for up-specs is coming from in 2025, and it does not make me particularly happy or bring me into a mood of tense anticipation of all the wondrous things these AI agents will enable me to do in the future. There are concerns with regard to privacy, to local data collection without consent, and to analyzing that data down to the level of an individual person and selling the results on to anyone who will pay.
Internet surveillance, or the collecting of local usage data such as browser history, keystroke logs and screen captures, has become almost unavoidable when using the operating systems from Microsoft or Apple or the search engine from Google, and that probably worries me more than most. But it goes beyond worry and becomes blatantly scary when some AI agent is able to have locally assembled data analyzed by the local NPU and send the results to the company servers, where all of it can be used at will. What could possibly go wrong, you ask? Ugh... I will now pretend to suffer a momentary but severe hearing loss, and leave it at that.
At the end of this tale, let's get back to the "me getting rid of Windows" subject, and think about where all the perfectly good Windows computers that Microsoft considers unfit for Win11 will end up. Mine will run some Linux distro (not sure which one yet), but I suspect most of the privately owned ones will become landfill after first being laid to rest a while in a drawer. Most of the business Win10 computers, though, will be sold to a refurbishing company, if not this October then next year's, and I am already looking forward to some good deals.
On a side note, my rsync backups cannot really be called backups, because all that happens is synchronization. In my case it protects against disk failure with a maximum possible loss of one hour of data, which is acceptable to me. I do, however, copy the complete sync to a separate location once a month, retaining 3 copies, so those are my real backups. And a suggestion for anyone using rsync like me: while rsync cannot sync between two remote locations, and I do just that, you can work around this restriction by using an sshfs mount point for one (or even both) of the remote sites, effectively turning it into a local path.
Statistics: Posted by HanDonotob — Thu Jul 31, 2025 2:32 pm