Framework Desktop – An alternative to NVIDIA to run AI@Home?
Recently I came across a new processor from AMD, the Ryzen AI Max+ 395. I first saw it mentioned on the subreddit r/LocalLLaMA, which I follow for news about local large language models. The Reddit post in question claims the AMD Ryzen AI Max+ 395 is 2.2 times faster than an NVIDIA RTX 4090 in AI performance. While the comparison was not entirely fair – the model used far exceeded the VRAM of the RTX 4090 – this new processor is quite interesting, as it follows an approach similar to Apple's M-series line-up. It stands out for several reasons:
- Powerful CPU: It features 16 Zen 5 cores with 32 threads, capable of reaching a peak clock speed of 5.1 GHz.
- Integrated Graphics: The processor includes a potent integrated GPU, the Radeon 8060S with 40 RDNA 3+ compute units, making it comparable to some dedicated graphics cards.
- AI Capabilities: It incorporates a 50 TOPS XDNA 2 Neural Engine, significantly enhancing its AI processing capabilities.
- Versatility: With a configurable TDP of up to 120 W, it offers flexibility for various use cases, from high-performance laptops to compact workstations.
- Memory Support: The processor supports LPDDR5x-8000 RAM, enabling impressive memory bandwidth.
- Competitive Edge: In certain benchmarks, it has been reported to outperform Intel's Core Ultra 9 (Lunar Lake) in CPU performance and even surpass NVIDIA's GeForce RTX 4090 in specific AI workloads. However, this has to be taken with a grain of salt: as long as the model fits in 24 GB of VRAM, I assume any RTX 3090, 4090, or 5090 (with 32 GB) will be more capable.
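To get a feeling for why the 24 GB VRAM limit matters so much, here is a small back-of-the-envelope sketch. The figures are rough assumptions: only the model weights are counted, while KV cache, activations, and framework overhead come on top in practice.

```python
# Rough estimate of model weight memory at common quantization levels.
# Weights only -- KV cache and runtime overhead are NOT included.

def weight_memory_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of the model weights in GB (10^9 bytes)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for model, params in [("8B model", 8), ("70B model", 70)]:
    for label, bits in [("FP16", 16), ("Q8", 8), ("Q4", 4)]:
        gb = weight_memory_gb(params, bits)
        verdict = "fits" if gb <= 24 else "does NOT fit"
        print(f"{model} @ {label}: ~{gb:.0f} GB -> {verdict} in 24 GB VRAM")
```

Even at 4-bit quantization, a 70B model needs roughly 35 GB for its weights alone – beyond any consumer RTX card, but comfortable within 128 GB of shared memory.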
In essence, this processor combines excellent CPU performance with decent GPU performance (roughly RTX 4060 level) and the option to equip it with lots of memory. Now, where to get this beast? At the moment I'm aware of three options:
- The Asus ROG Flow Z13 is a tablet-style mobile gaming computer. While very impressive as such, it only offers 32 GB of shared RAM – not enough for larger models.
- The HP Z2 Mini G1a (who comes up with these names?) is a mini workstation that fully exploits the capabilities of the new processor, in particular by allowing up to 128 GB of memory. Downside: it is still "coming soon", and pricing is unknown.
- The Framework Desktop is the newest creation of Framework, a company known for its upgradeable, Lego-like build-your-own laptops. Last week they announced this mini desktop workstation, which can already be pre-ordered.
Framework Desktop

There are three configurations available; the most interesting one, in my view, again comes with 128 GB of shared RAM. Pricing starts at 1,999 USD for the 128 GB version. In the Euro zone, this starts at 2,383 EUR if you bring your own storage, system fan, operating system, and power cable. I personally could not resist and pre-ordered a "full" system with a 1 TB NVMe SSD, a fan, a power cable, the recommended USB-C and USB-A expansion cards, and some of the mandatory tiles for the front. In total: 2,544 EUR. Holy cow!

Did I really do this? Yes! First of all, the risk is low. For the moment, it is only a pre-order: Framework initially charged a 100 EUR deposit, and the rest will only be charged once the product is ready to ship. Until then, Framework states the order can be canceled at any time with a full refund of the deposit, which sounds very fair to me.
Second, I assume alternative products such as the HP Z2 Mini G1a will be in a similar price range. Apple devices such as the Mac Mini or Mac Studio are certainly capable alternatives as well. However, anything with 128 GB of memory (a MacBook Pro M4 Max or a Mac Studio M2 Ultra) sits somewhere above 5,500 EUR, which is simply too expensive for my private ambitions. Lastly, NVIDIA's Project DIGITS is on the horizon, but not much is clear about it yet; in particular, the price tag is unknown.
Third, I desperately tried to buy an NVIDIA RTX card. Even used RTX 3090s are unreasonably expensive, not to mention the 4090. The new RTX 5090 is effectively unavailable, and when it is, the pricing is ridiculously over the top. For this reason, more competition is clearly needed. It is a real pity that AMD has given up the race for high-end GPUs and, even worse, has not invested in a software stack as consistently as NVIDIA did with CUDA. However, I now have the impression that the wind is changing as NVIDIA increasingly focuses on business clients rather than consumers. Also – admittedly from reading alone, without any personal experience – AMD now seems reasonably well supported by my toolchain of choice, namely Ollama and ComfyUI. I will find out soon with the Framework Desktop and the AMD Ryzen AI Max+ 395. But I am also realistic: it will not beat my current RTX 3080 Ti on smaller models. What it will (hopefully) allow is running large models like Llama 3.3 70B or DeepSeek R1 70B in parallel with image generation models such as Flux.1, without workarounds like forced model unloading or garbage collection.
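On that last point: Ollama exposes a local REST API, and its documented `keep_alive` parameter controls how long a model stays resident in memory after a request. Here is a minimal sketch of how I imagine using it once memory is plentiful – the model tag and prompt are illustrative, and the request is not actually sent here since it requires a running Ollama server:

```python
import json
import urllib.request

# Sketch of a request against Ollama's local REST API (default port 11434).
# keep_alive=-1 asks Ollama to keep the model loaded indefinitely instead of
# unloading it after the default idle timeout -- exactly the setting that only
# makes sense once there is enough memory for several models at once.
payload = {
    "model": "llama3.3:70b",   # model tag as published in the Ollama library
    "prompt": "Why does memory bandwidth matter for LLM inference?",
    "stream": False,
    "keep_alive": -1,
}

def generate(payload: dict, host: str = "http://localhost:11434") -> str:
    """Send a generate request to a running Ollama server; return the answer."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# generate(payload) would contact the local server, so only the payload
# is printed in this sketch.
print(json.dumps(payload, indent=2))
```

With 128 GB of shared memory, a 70B text model and an image generation pipeline could in principle both stay loaded this way, instead of evicting each other.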
Long story short: I can't wait to play around with this mini beast. Unfortunately, I'll have to wait until Q3/2025.