Today Raspberry Pi launched their new $130 AI HAT+ 2, which includes a Hailo 10H and 8 GB of LPDDR4X RAM.

With that, the Hailo 10H is capable of running LLMs entirely standalone, freeing the Pi's CPU and system RAM for other tasks. The chip runs at a maximum of 3W, with 40 TOPS of INT8 NPU inference performance, in addition to the equivalent of the 26 TOPS of INT4 machine vision performance on the earlier AI HAT with the Hailo 8.

In practice, it's not as amazing as it sounds.

You still can't upgrade the RAM on the Pi, but at least this way, if you do have a need for an AI coprocessor, you don't have to eat up the Pi's memory to run things on it.

And it's a lot cheaper and more compact than running an eGPU on a Pi. In that sense, it's more useful than the silly NPUs Microsoft forces into their 'AI PCs'.

But it's still a solution in search of a problem, in all but the most niche of use cases.

Besides feeling like I'm living in the world of the Turbo Encabulator every time I test AI hardware, I find the marketing of these things to be very vague, and the applications not very broad.

For example, the Hailo 10H is advertised as being used in a Fujitsu demo of automatic shrink detection for a self-checkout. That's certainly not a worthless use case, but it's not something I've ever needed to do. I have a feeling this board is meant more for development, for people who want to deploy the 10H in other devices, rather than as a total solution to problems individual Pi owners need to solve. Especially when it comes to the headline feature: running inference, like with LLMs.

Video

I also published a video with all the information in this blog post, but if you enjoy text more than video, scroll on past; it doesn't offend me!

LLM performance on the AI HAT+ 2

I ran everything on an 8 GB Pi 5, so I could get an apples-to-apples comparison, running the same models on the Pi's CPU as I did on the AI HAT's NPU. They both have the same 8 GB LPDDR4X RAM configuration, so ideally, they'd have similar performance...
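The post doesn't spell out the exact benchmarking setup, but for the CPU half of a comparison like this, a minimal sketch using llama-cpp-python on the Pi 5 might look like the following. The model file and prompt are placeholders, not the actual models tested; the Hailo side uses Hailo's own runtime and isn't shown here.

```python
# Minimal sketch: time CPU-only token generation on a Pi 5 with llama-cpp-python.
# The GGUF path and prompt below are hypothetical placeholders.
import time
from llama_cpp import Llama  # pip install llama-cpp-python

MODEL_PATH = "models/llama-3.2-1b-instruct-q4_k_m.gguf"  # placeholder quantized model
PROMPT = "Explain what an NPU does in one paragraph."

# n_threads=4 matches the Pi 5's four Cortex-A76 cores; n_gpu_layers=0 keeps inference on the CPU.
llm = Llama(model_path=MODEL_PATH, n_ctx=2048, n_threads=4, n_gpu_layers=0, verbose=False)

start = time.time()
result = llm(PROMPT, max_tokens=128, echo=False)
elapsed = time.time() - start

generated = result["usage"]["completion_tokens"]
print(f"{generated} tokens in {elapsed:.1f}s -> {generated / elapsed:.2f} tokens/sec")
```

Tokens per second from a run like this is the same figure you'd compare against whatever the NPU reports, which is what makes the same-model, same-RAM comparison apples-to-apples.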