The AI Infrastructure Stack
Sub-category 8.6

Edge AI & On-Device NPUs

Accelerators embedded in phones, laptops, cars, and cameras. This segment does not consume data-center capex, but it represents a parallel TAM.

Players

- Qualcomm (QCOM)
- Apple Neural Engine (AAPL)
- MediaTek (2454.TW)
- Samsung Exynos (005930.KS)
- Intel Lunar Lake NPU (INTC)
- AMD XDNA (AMD)
- Hailo (private)
- Ambarella (AMBA)
- Mobileye EyeQ (MBLY)
- SiMa.ai (private)

Analysis coming soon; this page is scaffolding for deeper research into Edge AI and on-device NPUs.