Friday, May 31, 2019

SpArSe: Sparse Architecture Search for CNNs on Resource-Constrained Microcontrollers

https://arxiv.org/pdf/1905.12107.pdf

The vast majority of processors in the world are actually microcontroller units (MCUs), which find widespread use performing simple control tasks in applications ranging from automobiles to medical devices and office equipment. The Internet of Things (IoT) promises to inject machine learning into many of these everyday objects via tiny, cheap MCUs. However, these resource-impoverished hardware platforms severely limit the complexity of machine learning models that can be deployed. For example, although convolutional neural networks (CNNs) achieve state-of-the-art results on many visual recognition tasks, CNN inference on MCUs is challenging due to severe memory limitations. To circumvent the memory challenge associated with CNNs, various alternatives have been proposed that do fit within the memory budget of an MCU, albeit at the cost of prediction accuracy. This paper challenges the idea that CNNs are not suitable for deployment on MCUs. We demonstrate that it is possible to automatically design CNNs which generalize well, while also being small enough to fit onto memory-limited MCUs. Our Sparse Architecture Search method combines neural architecture search with pruning in a single, unified approach, which learns superior models on four popular IoT datasets. The CNNs we find are more accurate and up to 4.35× smaller than previous approaches, while meeting the strict MCU working memory constraint.
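To make the two constraints concrete, here is a minimal sketch (not the paper's actual algorithm) of the kinds of checks a search like SpArSe has to apply to every candidate CNN: magnitude-based weight pruning, and a test that the model fits an MCU's flash and working-memory (RAM) budget. The layer shapes, budgets, and function names below are illustrative assumptions, not values from the paper.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest magnitude."""
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

def fits_mcu(param_count, activation_sizes, flash_bytes, ram_bytes,
             bytes_per_value=1):
    """Parameters must fit in flash; peak working memory -- here approximated
    as the largest pair of adjacent activation buffers live at once during a
    purely sequential forward pass -- must fit in RAM."""
    model_size = param_count * bytes_per_value
    peak = max(a + b for a, b in zip(activation_sizes, activation_sizes[1:]))
    return model_size <= flash_bytes and peak * bytes_per_value <= ram_bytes

# Illustrative candidate: 40k int8 parameters, four activation maps, checked
# against a Cortex-M-class budget of 256 KB flash / 64 KB RAM (assumed numbers).
pruned = magnitude_prune([0.5, -0.02, 0.3, 0.01, -0.8], sparsity=0.4)
ok = fits_mcu(param_count=40_000,
              activation_sizes=[16_384, 8_192, 4_096, 1_024],
              flash_bytes=256 * 1024, ram_bytes=64 * 1024)
```

The key point the sketch captures is that model size (flash) and working memory (RAM) are separate budgets: pruning shrinks the former, while the architecture's activation shapes drive the latter, which is why the paper searches over both jointly.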
