Synchronous FPNNs: neural models that fit reconfigurable hardware
Abstract
Neural networks are considered naturally parallel computing models. However, the number of operators and the complex connection graph of standard neural models cannot be directly handled by digital hardware devices. Hardware implementations of neural networks must therefore reconcile simple hardware topologies with complex neural architectures. A theoretical and practical framework achieves this combination by applying configurable hardware principles to neural computation: Field Programmable Neural Arrays (FPNAs) lead to powerful neural architectures that are easy to map onto FPGAs, thanks to a simplified topology and an original data exchange scheme, without any significant loss of approximation capability. This report follows the overview of FPNAs given in report 99.R.019: it focuses on a family of FPNA-based neural networks that are particularly well suited to reconfigurable hardware.