1. Neural Netw; 162: 531-540, 2023 May.
Article in English | MEDLINE | ID: mdl-36990002

ABSTRACT

In this paper we present CHARLES (C++ pHotonic Aware neuRaL nEtworkS), a C++ library that provides a flexible tool to simulate the behavior of Photonic-Aware Neural Networks (PANNs). PANNs are neural network architectures that account for the constraints imposed by the underlying photonic hardware, chiefly the low equivalent precision of the computations. For this reason, CHARLES uses fixed-point computations for inference, while supporting both floating-point and fixed-point numerical formats for training. This allows us to compare the quantization effects at inference time when training is performed on a classical floating-point model versus a model using high-precision fixed-point numbers. To validate CHARLES and identify the numerical format best suited to PANN training, we report simulation results on three datasets: Iris, MNIST, and Fashion-MNIST. Fixed-training is shown to outperform floating-training when inference is executed at bitwidths suitable for photonic implementation. Indeed, performing the training phase in the floating-point domain and then quantizing to lower bitwidths results in a very high accuracy loss, whereas training with fixed-point numbers significantly reduces the accuracy loss due to quantization to lower bitwidths. In particular, on the Iris dataset, fixed-training achieves performance similar to floating-training. Using only 6 bits, fixed-training reaches accuracies of 90.4% and 68.1% on the MNIST and Fashion-MNIST datasets, while floating-training reaches just 25.4% and 50.0% at the same bitwidth.
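The quantization step the abstract refers to (mapping trained weights onto a low-bitwidth fixed-point grid before inference) can be sketched in a few lines of C++. The snippet below is an illustrative example only, not the CHARLES API: the function name quantize, the symmetric value range, and the rounding/saturation scheme are assumptions; only the 6-bit setting mirrors the bitwidth reported above.

// Illustrative sketch (not the CHARLES API): symmetric fixed-point
// quantization of weights to a given total bitwidth.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

// Map x onto a signed fixed-point grid with `bits` total bits, assuming
// values lie in [-range, range]; returns the dequantized (real) value.
double quantize(double x, int bits, double range) {
    const double levels = std::pow(2.0, bits - 1) - 1.0;  // 31 levels for 6 bits
    const double scale  = levels / range;
    double q = std::round(x * scale);                      // nearest grid point
    q = std::max(-levels, std::min(levels, q));            // saturate to the grid
    return q / scale;                                      // back to the real domain
}

int main() {
    std::vector<double> weights = {0.73, -0.12, 0.05, -0.98};
    for (double w : weights)
        std::cout << w << " -> " << quantize(w, 6, 1.0) << '\n';
    return 0;
}

Assuming a signed symmetric representation, 6 bits leave only 31 positive levels, which conveys how coarse the inference grid is at the bitwidths for which the MNIST and Fashion-MNIST accuracies are reported.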


Subject(s)
Computers , Neural Networks, Computer , Computer Simulation