CHARLES: A C++ fixed-point library for Photonic-Aware Neural Networks

Paolini, Emilio; De Marinis, Lorenzo; Maggiani, Luca; Cococcioni, Marco; Andriolli, Nicola
2023-01-01

Abstract

In this paper we present CHARLES (C++ pHotonic Aware neuRaL nEtworkS), a C++ library that provides a flexible tool to simulate the behavior of Photonic-Aware Neural Networks (PANNs). PANNs are neural network architectures aware of the constraints imposed by the underlying photonic hardware, chiefly the low equivalent precision of its computations. For this reason, CHARLES performs inference with fixed-point computations, while supporting both floating-point and fixed-point numerical formats for training. In this way, we can compare the effects of quantization in the inference phase when training is performed on a classical floating-point model versus a model exploiting high-precision fixed-point numbers. To validate CHARLES and identify the numerical format best suited for PANN training, we report simulation results on three datasets: Iris, MNIST, and Fashion-MNIST. Fixed-point training is shown to outperform floating-point training when inference is executed at bitwidths suitable for photonic implementation. Indeed, training in the floating-point domain and then quantizing to lower bitwidths causes a very high accuracy loss, whereas training with fixed-point numbers significantly reduces the accuracy loss due to quantization. In particular, on the Iris dataset fixed-point training achieves performance similar to floating-point training, while on MNIST and Fashion-MNIST fixed-point training attains accuracies of 90.4% and 68.1% using only 6 bits, against just 25.4% and 50.0% for floating-point training at the same bitwidth.
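To make the fixed-point quantization discussed in the abstract concrete, the sketch below shows how a trained weight is mapped onto a signed fixed-point grid of a given bitwidth. It is a minimal illustrative example, not CHARLES's actual API: the function quantize_fixed, its Qm.f parameterization, and the sample weight value are assumptions chosen for illustration.

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <iostream>

// Hypothetical helper (not part of CHARLES): quantize a real value to a
// signed fixed-point grid with `bits` total bits, `frac` of them fractional
// (Qm.f format), using round-to-nearest and saturation at the range limits.
double quantize_fixed(double x, int bits, int frac) {
    const double scale = std::pow(2.0, frac);
    const std::int64_t qmax = (std::int64_t{1} << (bits - 1)) - 1; // e.g. +31 for 6 bits
    const std::int64_t qmin = -(std::int64_t{1} << (bits - 1));    // e.g. -32 for 6 bits
    std::int64_t q = std::llround(x * scale);
    q = std::clamp(q, qmin, qmax); // saturate instead of wrapping around
    return static_cast<double>(q) / scale;
}

int main() {
    const double w = 0.7391; // an example trained weight (made up for illustration)
    std::cout << "6-bit  (Q1.5):  " << quantize_fixed(w, 6, 5)   << '\n'; // prints 0.75: coarse grid
    std::cout << "16-bit (Q1.15): " << quantize_fixed(w, 16, 15) << '\n'; // ~0.7391: near-lossless
}

At 6 bits in Q1.5 format the grid step is 2^-5 = 0.03125, so a weight learned in floating point can move noticeably when snapped to the grid. This coarseness suggests why, as the abstract reports, floating-point training followed by post-hoc quantization loses so much accuracy at photonic-grade bitwidths, while training directly with fixed-point numbers lets the network adapt to the grid.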
Files in this product:

File: 1-s2.0-S0893608023001247-main.pdf (not available; request a copy)
Type: Pre-print/Submitted manuscript
License: Publisher's copyright
Size: 841.06 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11382/554251

Citations: Scopus 1