%0 Conference Proceedings
%T Power-Efficient Deep Neural Networks with Noisy Memristor Implementation
%+ Département Mathematical and Electrical Engineering (IMT Atlantique - MEE)
%+ Equipe CODES (Lab-STICC_CODES)
%+ University of Illinois at Urbana-Champaign [Urbana]
%+ École Polytechnique de Montréal (EPM)
%A Dupraz, Elsa
%A Varshney, Lav R.
%A Leduc-Primeau, François
%< peer-reviewed
%B ITW 2021: IEEE Information Theory Workshop
%C Kanazawa, Japan
%8 2021-10-17
%D 2021
%R 10.1109/ITW48936.2021.9611431
%Z Mathematics [math]/Information Theory [math.IT]
%Z Conference papers
%X This paper considers Deep Neural Network (DNN) linear-nonlinear computations implemented on memristor crossbar substrates. To address the case where true memristor conductance values may differ from their target values, it introduces a theoretical framework that characterizes the effect of conductance value variations on the final inference computation. With only second-order moment assumptions, theoretical results on tracking the mean, variance, and covariance of the layer-by-layer noisy computations are given. By allowing the possibility of amplifying certain signals within the DNN, power consumption is characterized and then optimized via KKT conditions. Simulation results verify the accuracy of the proposed analysis and demonstrate the significant power efficiency gains that are possible via optimization for a target mean squared error.
%G English
%2 https://imt-atlantique.hal.science/hal-03337122/document
%2 https://imt-atlantique.hal.science/hal-03337122/file/Dupraz21ITW.pdf
%L hal-03337122
%U https://imt-atlantique.hal.science/hal-03337122
%~ UNIV-BREST
%~ INSTITUT-TELECOM
%~ CNRS
%~ UNIV-UBS
%~ INSMI
%~ ENIB
%~ LAB-STICC
%~ LAB-STICC_IMTA
%~ IMT-ATLANTIQUE
%~ PRACOM
%~ INSTITUTS-TELECOM
%~ IMTA_MEE
%~ LAB-STICC_CODES_IMTA
%~ LAB-STICC_CODES
%~ LAB-STICC_T2I3
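
As a rough illustration of the layer-by-layer moment tracking described in the abstract, the following is a minimal sketch, not taken from the paper: it assumes a single linear layer y = (W + ΔW)x with i.i.d. zero-mean Gaussian perturbations of the weights (a stand-in for memristor conductance variations), and compares the analytic mean and variance under that assumed model with a Monte Carlo estimate. All variable names and the noise model are assumptions for illustration only.

```python
import numpy as np

# Assumed noise model (illustrative, not the paper's exact setup):
# y = (W + dW) x, where dW has i.i.d. zero-mean Gaussian entries of
# variance sigma2 representing conductance deviations from target values.

rng = np.random.default_rng(0)

n_in, n_out = 64, 32
sigma2 = 1e-3                       # per-entry conductance-noise variance (assumed)

W = rng.standard_normal((n_out, n_in)) / np.sqrt(n_in)
x = rng.standard_normal(n_in)

# Analytic second-order moments under the assumed model:
# E[y] = W x,  Var[y_i] = sigma2 * sum_j x_j^2  (entries of dW independent)
mean_pred = W @ x
var_pred = sigma2 * np.sum(x**2) * np.ones(n_out)

# Monte Carlo check of the analytic predictions
n_trials = 20000
samples = np.empty((n_trials, n_out))
for t in range(n_trials):
    dW = np.sqrt(sigma2) * rng.standard_normal((n_out, n_in))
    samples[t] = (W + dW) @ x

print("max |mean error|:", np.max(np.abs(samples.mean(axis=0) - mean_pred)))
print("max |var  error|:", np.max(np.abs(samples.var(axis=0) - var_pred)))
```

Cascading this kind of mean/variance bookkeeping through successive layers, and weighing it against the power spent amplifying signals, is the core of the analysis summarized in the abstract; the sketch only demonstrates the single-layer moment-matching step.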