The original paper is in English. Non-English content has been machine-translated and may contain typographical errors or mistranslations (e.g., some numerals may appear as "XNUMX").
An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural networks (CNNs) powered by in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both column- and row-locality. For the in-memory computations of CNNs, only the relevant cells in an identical sub-array are accessed by 2-D read-out operations, which can hardly be implemented with conventional ReRAM cells. In this manner, the redundant (column or row) access of conventional ReRAM structures is prevented, eliminating unnecessary data movement when CNNs are processed in-memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x those of a state-of-the-art ReRAM architecture, respectively.
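As an illustration of the row-column-oriented access described above, the short Python sketch below contrasts the number of cells activated when fetching a k x k convolution window through conventional full-row ReRAM reads versus a 2-D sub-array read that touches only the relevant cells. The array width and kernel size are assumed values for illustration, not figures from the paper.

# Illustrative sketch only (assumed parameters, not the authors' implementation):
# it compares how many ReRAM cells are activated to fetch one k x k convolution
# window under conventional row-oriented reads versus the row-column (2-D)
# sub-array access described in the abstract.

def cells_touched_row_oriented(array_cols: int, k: int) -> int:
    """Conventional ReRAM: reading any element activates its whole row,
    so fetching k window rows touches k * array_cols cells."""
    return k * array_cols

def cells_touched_row_column(k: int) -> int:
    """Row-column-oriented access: only the k x k cells of the window's
    sub-array are read, so no redundant column accesses occur."""
    return k * k

if __name__ == "__main__":
    array_cols = 128   # assumed sub-array width (cells per row)
    k = 3              # assumed convolution kernel size
    conventional = cells_touched_row_oriented(array_cols, k)
    proposed = cells_touched_row_column(k)
    print(f"conventional row reads : {conventional} cells activated")
    print(f"2-D sub-array reads    : {proposed} cells activated")
    print(f"redundant accesses avoided: {conventional - proposed}")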
Yan CHEN
Hunan University, Nara Institute of Science and Technology
Jing ZHANG
Hunan University
Yuebing XU
Hunan University
Yingjie ZHANG
Hunan University
Renyuan ZHANG
Nara Institute of Science and Technology
Yasuhiko NAKASHIMA
Nara Institute of Science and Technology
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Yan CHEN, Jing ZHANG, Yuebing XU, Yingjie ZHANG, Renyuan ZHANG, Yasuhiko NAKASHIMA, "A ReRAM-Based Row-Column-Oriented Memory Architecture for Convolutional Neural Networks" in IEICE TRANSACTIONS on Electronics,
vol. E102-C, no. 7, pp. 580-584, July 2019, doi: 10.1587/transele.2018CTS0001.
Abstract: An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural networks (CNNs) powered by in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both column- and row-locality. For the in-memory computations of CNNs, only the relevant cells in an identical sub-array are accessed by 2-D read-out operations, which can hardly be implemented with conventional ReRAM cells. In this manner, the redundant (column or row) access of conventional ReRAM structures is prevented, eliminating unnecessary data movement when CNNs are processed in-memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x those of a state-of-the-art ReRAM architecture, respectively.
URL: https://global.ieice.org/en_transactions/electronics/10.1587/transele.2018CTS0001/_p
@ARTICLE{e102-c_7_580,
author={Yan CHEN and Jing ZHANG and Yuebing XU and Yingjie ZHANG and Renyuan ZHANG and Yasuhiko NAKASHIMA},
journal={IEICE TRANSACTIONS on Electronics},
title={A ReRAM-Based Row-Column-Oriented Memory Architecture for Convolutional Neural Networks},
year={2019},
volume={E102-C},
number={7},
pages={580-584},
abstract={An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural networks (CNNs) powered by in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both column- and row-locality. For the in-memory computations of CNNs, only the relevant cells in an identical sub-array are accessed by 2-D read-out operations, which can hardly be implemented with conventional ReRAM cells. In this manner, the redundant (column or row) access of conventional ReRAM structures is prevented, eliminating unnecessary data movement when CNNs are processed in-memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x those of a state-of-the-art ReRAM architecture, respectively.},
keywords={},
doi={10.1587/transele.2018CTS0001},
ISSN={1745-1353},
month={July},}
TY - JOUR
TI - A ReRAM-Based Row-Column-Oriented Memory Architecture for Convolutional Neural Networks
T2 - IEICE TRANSACTIONS on Electronics
SP - 580
EP - 584
AU - Yan CHEN
AU - Jing ZHANG
AU - Yuebing XU
AU - Yingjie ZHANG
AU - Renyuan ZHANG
AU - Yasuhiko NAKASHIMA
PY - 2019
DO - 10.1587/transele.2018CTS0001
JO - IEICE TRANSACTIONS on Electronics
SN - 1745-1353
VL - E102-C
IS - 7
JA - IEICE TRANSACTIONS on Electronics
Y1 - July 2019
AB - An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural networks (CNNs) powered by in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both column- and row-locality. For the in-memory computations of CNNs, only the relevant cells in an identical sub-array are accessed by 2-D read-out operations, which can hardly be implemented with conventional ReRAM cells. In this manner, the redundant (column or row) access of conventional ReRAM structures is prevented, eliminating unnecessary data movement when CNNs are processed in-memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x those of a state-of-the-art ReRAM architecture, respectively.
ER -