A Survey of Domain-Specific Architectures for Reinforcement Learning

Title: A Survey of Domain-Specific Architectures for Reinforcement Learning
Authors: Rothmann, Marc
Porrmann, Mario
ORCID of the author: https://orcid.org/0000-0003-2886-8197
Abstract: Reinforcement learning algorithms have been very successful at solving sequential decision-making problems in many different problem domains. However, their training is often time-consuming, with training times ranging from multiple hours to weeks. The development of domain-specific architectures for reinforcement learning promises faster computation, decreased experiment turnaround time, and improved energy efficiency. This paper presents a review of hardware architectures for the acceleration of reinforcement learning algorithms. FPGA-based implementations are the focus of this work, but GPU-based approaches are considered as well. Both tabular and deep reinforcement learning algorithms are included in this survey. The techniques employed in the different implementations are highlighted and compared. Finally, possible areas for future work are suggested, based on the preceding discussion of existing architectures.
Citations: M. Rothmann and M. Porrmann, "A Survey of Domain-Specific Architectures for Reinforcement Learning," IEEE Access, vol. 10, pp. 13753-13767, 2022.
URL: https://doi.org/10.48693/241
Subject Keywords: Reinforcement learning; Computer architecture; Training; Neural networks; Optimization; Graphics processing units; Q-learning; Domain-specific architectures; machine learning; deep learning; reconfigurable architectures; FPGA
Issue Date: 26-Jan-2022
License name: Attribution 4.0 International
License url: http://creativecommons.org/licenses/by/4.0/
Type of publication: Single contribution in a scholarly journal [Article]
Appears in Collections: FB06 - Hochschulschriften

Files in This Item:
File: IEEEAccess_Rothmann_etal_2022.pdf
Description: Article
Size: 1.21 MB
Format: Adobe PDF

This item is licensed under a Creative Commons License.