2009 Poster Sessions: A Highly Scalable Restricted Boltzmann Machine FPGA Implementation

Student Names: Sang Kyun Kim, Lawrence McAfee, Peter McMahon
Advisor: Oyekunle Olukotun
Research Areas: Computer Systems
ABSTRACT:
Restricted Boltzmann Machines (RBMs) - the building block for the newly popular Deep Belief Networks (DBNs) - are a promising new tool for machine learning practitioners. However, future research into applications of DBNs is hampered by the considerable computation that training requires. We have designed a novel architecture and FPGA implementation that accelerates the training of general RBMs in a scalable manner, with the goal of producing a system that machine learning researchers can use to investigate ever larger networks. Our current (single-FPGA) design uses a highly efficient, fully pipelined architecture based on 16-bit arithmetic for performing RBM training on an FPGA. Single-board results show a speedup of 25-30X over an optimized software implementation on a high-end CPU.
Current design efforts focus on a multi-board implementation.
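The standard software algorithm for RBM training - the computation an accelerator like this one must pipeline - is contrastive divergence. A minimal NumPy sketch of one CD-1 update for a binary RBM is shown below; the function and variable names are illustrative and not taken from the poster, and the actual FPGA design uses fixed-point 16-bit arithmetic rather than the floating point used here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_vis, b_hid, v0, rng, lr=0.1):
    """One contrastive-divergence (CD-1) step for a binary RBM.

    W: weight matrix, shape (n_visible, n_hidden)
    b_vis, b_hid: visible and hidden biases
    v0: batch of binary visible vectors, shape (batch, n_visible)
    """
    # Positive phase: hidden probabilities and samples given the data.
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(v0.dtype)

    # Negative phase: one Gibbs step back to the visible layer,
    # then recompute the hidden probabilities.
    v1_prob = sigmoid(h0 @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)

    # Gradient approximation: <v h>_data - <v h>_model, averaged over the batch.
    batch = v0.shape[0]
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
    b_vis += lr * (v0 - v1_prob).mean(axis=0)
    b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid
```

The matrix products in the positive and negative phases dominate the cost, which is why a fully pipelined hardware datapath yields large speedups over a CPU.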


Bio:
Lawrence McAfee, PhD student in Electrical Engineering.
Sang Kyun Kim, PhD student in Electrical Engineering.
Peter McMahon, PhD student in Electrical Engineering.