Date of Award


Document Type

Open Access Master's Report

Degree Name

Master of Science in Electrical Engineering (MS)

Administrative Home Department

Department of Electrical and Computer Engineering

Advisor 1

Zhuo Feng

Committee Member 1

Zhaohui Wang

Committee Member 2

Timothy Havens


Abstract
Deep learning is a trending topic widely studied by researchers, driven by the growing abundance of data and the meaningful results that can be obtained from it. The Convolutional Neural Network (CNN) is one of the most popular architectures used in deep learning. The Binarized Neural Network (BNN) is a neural network whose weights and activations are constrained to binary values. Neural networks have a large number of parameters, and overfitting is a common problem for these networks. Dropout is one solution to the overfitting problem: randomly dropping some neurons along with their connections helps prevent co-adaptations, which in turn reduces overfitting. Many researchers have analyzed the performance of CNNs and studied the effect of dropout on CNNs using datasets such as MNIST and CIFAR10. Factors such as dropout rate, dataset size, batch normalization layers, filter size, and the addition of a dropout layer have been studied for CNNs, but there is little literature on dropout and these factors in Binarized Neural Networks. This report provides a brief introduction to BNNs, the advantages of using dropout, and a performance comparison between BNNs and CNNs. A detailed description of the software packages, coding environment, algorithm flow, and deep learning framework is provided. A comprehensive analysis of the performance of BNNs and CNNs is carried out, and the BNN achieves near state-of-the-art results comparable to the CNN. The research demonstrates adding a dropout layer to a BNN for the MNIST and CIFAR10 datasets and shows that it might improve the baseline BNN's classification accuracy. Finally, the report investigates the effect of different factors, such as dropout rate, dataset size, batch normalization layer, filter size, and dropout layer addition, on BNNs.
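The two core ideas in the abstract, binarized activations and dropout, can be sketched in a few lines. The snippet below is a minimal NumPy illustration (not the report's actual implementation): `binarize` is the deterministic sign-based binarization commonly used in BNNs, and `dropout` is standard inverted dropout, which zeroes each unit with probability `rate` during training and rescales the survivors so the expected activation is unchanged. All function names here are illustrative.

```python
import numpy as np

def binarize(x):
    """Deterministic binarization used in BNNs: map each value to -1 or +1."""
    return np.where(x >= 0, 1.0, -1.0)

def dropout(x, rate, rng, training=True):
    """Inverted dropout: zero each element with probability `rate` and
    rescale the kept elements by 1/(1 - rate) so E[output] == E[input].
    At inference time (training=False) the input passes through unchanged."""
    if not training or rate == 0.0:
        return x
    keep = 1.0 - rate
    mask = rng.random(x.shape) < keep   # True for units that survive
    return x * mask / keep

# Tiny usage example: binarize some activations, then apply dropout.
rng = np.random.default_rng(0)
acts = rng.standard_normal((4, 8))
binary_acts = binarize(acts)                 # values in {-1, +1}
dropped = dropout(binary_acts, rate=0.5, rng=rng)  # ~half zeroed, rest scaled
```

In a BNN a dropout layer is inserted between existing layers exactly as in a CNN; only the activations it acts on are binary, so after inverted dropout with rate 0.5 the surviving units take values ±2 during training.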