# Rethinking Binarized Neural Network Optimization

[arXiv:1906.02107](https://arxiv.org/abs/1906.02107) · License: Apache 2.0 · Code style: black

Implementation for the paper ["Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization"](https://arxiv.org/abs/1906.02107).

A poster illustrating the proposed algorithm and its relation to the previous BNN optimization strategy is included at `./poster.pdf`.

**Note:** Bop has now been added to [Larq](https://github.com/larq/larq), the open-source training library for BNNs. We recommend using the Larq implementation of Bop: it is compatible with more versions of TensorFlow and will be more actively maintained.
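If you switch to Larq, its documentation shows Bop being combined with a real-valued optimizer for the non-binary variables through a `CaseOptimizer`. The sketch below assumes that API (names such as `CaseOptimizer` and `is_binary_variable` come from the Larq docs, not from this repository), so check your installed Larq version for the exact signature:

```python
import tensorflow as tf
import larq as lq

# Assumed API from recent Larq releases: Bop updates the binary
# weight variables, while Adam handles all remaining real-valued
# variables (biases, batch-norm parameters, ...).
optimizer = lq.optimizers.CaseOptimizer(
    (lq.optimizers.Bop.is_binary_variable, lq.optimizers.Bop()),
    default_optimizer=tf.keras.optimizers.Adam(0.01),
)
# Pass `optimizer` to model.compile() on a Larq/Keras model as usual.
```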

## Requirements

The Python dependencies are declared in `setup.py` and installed automatically during the installation step below. You can also check out one of our prebuilt Docker images.

## Installation

This is a complete Python module. To install it in your local Python environment, `cd` into the folder containing `setup.py` and run:

```shell
pip install -e .
```

## Train

To train a model locally, you can use the CLI:

```shell
bnno train binarynet --dataset cifar10
```

## Reproduce Paper Experiments

### Hyperparameter Analysis (Section 5.1)

To reproduce the runs exploring various hyperparameters, run:

```shell
bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop \
    --hparams threshold=1e-6,gamma=1e-3
```

substituting the appropriate values for `threshold` and `gamma`.
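Here `gamma` is the adaptivity rate of the exponential moving average of gradients that Bop maintains for each binary weight, and `threshold` (τ) sets how strong that averaged signal must be before a weight is flipped. As a rough illustration of the update rule described in the paper (a minimal NumPy sketch; the function and variable names are ours, not the repository's):

```python
import numpy as np

def bop_step(w, grad, m, gamma=1e-3, threshold=1e-6):
    """One Bop update for binary weights w in {-1, +1}."""
    # Exponential moving average of the gradient (adaptivity rate gamma).
    m = (1 - gamma) * m + gamma * grad
    # Flip a weight when the averaged gradient exceeds the threshold
    # and agrees in sign with the weight (i.e. it pushes the weight
    # to switch sign); all other weights stay unchanged.
    flip = (np.abs(m) > threshold) & (np.sign(m) == np.sign(w))
    return np.where(flip, -w, w), m
```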

### CIFAR-10 (Section 5.2)

To achieve the 91.3% accuracy reported in the paper, run:

```shell
bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop_sec52
```

### ImageNet (Section 5.3)

To reproduce the reported results on ImageNet, run:

```shell
bnno train alexnet --dataset imagenet2012 --hparams-set bop
bnno train xnornet --dataset imagenet2012 --hparams-set bop
bnno train birealnet --dataset imagenet2012 --hparams-set bop
```

This should give the results listed below. Click on the tensorboard links to see training and validation accuracy curves of the reported runs.

| Network | Bop top-1 accuracy | TensorBoard |
| --- | --- | --- |
| Binary AlexNet | 41.1% | tensorboard |
| XNOR-Net | 45.9% | tensorboard |
| Bi-Real Net | 56.6% | tensorboard |
