Bayesian Neural Networks in PyTorch

Overview

We present a new scheme for computing Monte Carlo estimators in Bayesian variational inference (VI) settings with almost no GPU memory cost, regardless of the number of samples. Our method is described in the paper (UAI 2021): "Graph Reparameterizations for Enabling 1000+ Monte Carlo Iterations in Bayesian Deep Neural Networks".

In addition, we provide an implementation framework to make your deterministic network Bayesian in PyTorch.

If you like our work, please give the repository a star. If you use our code in your research, please cite the paper above.

Bayesify your Neural Network

There are three main files that help you Bayesify your deterministic network:

  1. bayes_layers.py - contains Bayesian implementations of convolution (1d, 2d, 3d, transpose) and linear layers, with the approximate posterior drawn from the location-scale family, i.e. defined by the two parameters mu and sigma. The definitions are general and independent of the specific distribution, as long as the distribution has the two parameters mu and sigma. Each layer uses the forward method defined in the vi_posteriors.py file. One of the main arguments of the redefined classes is approx_post, which selects the posterior class from vi_posteriors.py. Specify this name exactly as the class is defined in vi_posteriors.py: for example, if vi_posteriors.py contains a class Gaus, then approx_post='Gaus' (see the sketch below).
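
As a quick illustration, constructing a single Bayesian layer might look as follows. The approx_post argument and the (output, kl) return convention are documented above; the exact set of remaining keyword arguments is an assumption.

import torch
import bayes_layers as bl

# approx_post names a class in vi_posteriors.py; Bayesian layers return
# a tuple (output, kl) rather than a single tensor.
layer = bl.Linear(128, 10, approx_post='Gaus')
x = torch.randn(32, 128)
out, kl = layer(x)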

  2. vi_posteriors.py - describes the forward method, including the KL term, for different approximate posterior distributions. The current implementation contains the following distributions:

  • Radial
  • Gaus

If you would like to implement your own class of distributions, copy one of the classes defined in vi_posteriors.py and redefine the following functions: forward(obj, x, fun=""), get_kl(obj, n_mc_iter, device).
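
A skeleton for such a class might look like the following. Only the two method signatures come from vi_posteriors.py; the bodies, the mu/sigma attribute names on the layer object, and the closed-form KL against a standard normal prior are illustrative assumptions.

import torch

class MyDist:
    # Hypothetical posterior class; only the signatures below are taken
    # from vi_posteriors.py, the bodies are illustrative assumptions.

    @staticmethod
    def forward(obj, x, fun=""):
        # Sample weights via the reparameterization trick and apply the
        # layer operation indicated by `fun`, returning (output, kl).
        raise NotImplementedError

    @staticmethod
    def get_kl(obj, n_mc_iter, device):
        # Example: closed-form KL(N(mu, sigma^2) || N(0, 1)), assuming the
        # layer object exposes tensors `mu` and `sigma`.
        kl = -torch.log(obj.sigma) + 0.5 * (obj.sigma**2 + obj.mu**2 - 1.0)
        return kl.sum().to(device)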

It also contains a useful Utils class, which provides:

  • definitions of loss functions:
    • get_loss_categorical
    • get_loss_normal
  • different beta coefficients for the KL term, via get_beta (a sketch of a typical schedule follows this list), and
  • a switch to turn computation of the KL term on/off, via the function set_compute_kl. This is useful during testing/evaluation, when the KL term does not need to be computed, and it accelerates those computations.
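
For context, beta coefficients such as those produced by get_beta are typically used for KL warm-up, scaling the KL term from 0 to 1 over training. The following is a minimal sketch of such a schedule, not the repo's actual implementation:

def linear_beta(batch_idx, n_batches, epoch, n_epochs):
    # Anneal beta linearly from 0 to 1 over the course of training, so
    # the KL term is phased in gradually (a common KL warm-up schedule).
    t = (epoch * n_batches + batch_idx) / float(n_epochs * n_batches)
    return min(1.0, t)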

Below is an example of how to Bayesify your own network. Note the forward method, which handles layers that are not of a Bayesian type and therefore do not return a KL term, e.g. ReLU(x).

import torch
import torch.nn as nn
import bayes_layers as bl  # Bayesian layer definitions

class YourBayesNet(nn.Module):
    def __init__(self, num_classes, in_channels,
                 **bayes_args):
        super(YourBayesNet, self).__init__()
        self.conv1 = bl.Conv2d(in_channels, 64,
                               kernel_size=11, stride=4,
                               padding=5,
                               **bayes_args)
        # 1*1*128 must match the flattened conv output for your input size
        self.classifier = bl.Linear(1*1*128,
                                    num_classes,
                                    **bayes_args)
        # The classifier is applied separately after flattening, so it is
        # not included in this list.
        self.layers = [self.conv1, nn.ReLU()]

    def forward(self, x):
        kl = 0
        for layer in self.layers:
            tmp = layer(x)
            if isinstance(tmp, tuple):
                # Bayesian layers return (output, kl)
                x, kl_ = tmp
                kl += kl_
            else:
                # Deterministic layers, e.g. ReLU, return only the output
                x = tmp

        x = x.view(x.size(0), -1)
        logits, _kl = self.classifier(x)
        kl += _kl

        return logits, kl

Then, in the main file during training, you can either use one of the loss functions defined in Utils, as follows:

import vi_posteriors as vi

output, kl = model(inputs)
kl = kl.mean()  # if several GPUs are used to split the minibatch

loss, _ = vi.Utils.get_loss_categorical(kl, output, targets, beta=beta)
#loss, _ = vi.Utils.get_loss_normal(kl, output, targets, beta=beta)
loss.backward()

or design your own, e.g.

loss = kl_coef*kl - loglikelihood
loss.backward()
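
For classification, one concrete way to write such a custom loss is with cross-entropy as the negative log-likelihood. Here kl_coef is your own weighting (for instance, beta divided by the dataset size), and this sketch is independent of the repo's Utils; output, targets, and kl are the variables from the snippets above.

import torch.nn.functional as F

# Cross-entropy summed over the batch is the negative log-likelihood
# for classification, so loglikelihood = -cross_entropy.
loglikelihood = -F.cross_entropy(output, targets, reduction='sum')
loss = kl_coef * kl - loglikelihood
loss.backward()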

  3. uncertainty_estimate.py - describes a set of functions to perform uncertainty estimation, e.g.
  • get_prediction_class - returns the most common class across MC iterations
  • summary_class - creates a summary file with statistics
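
The idea behind get_prediction_class can be sketched as follows; this illustrates majority voting over stochastic forward passes and is not the exact API of uncertainty_estimate.py.

import torch

def mc_predict(model, x, n_iter=100):
    # Run several stochastic forward passes through the Bayesian model
    # and return the most common predicted class per input.
    preds = []
    with torch.no_grad():
        for _ in range(n_iter):
            logits, _ = model(x)  # Bayesian model returns (logits, kl)
            preds.append(logits.argmax(dim=1))
    preds = torch.stack(preds)        # shape: (n_iter, batch)
    return preds.mode(dim=0).values   # majority vote per input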

Currently implemented networks for different problems

Classification

The script bayesian_dnn_class/main.py is the main executable, and all standard DNN models are located in bayesian_dnn_class/models:

  • AlexNet
  • Fully Connected
  • DenseNet
  • ResNet
  • VGG