Low rank solvers
Code for the paper "Efficient semidefinite bounds for multi-label discrete graphical models".

####################################
benchmark/ : folder with the random instances used in the paper.
####################################
code/      : folder that contains the code.

Usage :
    ./mixing INPUT_INSTANCE OPTIONS

OPTIONS :
    solver   : "1" for LR-LAS
               "2" for LR-BCD
    rank     : "-1" for running with rank r = ceil(sqrt(2n))
               "k"  for running with rank r = k
    rounding : "k" for computing the best integer solution value with k rounding schemes

Example :
    ./mixing ../benchmark/rd50-3-sparse-0.wcsp 1 -1 50

OUTPUT :
    [Upper bound value] [Lower bound value] [CPU time] [Best upper bound value after rounding schemes] [CPU time of the rounding schemes]

Eigen3 must be installed, and the path to Eigen3 must be updated in the makefile.
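A typical build-and-run session might look as follows. This is only a sketch: it assumes the makefile in code/ builds the "mixing" binary via a plain `make` once the Eigen3 include path has been set; the invocation itself mirrors the example above, using solver "2" (LR-BCD), the default rank r = ceil(sqrt(2n)), and 50 rounding schemes.

    cd code/
    # edit the makefile so it points to your local Eigen3 installation, then build
    make
    # run LR-BCD with the default rank and 50 rounding schemes on one benchmark instance
    ./mixing ../benchmark/rd50-3-sparse-0.wcsp 2 -1 50
    # output fields: [Upper bound] [Lower bound] [CPU time] [Best upper bound after rounding] [CPU time of rounding]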