Questions tagged [rbm]

A restricted Boltzmann machine (RBM) is a stochastic neural network. RBMs can be stacked to form deep networks.


30 questions
17 votes · 3 answers

Intuition Behind Restricted Boltzmann Machine (RBM)

I went through Geoff Hinton's Neural Networks course on Coursera and also through an introduction to restricted Boltzmann machines, but I still don't understand the intuition behind RBMs. Why do we need to compute energy in this machine? And what is the…
Born2Code • 347
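For reference, the energy the question asks about has a simple closed form. A minimal NumPy sketch (shapes, names, and values here are illustrative, not taken from the question):

```python
import numpy as np

def rbm_energy(v, h, W, b, c):
    """Energy E(v, h) = -b.v - c.h - v.W.h of a joint configuration.

    Lower energy means the RBM assigns the configuration higher
    probability, since p(v, h) is proportional to exp(-E(v, h)).
    """
    return -b @ v - c @ h - v @ W @ h

# Tiny example: 3 visible units, 2 hidden units
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
b = np.zeros(3)   # visible biases
c = np.zeros(2)   # hidden biases
v = np.array([1.0, 0.0, 1.0])
h = np.array([0.0, 1.0])
E = rbm_energy(v, h, W, b, c)
# With zero biases, only the weights on active (v_i=1, h_j=1) pairs
# contribute, so E = -(W[0, 1] + W[2, 1]) here.
```

The point of the energy is that it turns learning into "lower the energy of configurations seen in the data, raise it elsewhere".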
15 votes · 3 answers

How to use RBM for classification?

At the moment I'm playing with Restricted Boltzmann Machines, and since I'm at it I would like to try to classify handwritten digits with them. The model I created is now a quite fancy generative model, but I don't know how to go further with it. In this…
Stefan Falk • 263
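The usual way forward is to treat the trained RBM's hidden activation probabilities as learned features and feed them to an ordinary classifier. A hedged NumPy sketch of the feature-extraction step (the weights here are random placeholders standing in for a trained model):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rbm_features(v, W, c):
    """Hidden-unit activation probabilities p(h_j = 1 | v) = sigmoid(c + v.W).

    After unsupervised RBM training, this vector serves as a learned
    representation of v; a logistic regression or SVM can then be trained
    on (features, digit label) pairs to do the actual classification.
    """
    return sigmoid(c + v @ W)

# Illustrative MNIST-like shapes: 784 visible units, 128 hidden units
rng = np.random.default_rng(1)
W = rng.normal(scale=0.01, size=(784, 128))  # would come from RBM training
c = np.zeros(128)
v = rng.integers(0, 2, size=784).astype(float)
features = rbm_features(v, W, c)  # 128-dim feature vector, entries in (0, 1)
```

Alternatives discussed in the answers include training one RBM per class and comparing free energies, or fine-tuning the whole stack discriminatively.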
8 votes · 1 answer

Why does a restricted Boltzmann machine (RBM) tend to learn very similar weights?

These are 4 different weight matrices that I got after training a restricted Boltzmann machine (RBM) with ~4k visible units and only 96 hidden units/weight vectors. As you can see, the weights are extremely similar - even black pixels on the face are…
ffriend • 2,831
7 votes · 2 answers

How are non-restricted Boltzmann machines trained?

Restricted Boltzmann machines are stochastic neural networks. The neurons form a complete bipartite graph of visible units and hidden units. The "restricted" refers exactly to the bipartite property: there may not be a connection between any two visible…
Martin Thoma • 19,540
5 votes · 2 answers

What are the advantages of contrastive divergence vs the gradient of the quadratic difference between the original data and the reconstructed data?

In this example I have an RBM with a visible layer and a hidden layer. The original data is "data", the values of the hidden neurons are "hid", the values of the visible neurons calculated from "hid" are "vis", and then the value for the hidden neurons…
user • 2,023
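For concreteness, here is what one step of CD-1 looks like for a Bernoulli-Bernoulli RBM, in NumPy. This is a hedged sketch (variable names and the learning rate are illustrative); the key point is that the update uses the difference of data-phase and reconstruction-phase correlations, not the gradient of a squared reconstruction error:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1, rng=None):
    """One CD-1 parameter update for a single visible vector v0.

    Contrastive divergence approximates the log-likelihood gradient by
    <v h>_data - <v h>_reconstruction, obtained from one Gibbs step.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Positive phase: hidden probabilities and a binary hidden sample
    ph0 = sigmoid(c + v0 @ W)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: reconstruct visibles, recompute hidden probabilities
    pv1 = sigmoid(b + h0 @ W.T)
    ph1 = sigmoid(c + pv1 @ W)
    # Update from the difference of the two correlation terms
    dW = np.outer(v0, ph0) - np.outer(pv1, ph1)
    return W + lr * dW, b + lr * (v0 - pv1), c + lr * (ph0 - ph1)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(6, 4))
b, c = np.zeros(6), np.zeros(4)
v0 = rng.integers(0, 2, size=6).astype(float)
W2, b2, c2 = cd1_update(v0, W, b, c, rng=rng)
```

Minimizing the quadratic reconstruction error instead would optimize a different objective than the RBM's likelihood, which is why CD is preferred in the standard training recipe.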
5 votes · 3 answers

Training Restricted Boltzmann Machines (RBMs) using gradient descent

Hey, I am a little new to the whole RBM entropy/energy training thing; I'm having some trouble understanding it and trying to figure out whether it is worth the effort needed to understand. Can't RBMs quite easily be trained with standard gradient descent…
user18886 • 51
4 votes · 2 answers

How do I train an RBM on color images?

I am having a hard time understanding the strategy for inputting color. Most tutorials on RBMs only train on grayscale images. If the image is grayscale, the input units can be binary, and I can normalize the grayscale value to [0,1], and then…
smörkex • 141
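A common answer is to switch to Gaussian (real-valued) visible units and feed in standardized RGB channel values instead of binary pixels. A minimal preprocessing sketch under that assumption (shapes are illustrative):

```python
import numpy as np

def prepare_color_batch(images):
    """Flatten H x W x 3 uint8 images into real-valued visible vectors.

    For color data, a common choice is Gaussian visible units: each
    pixel channel is scaled to [0, 1] and then standardized to zero mean
    and unit variance over the training set, rather than thresholded to
    binary values as in the grayscale tutorials.
    """
    x = images.reshape(len(images), -1).astype(float) / 255.0
    mean = x.mean(axis=0)
    std = x.std(axis=0) + 1e-8   # avoid division by zero on constant pixels
    return (x - mean) / std

batch = np.random.default_rng(2).integers(0, 256, size=(16, 8, 8, 3),
                                          dtype=np.uint8)
visible = prepare_color_batch(batch)   # shape (16, 8*8*3) = (16, 192)
```

With Gaussian visible units, the RBM's visible reconstruction becomes a linear function of the hidden states rather than a sigmoid, which is the other half of the change.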
4 votes · 1 answer

How is dimensionality reduction achieved in Deep Belief Networks with Restricted Boltzmann Machines?

In neural networks and old classification methods, we usually construct an objective function to achieve dimensionality reduction. But Deep Belief Networks (DBN) with Restricted Boltzmann Machines (RBM) learn the data structure through unsupervised…
Vespa • 43
3 votes · 1 answer

How can I know the name of the features selected by a Deep Belief Network?

I want to use a DBN to reduce the 41 features of the NSL-KDD dataset. After transforming nominal data to numeric, the number of features increases from 41 to 121. I used 3 RBMs (121-50-10); now I want to know the 10 selected features, i.e. know their names, to…
fattoun • 159
3 votes · 1 answer

Training the parameters of a Restricted Boltzmann machine

Why are the parameters of a Restricted Boltzmann machine trained for a fixed number of iterations (epochs) in many papers instead of choosing the ones corresponding to a stationary point of the likelihood? Denote the observable data by $x$, hidden…
fabian • 205
2 votes · 0 answers

Contrastive divergence in RBM

I have the following code, where x is the input data, w is the weight matrix, and bv and bh are the biases for the visible and hidden units. import theano.tensor as T x_states = x > numpy.random.rand(training_examples, feats) hid = …
user • 2,023
2 votes · 1 answer

Build Deep Belief Autoencoder for Dimensionality Reduction

I'm working with a large dataset (about 50K observations x 11K features) and I'd like to reduce the dimensionality. This will eventually be used for multi-class classification, so I'd like to extract features that are useful for separating the data.…
CopyOfA • 167
2 votes · 1 answer

Isn't computing the "tractable error" in Restricted Boltzmann Machines (RBM) intractable?

Let $v \in \{0,1\}^M$ be the visible layer, $h \in \{0,1\}^N$ be the hidden layer, where $M$ and $N$ are natural numbers. Given the biases $b \in \Re^M$, $c \in \Re^N$ and weights $W \in \Re^{M \times N}$, the energy and probability of an RBM are…
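For reference, the standard definitions the excerpt appeals to, in the question's notation, are:

```latex
E(v, h) = -b^\top v - c^\top h - v^\top W h, \qquad
p(v, h) = \frac{e^{-E(v, h)}}{Z}, \qquad
Z = \sum_{v, h} e^{-E(v, h)} .
```

The partition function $Z$ sums over all $2^{M+N}$ configurations and is intractable. What *is* tractable is the free energy $F(v) = -b^\top v - \sum_{j=1}^{N} \log\left(1 + e^{c_j + v^\top W_{:,j}}\right)$, because the sum over hidden states factorizes per unit; the question is essentially about which of these quantities the "tractable error" refers to.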
2 votes · 1 answer

Why does training a Restricted Boltzmann Machine correspond to a good reconstruction of training data?

Many tutorials suggest that after training an RBM, one can get a good reconstruction of the training data, just like an autoencoder. An example tutorial. But the training process of an RBM is essentially to maximize the likelihood of the training data. We…
LtChang • 23
2 votes · 1 answer

How is the hidden layer made binary in a Restricted Boltzmann Machine (RBM)?

In an RBM, in the positive phase for updating the hidden layer (which should also be binary) [actually, consider a node h1 ∈ H, the hidden-layer vector], to make h1 binary we compute the probability of turning on a hidden unit by operating…
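The step the excerpt describes is a per-unit Bernoulli draw: each hidden unit is turned on with probability sigmoid(c_j + v·W_{:,j}). A minimal NumPy sketch (weights here are random placeholders):

```python
import numpy as np

def sample_hidden(v, W, c, rng):
    """Sample a binary hidden vector given a visible vector v.

    Each hidden unit h_j is set to 1 with probability
    sigmoid(c_j + sum_i v_i W_ij): compute the activation probability,
    draw a uniform random number, and threshold. This Bernoulli draw is
    how the hidden layer is "made binary" in the positive phase.
    """
    p_h = 1.0 / (1.0 + np.exp(-(c + v @ W)))
    return (rng.random(p_h.shape) < p_h).astype(float)

rng = np.random.default_rng(3)
W = rng.normal(size=(6, 4))
c = np.zeros(4)
v = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
h = sample_hidden(v, W, c, rng)   # entries are exactly 0.0 or 1.0
```

During learning, Hinton's practical recipe samples binary states in the positive phase but often uses the real-valued probabilities for the final reconstruction statistics.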