In which neural net architecture, does weight sharing occur?

a. recurrent neural network
b. convolutional neural network
c. fully connected neural network
d. both a and b

The correct answer is D: both a and b.

In a convolutional neural network (CNN), the same filter (a small set of weights, or kernel) is applied at every spatial position of the input. Because one kernel is reused across all positions rather than each position having its own weights, the number of parameters in the network is greatly reduced, and the layer detects the same feature wherever it appears in the input.
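A minimal sketch of this idea in NumPy (the function name `conv1d` and the example values are illustrative): one 3-tap kernel is the only set of weights, and it is reused at every position of the input.

```python
import numpy as np

def conv1d(x, kernel):
    """Apply the SAME kernel at every position of x (valid padding).

    This is weight sharing: one set of len(kernel) weights covers
    the whole input, instead of one weight per (input, output) pair.
    """
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel)
                     for i in range(len(x) - k + 1)])

x = np.arange(6.0)                   # input signal: [0, 1, 2, 3, 4, 5]
kernel = np.array([1.0, 0.0, -1.0])  # one shared set of 3 weights

y = conv1d(x, kernel)                # 4 outputs, all from the same 3 weights
```

A fully connected layer mapping the same 6 inputs to 4 outputs would need 6 × 4 = 24 weights; the shared kernel needs only 3.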

A recurrent neural network (RNN) is a type of neural network that is used to process sequential data. In an RNN, the same weight matrices are reused at every time step: as the network steps through the sequence, one shared set of weights transforms each input and the hidden state. This is also weight sharing, just across time rather than space, and it is what lets the network handle sequences of arbitrary length and learn dependencies across time.
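A minimal sketch of weight sharing across time (the function name `rnn_forward` and the shapes are illustrative): `W_x` and `W_h` appear once, outside the loop, and are reused at every time step.

```python
import numpy as np

def rnn_forward(xs, W_x, W_h, h0):
    """Run a simple RNN over a sequence.

    The SAME W_x and W_h are applied at every time step --
    weight sharing across time, so the parameter count does
    not grow with sequence length.
    """
    h = h0
    hs = []
    for x in xs:                      # one step per element of the sequence
        h = np.tanh(W_x @ x + W_h @ h)
        hs.append(h)
    return np.array(hs)

rng = np.random.default_rng(0)
xs = rng.normal(size=(5, 3))          # 5 time steps, 3-dim inputs
W_x = rng.normal(size=(2, 3))         # shared input-to-hidden weights
W_h = rng.normal(size=(2, 2))         # shared hidden-to-hidden weights
hs = rnn_forward(xs, W_x, W_h, np.zeros(2))
```

Whether the sequence has 5 steps or 5,000, the parameters are just `W_x` and `W_h`.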

A fully connected neural network (FNN) connects each neuron in one layer to every neuron in the next layer, and every connection has its own independent weight. There is no weight sharing, which is why fully connected layers typically have far more parameters than convolutional or recurrent layers of similar size. FNNs are often used for tasks such as classification and regression.
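To make the parameter-count contrast concrete, here is a quick comparison (the layer sizes are illustrative, roughly an MNIST-style input):

```python
# Fully connected layer: every input connects to every output,
# so weights = n_in * n_out, plus one bias per output.
n_in, n_out = 784, 128
dense_params = n_in * n_out + n_out          # 100,480 parameters

# Convolutional layer: 32 filters of size 3x3 on 1 input channel.
# Each filter's 9 weights are SHARED across all spatial positions.
conv_params = 32 * (3 * 3 * 1) + 32          # 320 parameters
```

The dense layer needs over 300× more parameters, and the gap widens as the input grows, because the shared filters' size does not depend on the input's spatial dimensions.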

In summary, weight sharing occurs in both convolutional neural networks (the same filter is applied across spatial positions) and recurrent neural networks (the same weights are applied across time steps), so the answer is "both a and b". In both cases, reusing one set of weights reduces the number of parameters in the network.