The influence of the number of hidden neurons on the performance of a feed-forward neural network

Description

In this experiment, two-layer feed-forward neural networks with different numbers of hidden neurons are trained on the Sonar, Mines vs. Rocks data set. For each network, the average classification error rate is determined by 10-fold cross-validation.

The main contribution of the experiment is that it shows how to change the value of a list-type parameter of an operator (here, the hidden layers parameter of the Neural Net operator) inside a loop using a macro.

To keep the execution time reasonable, only neural networks with the following numbers of hidden neurons are considered: 1, 2, 4, 8, 16.
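The experiment can be sketched outside RapidMiner as well. The snippet below is a minimal scikit-learn analogue, not the original workflow: a plain Python loop plays the role of the Loop Values operator and its macro, and synthetic data of the same shape as the Sonar set (208 examples, 60 attributes) stands in for the UCI file, which is assumed not to be available locally.

```python
# Sketch of the experiment in scikit-learn (not the original RapidMiner
# workflow). Synthetic data stands in for the Sonar, Mines vs. Rocks set.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# 208 examples with 60 numeric attributes, as in the Sonar data set
X, y = make_classification(n_samples=208, n_features=60, random_state=0)

errors = {}
for n_hidden in [1, 2, 4, 8, 16]:  # plays the role of the loop macro
    clf = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                        max_iter=2000, random_state=0)
    accuracy = cross_val_score(clf, X, y, cv=10).mean()
    errors[n_hidden] = 1.0 - accuracy  # average classification error rate

for n_hidden, err in sorted(errors.items()):
    print(f"{n_hidden:2d} hidden neurons: {err:.1%} error")
```

Because the data here is synthetic, the error rates will not match Figure 8.5; the point is only the parameter-sweep structure of the experiment.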

Input

Sonar, Mines vs. Rocks [UCI MLR]

Output

Figure 8.5. The average classification error rate obtained from 10-fold cross-validation against the number of hidden neurons.


Interpretation of the results

The figure shows that the best average classification error rate (14.5%) is achieved when the number of hidden neurons is 8.

Video

Workflow

ann_exp3.rmp

Keywords

feed-forward neural network
supervised learning
error rate
classification
cross-validation

Operators

Apply Model
Guess Types
Log
Log to Data
Loop Values
Neural Net
Performance (Classification)
Print to Console
Provide Macro as Log Value
Read CSV
X-Validation
Execute Script (R) [R Extension]