
I was experimenting with different modelling methods, including KNN, decision trees, neural networks and SVM, and fitting my data to see which works best. To my surprise, the decision tree works best, with a training accuracy of 1.0 and a test accuracy of 0.5. The neural network, which I believed would always perform best no matter what, has a training accuracy of 0.92 and a test accuracy of 0.42, which is 8 percentage points below the decision tree classifier.

What are the circumstances/cases where neural networks could have lower accuracy than a modelling technique like a decision tree? I tried my neural network with different configurations:

  • 1 hidden layer, 1 neuron: train accuracy 34%, test accuracy 42%
  • 7 hidden layers, 5 neurons each: train accuracy 79%, test accuracy 42%
  • 1 hidden layer, 100 neurons: train accuracy 34%, test accuracy 35%

but in no case did the neural network beat the decision tree's test accuracy of 50%.
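A training accuracy of 1.0 against a test accuracy of 0.5 is a classic sign of overfitting, and a single train/test split can be misleading. A quick way to compare the models more robustly is cross-validation. A minimal sketch, assuming a scikit-learn setup and using a synthetic dataset in place of your own `X`, `y`:

```python
# Hypothetical comparison of the two classifiers via 5-fold cross-validation.
# make_classification stands in for the real data, which isn't shown in the question.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

for name, clf in [
    ("decision tree", DecisionTreeClassifier(random_state=0)),
    ("neural net", MLPClassifier(max_iter=2000, random_state=0)),
]:
    scores = cross_val_score(clf, X, y, cv=5)  # 5 held-out accuracy estimates
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```

The mean and spread of the five fold scores give a fairer picture than one split, especially on small datasets.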


1 Answer


In my experience, neural networks have many hyper-parameters (number of layers, neurons per layer, activation functions, optimizers, regularizers, etc.), and finding the best configuration for a given task is hard. In fact, in many cases it isn't even worth trying to find the optimal configuration, because other classifiers with their default hyper-parameters can outperform neural networks. Furthermore, NNs require caution, as they are prone to overfitting.
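Rather than trying configurations by hand as in the question, a small grid search makes the tuning systematic. A sketch, assuming scikit-learn and a synthetic stand-in dataset; the grid values are illustrative, not recommendations. Note the scaler: MLPs are sensitive to unscaled features, which by itself can produce accuracies like the 34% runs above.

```python
# Hypothetical grid search over an MLP's hidden-layer sizes and L2 penalty.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# Scale inputs inside the pipeline so each CV fold scales on its own training data.
pipe = make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0))
grid = {
    "mlpclassifier__hidden_layer_sizes": [(10,), (50,), (50, 50)],
    "mlpclassifier__alpha": [1e-4, 1e-2],  # L2 regularization strength
}
search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
print(search.best_params_, round(search.best_score_, 2))
```

Even this tiny grid covers more configurations than manual trial and error, and the cross-validated score guards against picking a setting that only looks good on one split.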

For most tasks where you deal with structured data, I've found tree-based algorithms (especially boosted ones) to outperform NNs.

Some NN architectures are state of the art on tasks with a lot of unstructured data (e.g. CNNs for image-related tasks).

Finally, I'd like to say that there are no absolutes (e.g. that SVMs will always outperform decision trees). There is a theorem along these lines: the No Free Lunch theorem.
