5

I have created a dataset with a rather large number of features, for example 100,000. Is that too large for a decent computer to handle? (I have a 1080 Ti.)

Green Falcon
Mahmud Sabbir

1 Answer

6

It highly depends on your data. If the inputs are images, that many features is fairly normal, but if not, I recommend constructing the covariance (or correlation) matrix and checking whether features are correlated. If you see that many features are correlated, it is better to discard the redundant ones; you can also employ PCA for this kind of dimensionality reduction. Correlated features inflate the number of parameters in a neural network without adding much information.
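As a minimal sketch of the two approaches above, assuming a NumPy feature matrix of shape `(n_samples, n_features)` (the data here is a synthetic toy example, not from the question):

```python
import numpy as np

# Toy data: three features, where the second is almost a copy of the
# first (strongly correlated), so one of them is redundant.
rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
X = np.column_stack([x0,
                     x0 + 0.01 * rng.normal(size=200),  # near-duplicate
                     rng.normal(size=200)])             # independent

# 1) Correlation-based filtering: drop the later feature of any pair
#    whose absolute correlation exceeds a threshold.
corr = np.corrcoef(X, rowvar=False)
threshold = 0.95
n = corr.shape[0]
to_drop = {j for i in range(n) for j in range(i + 1, n)
           if abs(corr[i, j]) > threshold}
keep = [j for j in range(n) if j not in to_drop]
X_reduced = X[:, keep]

# 2) PCA via SVD on the centered data: keep the fewest components
#    that explain (here) 99% of the variance.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (S ** 2) / (S ** 2).sum()
k = int(np.searchsorted(np.cumsum(explained), 0.99) + 1)
X_pca = Xc @ Vt[:k].T
```

With 100,000 features the full covariance matrix is large (100,000 × 100,000), so in practice you would use randomized/truncated SVD or compute correlations in blocks, but the idea is the same.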

Also, if your inputs are images, you can reduce the number of input features simply by resizing them. In popular networks the height and width of input images are usually less than three hundred, which keeps the number of input features around 90,000 per channel (300 × 300 = 90,000). If you are using convolutional nets, you can also employ max-pooling after some convolution layers to reduce the number of parameters. Refer here, which may be helpful.
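To make the feature-count arithmetic concrete, here is a small NumPy sketch (the 300 × 300 image is a made-up example) showing how a single 2 × 2 max-pool quarters the number of spatial features:

```python
import numpy as np

# A hypothetical 300 x 300 grayscale "image": 90,000 input features.
image = np.arange(300 * 300, dtype=float).reshape(300, 300)

def max_pool2d(x, k=2):
    """Non-overlapping k x k max-pooling (stride = k) on a 2-D array."""
    h, w = x.shape
    # Crop to a multiple of k, then take the max over each k x k tile.
    return (x[: h // k * k, : w // k * k]
            .reshape(h // k, k, w // k, k)
            .max(axis=(1, 3)))

pooled = max_pool2d(image, k=2)   # 150 x 150: a quarter of the features

# Naive resizing by striding has the same effect on the feature count
# (a real pipeline would use proper interpolation, e.g. PIL or OpenCV).
resized = image[::2, ::2]         # also 150 x 150
```

Each 2 × 2 pooling layer divides the spatial feature count by 4, so a couple of pooling stages shrink the representation dramatically before the dense layers.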

Green Falcon