
I was taught stepwise feature selection (forward and backward selection) in college, and at the time it seemed like a really effective way to pick features. But recently I have been reading more and realized that it's actually considered pretty outdated and not ideal for real-world projects.
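For concreteness, here is a minimal sketch of the method in question: forward stepwise selection greedily adds one feature at a time, keeping whichever addition most improves a cross-validated score. The sketch uses scikit-learn's `SequentialFeatureSelector` on synthetic data (the data, model choice, and parameter values are illustrative assumptions, not from the original post).

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector

# Synthetic data: only the first 2 of 6 features carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Greedy forward selection: add one feature at a time,
# keeping the addition with the best 5-fold CV score.
sfs = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=2, direction="forward", cv=5
)
sfs.fit(X, y)
mask = sfs.get_support()  # boolean mask: features 0 and 1 selected here
print(mask)
```

The greediness is exactly the criticism: each step conditions on earlier choices, so correlated features and multiple-testing effects can lead it astray, which is why regularization-based approaches (e.g. L1 penalties) are often preferred today.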

Now I’m wondering, are there other methods or practices like this? Things that are still taught or commonly used, but aren’t really effective or recommended in modern data science? I’d really appreciate any insights or examples so I can learn the right way early on.

Thanks

Guna

1 Answer


Transformers have been one of the most effective approaches for word prediction. Their attention mechanism processes all positions in a sequence in parallel, which makes them more efficient to train than traditional RNNs and seq2seq models.

Diffusion models such as Stable Diffusion have shown great progress in image generation compared with earlier techniques. This is a hot topic and quite an active area of research.

Latent spaces have been a central topic of discussion since the introduction of VAE and GAN models.
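As a small illustration of how a VAE uses its latent space: the encoder maps an input to the parameters of a Gaussian (a mean and a log-variance) rather than to a single point, and a latent vector is drawn via the reparameterization trick so sampling stays differentiable. The specific `mu` and `log_var` values below are hypothetical stand-ins for an encoder's output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder output for one input: parameters of a
# 2-dimensional Gaussian in latent space.
mu = np.array([0.5, -1.0])
log_var = np.array([-0.2, 0.1])

# Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I).
# Randomness lives in eps, so z is differentiable w.r.t. mu and log_var.
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps
print(z)  # a sampled point in the latent space
```

The decoder would then map `z` back to data space; interpolating between such latent vectors is what gives latent spaces their appeal for generation.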

Aviral Verma