Questions tagged [learnability]

11 questions
18
votes
4 answers

Can a neural network compute $y = x^2$?

In the spirit of the famous TensorFlow Fizz Buzz joke and the XOR problem, I started to wonder: is it possible to design a neural network that implements the function $y = x^2$? Given some representation of a number (e.g. as a vector in binary form, so that…
Boris Burkov
  • 337
  • 3
  • 9
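Not part of the question, but a minimal sketch of the experiment it suggests: fitting $y = x^2$ on a bounded interval with a small Keras MLP. The architecture and hyperparameters here are arbitrary placeholders; note that a ReLU network is piecewise linear, so it can only approximate $x^2$ on a bounded range, not represent it exactly.

```python
# Minimal sketch (placeholder architecture, not from the question):
# approximate y = x^2 on [-1, 1] with a small fully connected network.
import numpy as np
from tensorflow import keras

x = np.random.uniform(-1.0, 1.0, size=(10_000, 1))
y = x ** 2

model = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(32, activation="relu"),   # piecewise-linear pieces
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),                        # linear output for regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=50, batch_size=64, verbose=0)

print(model.predict(np.array([[0.5]]), verbose=0))  # should be close to 0.25
```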
3
votes
2 answers

Formal conditions on mappings that can NOT be learned from data

I am new to machine learning and would appreciate some help on the following question. I have observed that the literature focuses on algorithms, on how one learner does better than others on a given data set, and that remarkable progress has been…
2
votes
1 answer

Why does PAC learning focus on learnability of the hypothesis class and not the target function?

The definition of PAC learning is roughly: An algorithm is a PAC learning algorithm if, given enough data, for any target function, it asymptotically does as well as it possibly could given the functions it is capable of representing. This…
Jack M
  • 285
  • 1
  • 6
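For reference (not part of the question), the standard formal statement that the excerpt paraphrases, in the usual agnostic PAC notation with learner $A$, sample $S$, loss $L_{\mathcal{D}}$, and sample complexity $m_{\mathcal{H}}$:

```latex
% Agnostic PAC learnability, as usually stated: for every eps, delta in (0,1)
% and every distribution D, whenever m >= m_H(eps, delta),
\[
\Pr_{S \sim \mathcal{D}^m}\Big[\, L_{\mathcal{D}}\big(A(S)\big) \le \min_{h' \in \mathcal{H}} L_{\mathcal{D}}(h') + \epsilon \,\Big] \ge 1 - \delta .
\]
```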
2
votes
1 answer

PAC Learnability - Notation

The following is from the textbook Understanding Machine Learning: From Theory to Algorithms. Definition of PAC learnability: A hypothesis class $\mathcal H$ is PAC learnable if there exist a function $m_H : (0, 1)^2 \rightarrow \mathbb{N}$ and a learning…
tkj80
  • 139
  • 3
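A short unpacking of the notation (my paraphrase of the realizable definition, not a quote from the book): $m_H$ maps an accuracy $\epsilon$ and a confidence $\delta$ to a sample size, and the requirement is that with at least that many i.i.d. examples the learner succeeds:

```latex
% m_H : (0,1)^2 -> N takes (eps, delta) and returns how many examples suffice;
% with m >= m_H(eps, delta) examples labeled by some f consistent with H,
% the returned hypothesis h satisfies
\[
\Pr\big[\, L_{(\mathcal{D}, f)}(h) \le \epsilon \,\big] \ge 1 - \delta .
\]
```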
2
votes
1 answer

Meaning of Instance Space and Concept Class (PAC Learnable)

I'm studying probably approximately correct (PAC) learning, and I don't understand what an instance space and a concept are. I have seen that Wikipedia (https://en.wikipedia.org/wiki/Probably_approximately_correct_learning) provides various examples, but…
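For orientation (standard definitions, not taken from the question): the instance space $\mathcal{X}$ is the set of all possible examples, a concept is a boolean function on $\mathcal{X}$, and a concept class is a set of such functions:

```latex
% Instance space X: the set of all examples (e.g. all bit strings of length n).
% Concept c: a boolean labeling of the instance space.
% Concept class C: the set of concepts the learner is asked to handle.
\[
c : \mathcal{X} \to \{0, 1\}, \qquad \mathcal{C} \subseteq \{0, 1\}^{\mathcal{X}} .
\]
```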
2
votes
2 answers

Are there any algorithms that learn to learn a function mapping?

Typical algorithms involve learning and applying a single mapping, e.g. $f: X \mapsto Y$. Are there any algorithms that learn multiple mappings given an extra variable, e.g. $f(Z): X \mapsto Y$ (this feels like an abuse of notation)? Here we learn a…
Daniel
  • 256
  • 1
  • 8
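One common reading of the question (an assumption on my part, not stated in it) is a model conditioned on the extra variable $z$, i.e. learning a single function of the pair $(x, z)$. A minimal sketch with placeholder data:

```python
# Assumed interpretation (not from the question): learn f(z): X -> Y by
# treating z as an extra input and fitting one model g(x, z) on the pair.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(5000, 1))
z = rng.integers(0, 3, size=(5000, 1)).astype(float)   # the "extra variable"
# Each value of z selects a different mapping from x to y (placeholder choice).
y = np.where(z == 0, x, np.where(z == 1, x ** 2, np.sin(3 * x))).ravel()

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
model.fit(np.hstack([x, z]), y)          # g(x, z) plays the role of f(z)(x)
```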
1
vote
1 answer

How does a real brain actually learn?

In biology class we learned that neurons are connected, and that if two or more neurons interact with each other often, the connection gets stronger and stronger, and new connections may form. But if a connection is unused for a while,…
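The strengthening rule the excerpt describes is roughly Hebbian plasticity ("neurons that fire together wire together"). A crude computational caricature, offered as an analogy rather than a biological model:

```python
# Crude caricature of the Hebbian idea, with decay for unused connections.
# Not a biological model; sizes and rates are arbitrary.
import numpy as np

w = np.zeros((4, 4))           # connection strengths between 4 "neurons"
eta, decay = 0.1, 0.01

for _ in range(100):
    activity = (np.random.rand(4) > 0.5).astype(float)   # who fired this step
    w += eta * np.outer(activity, activity)              # co-active pairs strengthen
    w -= decay * w                                        # unused connections fade
```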
1
vote
0 answers

Uniform convergence guarantee on sample complexity

I can't understand why uniform convergence guarantees an upper bound, and not a lower bound, on sample complexity, as stated in [1], Corollary 4.4: If a class $H$ has the uniform convergence property with a function $m^{UC}_H$, then the class is…
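If I recall the book's statement correctly (worth checking the exact constants against the source), the corollary only asserts that uniform convergence is sufficient for agnostic PAC learning via ERM, which is why it yields an upper bound of the form:

```latex
% Uniform convergence implies agnostic PAC learnability by ERM, with a
% sample-complexity *upper* bound (roughly as in the cited corollary):
\[
m_{\mathcal{H}}(\epsilon, \delta) \;\le\; m^{UC}_{\mathcal{H}}\!\left(\tfrac{\epsilon}{2}, \delta\right).
\]
```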
1
vote
0 answers

Data is hard to learn as a whole, easier to learn after splitting logically

I have a 3D spaceship duel simulation (without the source code). I need to build a model that will learn the simulation's behavior. I can run the simulation as many times as I want, randomly feeding it inputs (velocities, ranges, etc.) and the…
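One hedged reading of "splitting logically" (an assumption about what the question means): fit separate models for logically separate parts of the simulation's output and compose them, rather than one monolithic model. All column names below are hypothetical placeholders.

```python
# Hedged sketch of "split logically, then learn": separate models per output
# group instead of one monolithic model. Inputs/outputs are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

inputs = np.random.rand(1000, 6)       # e.g. velocities, ranges (placeholder)
hit_outcome = np.random.rand(1000)     # hypothetical sub-behaviour 1
trajectory = np.random.rand(1000, 3)   # hypothetical sub-behaviour 2

model_hit = RandomForestRegressor().fit(inputs, hit_outcome)
model_traj = RandomForestRegressor().fit(inputs, trajectory)   # multi-output y is supported
```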
0
votes
1 answer

Neural Network cannot learn nonlinear function

I am currently creating a neural network to learn a function of the following form (see the plot of the data I want to learn): x corresponds to the x-axis and y to the y-axis (one independent and one dependent variable). I am using both Keras and TensorFlow, and with both…
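A common cause of this symptom is a purely linear model (no nonlinear hidden activations) or un-scaled inputs. A minimal Keras sketch of a setup that can fit a nonlinear 1-D function; the target function and hyperparameters are placeholders, since the question's data is not shown:

```python
# Minimal sketch (placeholder data, not the question's): 1-D regression with
# nonlinear hidden activations and inputs already scaled to [-1, 1].
import numpy as np
from tensorflow import keras

x = np.linspace(-1.0, 1.0, 2000).reshape(-1, 1)
y = np.sin(4 * x) + 0.1 * np.random.randn(*x.shape)   # some nonlinear target

model = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(64, activation="tanh"),   # the nonlinearity is essential
    keras.layers.Dense(64, activation="tanh"),
    keras.layers.Dense(1),                        # linear output for regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=200, batch_size=32, verbose=0)
```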
0
votes
1 answer

Learnability of a finite series

How could one approach the following question: given a series $\{a_n\} \subset \mathbb{R}^2$ where $n \in [N]$, I want to characterize its predictability or learnability. One definition could be how easy it is to predict the next element in…
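One concrete way to make the excerpt's informal notion precise (an assumption on my part, not from the question): measure the best achievable average next-element prediction error over a fixed predictor class $\mathcal{F}$:

```latex
% One possible formalization: predictability of {a_n} relative to a predictor
% class F as the best achievable average next-element squared error.
\[
\operatorname{pred}(\{a_n\}) \;=\; \min_{f \in \mathcal{F}} \frac{1}{N-1} \sum_{n=1}^{N-1} \big\lVert a_{n+1} - f(a_1, \dots, a_n) \big\rVert^2 .
\]
```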