I have a question regarding limits.
Recently in a math class, my teacher stated that $\frac{\sin x}{x}$ tends to $1$ as $x \to 0$, and hence that $\lim_{x\to 0} \frac{x}{\sin x}$ is also $1$.
Why is that so? Shouldn't the answer be $0$ in this case?
Hint: Look up the squeeze theorem or L'Hospital's rule.
There are many ways to go about this. One which may help shed light on "why" the limit tends to $1$ is that for $x \approx 0$, $$\sin(x) \approx x.$$ You can see this from the Taylor series expansion of sine.
EDIT: Thanks to JLA for pointing this out.
You may also verify this by graphing the sine function for inputs close to 0 and observing that the graph looks linear with slope 1.
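If you want a quick numerical sanity check instead of a graph, here is a short sketch (using only Python's standard `math` module) that evaluates both $\frac{\sin x}{x}$ and $\frac{x}{\sin x}$ at inputs shrinking toward $0$; both ratios get closer and closer to $1$:

```python
import math

# Evaluate sin(x)/x and its reciprocal x/sin(x) for x shrinking toward 0.
# Both should approach 1, illustrating sin(x) ≈ x near 0.
for x in [0.1, 0.01, 0.001]:
    print(f"x = {x}: sin(x)/x = {math.sin(x) / x:.8f}, "
          f"x/sin(x) = {x / math.sin(x):.8f}")
```

Note that we never plug in $x = 0$ itself (that would divide by zero); the limit is about the behavior of the ratio near $0$, not its value at $0$.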
Given what you stated you know, namely that $\lim_{x \to 0} \frac{\sin(x)}{x} = 1$, I think what you are missing is the following property of limits.
If $f$ and $g$ are functions such that $\lim_{x \to a} f(x)$ and $\lim_{x \to a} g(x)$ exist, and moreover $\lim_{x \to a} g(x) \neq 0$ then $$\lim_{x \to a} \frac{f(x)}{g(x)} = \frac{ \lim_{x \to a} f(x)}{\lim_{x \to a} g(x)}.$$
In your case, $g(x) = \frac{\sin(x)}{x}, f(x) = 1$, and $a=0$ (as @Kf_Sansoo pointed out in their answer).
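Spelling out the computation with these choices of $f$ and $g$:

$$\lim_{x \to 0} \frac{x}{\sin(x)} = \lim_{x \to 0} \frac{1}{\frac{\sin(x)}{x}} = \frac{\displaystyle\lim_{x \to 0} 1}{\displaystyle\lim_{x \to 0} \frac{\sin(x)}{x}} = \frac{1}{1} = 1.$$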