
Let $Q$ be a matrix whose eigenvalues are all less than 1, and let $Q_f$ be the matrix defined by $$Q_f(i,j)=f(i)\,Q(i,j).$$ I need to prove that if $f$ is sufficiently close to 1, then $Q_f$ also has only eigenvalues less than 1.
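
For what it's worth, here is a minimal numerical sketch (not a proof) of the claim. It assumes "eigenvalues less than 1" means modulus less than 1, i.e. spectral radius $\rho(Q)<1$, and treats $f$ as a vector of per-row factors, so that $Q_f = \operatorname{diag}(f)\,Q$; the matrix size, the random seed, and the perturbation sizes `eps` are illustrative choices, not part of the question.

```python
# Numerical sanity check (assumption: "less than 1" = modulus < 1, f a vector of
# per-row factors close to 1, so Q_f = diag(f) @ Q has entries f(i) * Q(i, j)).
import numpy as np

rng = np.random.default_rng(0)

n = 5
Q = rng.standard_normal((n, n))
Q /= 1.1 * np.max(np.abs(np.linalg.eigvals(Q)))      # rescale so that rho(Q) < 1

for eps in [0.5, 0.1, 0.01]:
    f = 1.0 + eps * rng.uniform(-1.0, 1.0, size=n)    # f close to 1 for small eps
    Qf = np.diag(f) @ Q                               # Q_f(i, j) = f(i) * Q(i, j)
    rho = np.max(np.abs(np.linalg.eigvals(Qf)))
    print(f"eps = {eps:5.2f}   spectral radius of Q_f = {rho:.4f}")
```

For small `eps` the printed spectral radius stays below 1, consistent with the fact that eigenvalues depend continuously on the matrix entries.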

  • Please don't clutter the site with copies of your question, i.e., https://math.stackexchange.com/q/2771643/307944 Your question will not appear at the top of the main list, but many experienced users scan today's questions by their favorite tags. – Carl Christian May 08 '18 at 19:16
  • What Carl Christian says. Care to explain why you didn't just edit the question? Now we have two versions of it floating around. I need to delete the other, because keeping both of them makes the site less organized, and the answers more difficult to find (for any future visitor). – Jyrki Lahtonen May 09 '18 at 05:23
  • You see, the first version DID NOT DISAPPEAR ANYWHERE. Sometimes you just need to wait for a day or two (or a week) for an answer. Most people capable of answering your question follow their favorite tags, and even though your question drifted away from the "front page" OF SOME WAY OF VIEWING THE SITE it will still be very visible IN THE VIEWS THAT ARE MOST LIKELY TO MATTER. – Jyrki Lahtonen May 09 '18 at 05:45
  • And, sorry about shouting - the site is currently a bit tense, leaving me a bit edgy. It's just that Occam's razor suggested to me that you reposted simply because, as a new user, you didn't really understand how the site functions. – Jyrki Lahtonen May 09 '18 at 05:48
  • You will find this question so closely related to your own that the answers will almost certainly solve your current problem. – Carl Christian May 11 '18 at 13:57

0 Answers