
Conversation


@nameEqualsJared nameEqualsJared commented Mar 22, 2019

Was this meant to say what I've proposed? Because as stated, I am not sure the proposition is correct. As a counterexample, let A = [0 1; 1 0], and consider the column vectors a = [1; 1] and b = [-1; -1]. For the eigenvalue y = 1, both a and b are eigenvectors (since Av = 1v holds for both), correct? So one eigenvalue may have many associated eigenvectors, but one eigenvector (I believe) always has a unique eigenvalue.

Edit: corrected the b vector
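The counterexample can be checked numerically; here is a minimal sketch using NumPy (my own addition, not part of the original comment):

```python
import numpy as np

# The matrix from the comment: A = [0 1; 1 0].
A = np.array([[0, 1],
              [1, 0]])

# Two candidate eigenvectors for the eigenvalue y = 1.
a = np.array([1, 1])
b = np.array([-1, -1])

# Both satisfy A v = 1 * v, so both are eigenvectors for eigenvalue 1.
print(np.allclose(A @ a, 1 * a))  # True
print(np.allclose(A @ b, 1 * b))  # True
```

This illustrates the point: any nonzero scalar multiple of an eigenvector is again an eigenvector for the same eigenvalue, so the eigenvalue-to-eigenvector map is one-to-many, while a given eigenvector determines its eigenvalue uniquely.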
