Linear Algebra: A Modern Introduction, 4th Edition, by David Poole - Solutions
In Exercises 47-50, draw the Gerschgorin disks for the given matrix. 1. 2. 3. 4.
In Exercises 1-3, a matrix A is given along with an iterate xk produced using the power method, as in Example 4.31. (a) Approximate the dominant eigenvalue and eigenvector by computing the corresponding mk and yk. (b) Verify that you have approximated an eigenvalue and an eigenvector of A.
A square matrix is strictly diagonally dominant if the absolute value of each diagonal entry is greater than the sum of the absolute values of the remaining entries in that row. Use Gerschgorin's Disk Theorem to prove that a strictly diagonally dominant matrix must be invertible.
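A minimal sketch of the intended argument (using Ri as an assumed name for the i-th row radius): strict diagonal dominance says
\[ |a_{ii}| > R_i = \sum_{j \neq i} |a_{ij}| \quad \text{for every } i, \]
so no Gerschgorin disk \( D_i = \{ z : |z - a_{ii}| \le R_i \} \) contains 0. Since Gerschgorin's Disk Theorem places every eigenvalue in some disk, 0 is not an eigenvalue and A is invertible.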
If A is an n × n matrix, let ||A|| denote the maximum of the sums of the absolute values of the rows of A; that is,
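The displayed formula is not shown above; written out from the verbal definition, the quantity described should be
\[ \|A\| = \max_{1 \le i \le n} \sum_{j=1}^{n} |a_{ij}|. \]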
Let λ be an eigenvalue of a stochastic matrix A (see Section 3.7). Prove that |λ| ≤ 1.
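A hedged sketch, assuming the column-stochastic convention of Section 3.7: A and Aᵀ have the same eigenvalues, and the rows of Aᵀ are nonnegative and sum to 1. If Aᵀx = λx and |xi| is the largest entry of x in absolute value, then
\[ |\lambda|\,|x_i| = \Big|\sum_j (A^T)_{ij}\, x_j\Big| \le \sum_j (A^T)_{ij}\, |x_j| \le |x_i| \sum_j (A^T)_{ij} = |x_i|, \]
so |λ| ≤ 1.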
Prove that the eigenvalues of the given matrix are all real, and locate each of these eigenvalues within a closed interval on the real line.
In Exercises 1-3, use the power method to approximate the dominant eigenvalue and eigenvector of A. Use the given initial vector x0, the specified number of iterations k, and three-decimal-place accuracy. 1. 2. 3.
Which of the stochastic matrices in Exercises 1-4 are regular? 1. 2. 3. 4.
Prove that the steady state probability vector of a regular Markov chain is unique.
In Exercises 11-14, calculate the positive eigenvalue and a corresponding positive eigenvector of the Leslie matrix L. 1. 2. 3. 4.
If a Leslie matrix has a unique positive eigenvalue λ1, what is the significance for the population if λ1 > 1? λ1 < 1? λ1 = 1?
Verify that the characteristic polynomial of the Leslie matrix L in Equation (3) is
\[ c_L(\lambda) = (-1)^n\bigl(\lambda^n - b_1\lambda^{n-1} - b_2 s_1 \lambda^{n-2} - b_3 s_1 s_2 \lambda^{n-3} - \cdots - b_n s_1 s_2 \cdots s_{n-1}\bigr). \]
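For reference, the Leslie matrix assumed here is the standard one, with birth rates bi across the first row and survival rates si on the subdiagonal:
\[ L = \begin{bmatrix} b_1 & b_2 & b_3 & \cdots & b_{n-1} & b_n \\ s_1 & 0 & 0 & \cdots & 0 & 0 \\ 0 & s_2 & 0 & \cdots & 0 & 0 \\ \vdots & & \ddots & & & \vdots \\ 0 & 0 & 0 & \cdots & s_{n-1} & 0 \end{bmatrix}. \]
Expanding det(L − λI) along the first row (or arguing by induction on n) produces the polynomial above.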
If all of the survival rates si are nonzero, let
Verify that an eigenvector of L corresponding to λ1 is
In Exercises 1-3, compute the steady state growth rate of the population with the Leslie matrix L from the given exercise. Then use Exercise 18 to help find the corresponding distribution of the age classes. 1. Exercise 39 in Section 3.7 2. Exercise 40 in Section 3.7 3. Exercise 44 in Section 3.7
If λ1 is the unique positive eigenvalue of a Leslie matrix L and h is the sustainable harvest ratio, prove that h = 1 - 1/λ1.
In Exercises 1-2, find the Perron root and the corresponding Perron eigenvector of A. 1. 2. 3.
It can be shown that a nonnegative n × n matrix A is irreducible if and only if (I + A)ⁿ⁻¹ > 0. In Exercises 1-3, use this criterion to determine whether the matrix A is irreducible. If A is reducible, find a permutation of its rows and columns that puts A into the block form 1. 2. 3.
(a) Show that for any choice of initial conditions x0 = r and x1 = s, the scalars c1 and c2 can be found, as stated in Theorem 4.38(a) and (b). (b) If the eigenvalues λ1 and λ2 are distinct and the initial conditions are x0 = 0, x1 = 1, show that
In Exercises 1-3, P is the transition matrix of a regular Markov chain. Find the long-range transition matrix L of P. 1. 2. 3.
If A is a 3 × 3 diagonalizable matrix with eigenvalues - 2, 3, and 4, find det A.
If A is a 2 × 2 matrix with eigenvalues λ1 = ½, λ2 = −1, and corresponding eigenvectors, find
If A is a diagonalizable matrix and all of its eigenvalues satisfy |λ| < 1, prove that Aⁿ approaches the zero matrix as n gets large.
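A brief sketch of the usual argument: write A = PDP⁻¹ with D diagonal. Then
\[ A^n = P D^n P^{-1} = P\, \mathrm{diag}\!\left(\lambda_1^n, \ldots, \lambda_k^n\right) P^{-1}, \]
and each λiⁿ → 0 because |λi| < 1, so Aⁿ approaches the zero matrix.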
In Questions 1-3, determine, with reasons, whether A is similar to B. If A ~ B, give an invertible matrix P such that P⁻¹AP = B. 1. 2. 3.
If A is similar to B with P⁻¹AP = B and x is an eigenvector of A, show that P⁻¹x is an eigenvector of B.
Let A and B be 4 × 4 matrices with det A = 2 and det B = −¼. Find det C for the indicated matrix C: (a) C = (AB)⁻¹ (b) C = A²B(3Aᵀ)
If A is a skew-symmetric n × n matrix and n is odd, prove that det A = 0.
Find all values of k for which
In Questions 1 and 2, show that x is an eigenvector of A and find the corresponding eigenvalue. 1. 2.
If x is an eigenvector of A with eigenvalue λ = 3, show that x is also an eigenvector of A² − 5A + 2I. What is the corresponding eigenvalue?
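A worked step showing where the eigenvalue comes from: if Ax = 3x then A²x = 9x, so
\[ (A^2 - 5A + 2I)\,x = 9x - 15x + 2x = -4x, \]
i.e. x is an eigenvector of A² − 5A + 2I with eigenvalue −4.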
In Exercises 1-3, determine which sets of vectors are orthogonal. 1. 2. 3.
In Exercises 1-3, determine whether the given orthogonal set of vectors is orthonormal. If it is not, normalize the vectors to form an orthonormal set. 1. 2. 3.
In Exercises 1-3, determine whether the given matrix is orthogonal. If it is, find its inverse. 1. 2. 3.
Let Q be an orthogonal 2 × 2 matrix and let x and y be vectors in ℝ². If θ is the angle between x and y, prove that the angle between Qx and Qy is also θ. (This proves that the linear transformations defined by orthogonal matrices are angle-preserving in ℝ², a fact that is true in general.)
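A sketch of why this works, using the standard dot-product formula for the angle: since QᵀQ = I,
\[ Qx \cdot Qy = (Qx)^T(Qy) = x^T Q^T Q\, y = x \cdot y, \qquad \|Qx\| = \|x\|, \quad \|Qy\| = \|y\|, \]
so the cosine of the angle, (x · y)/(‖x‖‖y‖), is unchanged by Q.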
(a) Prove that an orthogonal 2 × 2 matrix must have the form … where … is a unit vector. (b) Using part (a), show that every orthogonal 2 × 2 matrix is of the form … where 0 ≤ θ < 2π.
Let A and B be n × n orthogonal matrices. (a) Prove that A(Aᵀ + Bᵀ)B = A + B. (b) Use part (a) to prove that, if det A + det B = 0, then A + B is not invertible.
Let
Prove that if an upper triangular matrix is orthogonal, then it must be a diagonal matrix.
Prove that if n > m, then there is no m × n matrix A such that ‖Ax‖ = ‖x‖ for all x in ℝⁿ.
x · y = (x · v1)(y · v1) + (x · v2)(y · v2) + ⋯ + (x · vn)(y · vn) (This identity is called Parseval's Identity.) (b) What does Parseval's Identity imply about the relationship between the dot products x · y and [x]B · [y]B?
In Exercises 1-3, show that the given vectors form an orthogonal basis for ℝ² or ℝ³. Then use Theorem 5.2 to express w as a linear combination of these basis vectors. Give the coordinate vector [w]B of w with respect to the basis B = {v1, v2} of ℝ² or B = {v1, v2, v3} of ℝ³. 1. 2. 3.
In Exercises 1-3, find the orthogonal complement W⊥ of W and give a basis for W⊥. 1. 2. 3.
In Exercises 1-3, let W be the subspace spanned by the given vectors. Find a basis for W⊥. 1. 2. 3.
In Exercises 1-3, find the orthogonal projection of v onto the subspace W spanned by the vectors ui. (You may assume that the vectors ui are orthogonal.) 1. 2. 3.
In Exercises 1-3, find the orthogonal decomposition of v with respect to W. 1. 2. 3.
Either prove that it is true or find a counterexample.
Let {v1, …, vn} be an orthogonal basis for ℝⁿ and let W = span(v1, …, vk). Is it necessarily true that W⊥ = span(vk+1, …, vn)? Either prove that it is true or find a counterexample.
1. Prove that x is in W if and only if projW(x) = x. 2. Prove that x is orthogonal to W if and only if projW(x) = 0. 3. Prove that projW(projW(x)) = projW(x).
(a) Prove that ‖x‖² ≥ (x · v1)² + (x · v2)² + ⋯ + (x · vk)². (This inequality is called Bessel's Inequality.) (b) Prove that Bessel's Inequality is an equality if and only if x is in span(S).
In Exercises 1 and 2, find bases for the row space and null space of A. Verify that every vector in row(A) is orthogonal to every vector in null(A). 1. 2.
Find an orthogonal basis for that contains the vector
Find an orthogonal basis for that contains the vectors
In Exercises 1 and 2, fill in the missing entries of Q to make Q an orthogonal matrix. 1. 2.
In Exercises 1 and 2, the columns of Q were obtained by applying the Gram-Schmidt Process to the columns of A. Find the upper triangular matrix R such that A = QR. 1. 2.
Prove that A is invertible if and only if A = QR, where Q is orthogonal and R is upper triangular with nonzero entries on its diagonal.
Let A be an m × n matrix with linearly independent columns. Give an alternative proof that the upper triangular matrix R in a QR factorization of A must be invertible, using property (c) of the Fundamental Theorem.
Let A be an m × n matrix with linearly independent columns and let A = QR be a QR factorization of A. Show that A and Q have the same column space.
In Exercises 1 and 2, find the orthogonal decomposition of v with respect to the subspace W. 1. W as in Exercise 5 2. W as in Exercise 6
Use the Gram-Schmidt Process to find an orthogonal basis for the column spaces of the matrices in Exercises 1 and 2. 1. 2.
Orthogonally diagonalize the matrices in Exercises 1-3 by finding an orthogonal matrix Q and a diagonal matrix D such that QᵀAQ = D. 1. 2. 3.
If b ≠ 0, orthogonally diagonalize
If b ≠ 0, orthogonally diagonalize
Let A and B be orthogonally diagonalizable n × n matrices and let c be a scalar. Use the Spectral Theorem to prove that the following matrices are orthogonally diagonalizable: (a) A + B (b) cA (c) A²
If A is an invertible matrix that is orthogonally diagonalizable, show that A⁻¹ is orthogonally diagonalizable.
If A is a symmetric matrix, show that every eigenvalue of A is nonnegative if and only if A = B² for some symmetric matrix B.
In Exercises 1 and 2, find a symmetric 2 × 2 matrix with eigenvalues λ1 and λ2 and corresponding orthogonal eigenvectors v1 and v2. 1. 2.
In Exercises 1 and 2, find a symmetric 3 × 3 matrix with eigenvalues λ1, λ2, and λ3 and corresponding orthogonal eigenvectors v1, v2, and v3. 1. 2.
(a) Show that the matrix of the orthogonal projection onto W is given by
Let A be an n × n real matrix, all of whose eigenvalues are real. Prove that there exist an orthogonal matrix Q and an upper triangular matrix T such that QᵀAQ = T. This very useful result is known as Schur's Triangularization Theorem. [Hint: Adapt the proof of the Spectral Theorem.]
Let A be a nilpotent matrix (see Exercise 56 in Section 4.2). Prove that there is an orthogonal matrix Q such that QᵀAQ is upper triangular with zeros on its diagonal. [Hint: Use Exercise 27.]
In Exercises 1-3, evaluate the quadratic form f(x) = xᵀAx for the given A and x. 1. 2. 3.
Diagonalize the quadratic forms in Exercises 1-3 by finding an orthogonal matrix Q such that the change of variable x = Qy transforms the given form into one with no cross-product terms. Give Q and the new quadratic form. 1. 2. x² + 8xy + y² 3.
Classify each of the quadratic forms in Exercises 1-3 as positive definite, positive semidefinite, negative definite, negative semidefinite, or indefinite. 1. 2. 3. −2x² − 2y² + 2xy
Let B be an invertible matrix. Show that A = BᵀB is positive definite.
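A minimal sketch: BᵀB is symmetric, and for any x ≠ 0,
\[ x^T (B^T B)\, x = (Bx)^T (Bx) = \|Bx\|^2 > 0, \]
since B is invertible and therefore Bx ≠ 0.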
Let A be a positive definite symmetric matrix. Show that there exists an invertible matrix B such that A = BᵀB. [Hint: Use the Spectral Theorem to write A = QDQᵀ. Then show that D can be factored as CᵀC for some invertible matrix C.]
Let A and B be positive definite symmetric n × n matrices and let c be a positive scalar. Show that the following matrices are positive definite. (a) cA (b) A² (c) A + B (d) A⁻¹ (First show that A is necessarily invertible.)
Let A be a positive definite symmetric matrix. Show that there is a positive definite symmetric matrix B such that A = B². (Such a matrix B is called a square root of A.)
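A hedged sketch along the lines of the hint two exercises above: by the Spectral Theorem, A = QDQᵀ with D = diag(λ1, …, λn) and every λi > 0. Taking
\[ B = Q\, \mathrm{diag}\!\left(\sqrt{\lambda_1}, \ldots, \sqrt{\lambda_n}\right) Q^T \]
gives a positive definite symmetric matrix with B² = QDQᵀ = A.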
In Exercises 1-3, identify the graph of the given equation. 1. x² + 5y² = 25 2. x² − y² − 4 = 0 3. x² − y − 1 = 0
In Exercises 1-3, use a translation of axes to put the conic in standard position. Identify the graph, give its equation in the translated coordinate system, and sketch the curve. 1. x² + y² − 4x − 4y + 4 = 0 2. 4x² + 2y² − 8x + 12y + 6 = 0 3. 9x² − 4y² − 4y = 37
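For instance, completing the square in the first equation gives
\[ x^2 + y^2 - 4x - 4y + 4 = 0 \;\Longleftrightarrow\; (x-2)^2 + (y-2)^2 = 4, \]
a circle of radius 2 centered at (2, 2), i.e. x′² + y′² = 4 in the translated coordinates x′ = x − 2, y′ = y − 2.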
In Exercises 1-3, use a rotation of axes to put the conic in standard position. Identify the graph, give its equation in the rotated coordinate system, and sketch the curve. 1. x² + xy + y² = 6 2. 4x² + 10xy + 4y² = 9 3. 4x² + 6xy − 4y² = 5
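As an illustration of the method for the first form: x² + xy + y² = xᵀAx with A = [[1, 1/2], [1/2, 1]], whose eigenvalues are 3/2 and 1/2. With Q an orthogonal matrix of unit eigenvectors and x = Qy, where y holds the rotated coordinates (x′, y′), the equation becomes
\[ \tfrac{3}{2}x'^2 + \tfrac{1}{2}y'^2 = 6, \qquad\text{i.e.}\qquad \frac{x'^2}{4} + \frac{y'^2}{12} = 1, \]
an ellipse.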
In Exercises 1-3, identify the conic with the given equation and give its equation in standard form. 1. 3x² − 4xy + 3y² − 28√2 x + 22√2 y + 84 = 0 2. 6x² − 4xy + 9y² − 20x − 10y − 5 = 0 3. 2xy + 2√2 x − 1 = 0
Sometimes the graph of a quadratic equation is a straight line, a pair of straight lines, or a single point. We refer to such a graph as a degenerate conic. It is also possible that the equation is not satisfied for any values of the variables, in which case there is no graph at all and we refer to the equation as an imaginary conic.
Let A be a symmetric 2 × 2 matrix and let k be a scalar. Prove that the graph of the quadratic equation xᵀAx = k is (a) a hyperbola if k ≠ 0 and det A < 0 (b) an ellipse, circle, or imaginary conic if k ≠ 0 and det A > 0 (c) a pair of straight lines or an imaginary conic if k ≠ 0 and det
In Exercises 1-3, identify the quadric with the given equation and give its equation in standard form. 1. 4x² + 4y² + 4z² + 4xy + 4xz + 4yz = 8 2. x² + y² + z² − 4yz = 1 3. −x² − y² − z² + 4xy + 4xz + 4yz = 12
In Exercises 1-3, find the symmetric matrix A associated with the given quadratic form. 1. 2. x1x2 3. 3x² − 3xy − y²
Let A be a real 2 × 2 matrix with complex eigenvalues λ = a ± bi such that b ≠ 0 and |λ| = 1. Prove that every trajectory of the dynamical system xk+1 = Axk lies on an ellipse. [Hint: Theorem 4.43 shows that if v is an eigenvector corresponding to λ = a − bi, then the
Mark each of the following statements true or false: (a) Every orthonormal set of vectors is linearly independent. (c) If A is a square matrix with orthonormal rows, then A is an orthogonal matrix. (d) Every orthogonal matrix is invertible. (e) If A is a matrix with det A = 1, then A is an orthogonal matrix.
x = t, y = 2t, z = −t
W = span
W = span
Find bases for each of the four fundamental subspaces of
Find the orthogonal decomposition of the given vector with respect to the given subspace.
(a) Apply the Gram-Schmidt Process to x1, x2, x3 to find an orthogonal basis for W = span{x1, x2, x3}. (b) Use the result of part (a) to find a QR factorization of
Find an orthogonal basis for that contains the vectors
Find an orthogonal basis for the subspace
Find a symmetric matrix with eigenvalues λ1 = λ2 = 1, λ3 = −2 and eigenspaces
Find all values of a and b such that the given set is an orthogonal set of vectors.
Prove that A is a symmetric matrix with eigenvalues c1, c2, …, cn and corresponding eigenvectors v1, v2, …, vn.