Questions and Answers of Linear Algebra and Its Applications
Solve Ax = b and A(Δx) = Δb, and show that inequality (2) holds in each case. A = [7 −6 −4 7; −5 1 1 −3; 10 11 0 7; 19 9 −2 1], b = (.100, 2.888, −1.404, 1.462), Δb = 10⁻⁴ × (.49, −1.28, 5.78, 8.04)
Let {v1, v2} be an orthogonal set of nonzero vectors, and let c1, c2 be any nonzero scalars. Show that {c1v1, c2v2} is also an orthogonal set. Since orthogonality of a set is defined in terms of
First change the given pattern into a vector w of zeros and ones and then use the method illustrated in Example 5 to find a matrix M so that wᵀMw = 0, but uᵀMu ≠ 0 for all other nonzero vectors
Let W be a subspace of Rn with an orthogonal basis {w1, ..., wp}, and let {v1, ..., vq} be an orthogonal basis for W⊥. a. Explain why {w1, ..., wp, v1, ..., vq} is an orthogonal set. b.
Show that if x is in both W and W⊥, then x = 0.
Let U be an orthogonal matrix, and construct V by interchanging some of the columns of U. Explain why V is an orthogonal matrix.
Let U and V be n × n orthogonal matrices. Explain why UV is an orthogonal matrix. [That is, explain why UV is invertible and its inverse is (UV)T.]
Verify the parallelogram law for vectors u and v in Rn: ||u + v||² + ||u − v||² = 2||u||² + 2||v||²
Let W = Span{v1, ..., vp}. Show that if x is orthogonal to each vj, for 1 ≤ j ≤ p, then x is orthogonal to every vector in W.
Let U be an n × n orthogonal matrix. Show that the rows of U form an orthonormal basis of Rn.
Let V be the space C[−1, 1] with the inner product of Example 7. Find an orthogonal basis for the subspace spanned by the polynomials 1, t, and t². The polynomials in this basis are called Legendre polynomials.
Refer to V = C[0, 1], with the inner product given by an integral, as in Example 7. Compute ||g|| for g in Exercise 28. Data from Exercise 28: Refer to V = C[0, 1], with the inner product given by an
Suppose a vector y is orthogonal to vectors u and v. Show that y is orthogonal to the vector u + v.
Let u = (3, −4, −1) and v = (−8, −7, 4). Compute and compare u · v, ||u||², ||v||², and ||u + v||². Do not use the Pythagorean Theorem.
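A quick way to check this kind of computation is with a few lines of code. The sketch below (plain Python, no libraries) uses the vectors u = (3, −4, −1) and v = (−8, −7, 4) as read from the exercise:

```python
# Compare u.v, ||u||^2, ||v||^2, and ||u + v||^2 for two vectors in R^3.
u = [3, -4, -1]
v = [-8, -7, 4]

dot = sum(a * b for a, b in zip(u, v))               # u . v
norm2_u = sum(a * a for a in u)                      # ||u||^2
norm2_v = sum(b * b for b in v)                      # ||v||^2
norm2_sum = sum((a + b) ** 2 for a, b in zip(u, v))  # ||u + v||^2

print(dot, norm2_u, norm2_v, norm2_sum)  # -> 0 26 129 155
# Since u . v = 0, ||u + v||^2 = ||u||^2 + ||v||^2 (26 + 129 = 155),
# which is exactly the Pythagorean relation the exercise is probing.
```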
Suppose W is a subspace of Rn spanned by n nonzero orthogonal vectors. Explain why W = Rn.
Concern the (real) Schur factorization of an n × n matrix A in the form A = URUᵀ, where U is an orthogonal matrix and R is an n × n upper triangular matrix. Show that if A admits a (real) Schur factorization
Find a formula for the least-squares solution of Ax = b when the columns of A are orthonormal.
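For the exercise above: if the columns of A are orthonormal, then AᵀA = I, so the normal equations AᵀAx̂ = Aᵀb collapse to x̂ = Aᵀb. A minimal plain-Python sketch (the 3 × 2 matrix and right-hand side below are made-up example data, not from any exercise):

```python
from math import sqrt

# A has orthonormal columns, so A^T A = I and the least-squares
# solution of Ax = b is simply x_hat = A^T b.
A = [[1 / sqrt(2), 0.0],
     [1 / sqrt(2), 0.0],
     [0.0,         1.0]]
b = [1.0, 3.0, 5.0]

m, n = len(A), len(A[0])
x_hat = [sum(A[i][j] * b[i] for i in range(m)) for j in range(n)]  # A^T b
print(x_hat)  # first entry is 4/sqrt(2), second is 5.0 (up to rounding)
```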
Suppose A is m × n with linearly independent columns and b is in Rm. Use the normal equations to produce a formula for b̂, the projection of b onto Col A.
All vectors are in Rn. Justify each answer.(T/F) An orthogonal matrix is invertible.
Explain why an equation Ax = b has a solution if and only if b is orthogonal to all solutions of the equation Aᵀx = 0.
Use the steps below to prove the following relations among the four fundamental subspaces determined by an m × n matrix A: Row A = (Nul A)⊥, Col A = (Nul Aᵀ)⊥. a. Show that Row A is contained in (Nul A)⊥.
Let A be an m × n matrix. Prove that every vector x in Rn can be written in the form x = p + u, where p is in Row A and u is in Nul A. Also, show that if the equation Ax = b is consistent, then
All vectors are in Rn. Justify each answer.(T/F) If L is a line through 0 and if ŷ is the orthogonal projection of y onto L, then ||ŷ|| gives the distance from y to L.
All vectors and subspaces are in Rn. Justify each answer.(T/F) If an n × p matrix U has orthonormal columns, then UUᵀx = x for all x in Rn.
All vectors are in Rn. Justify each answer.(T/F) The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c ≠ 0.
Let A be an m × n matrix whose columns are linearly independent. a. Use Exercise 27 to show that AᵀA is an invertible matrix. b. Explain why A must have at least as many rows as columns. c. Determine
Use Exercise 27 to show that rank AᵀA = rank A. Data from Exercise 27: Let A be an m × n matrix. Use the steps below to show that a vector x in Rn satisfies Ax = 0 if and only if AᵀAx = 0. This will
Consider the problem of finding an eigenvalue of an n × n matrix A when an approximate eigenvector v is known. Since v is not exactly correct, the equation Av = λv will probably not have a solution.
Use the method in this section to produce a QR factorization of the matrix in Exercise 28. Data from Exercise 28: A = [−10 13 7 −11; 2 1 −5 3; −6 3 13 −3; 16 −16 −2 5; 2 1 −5 −7]
All vectors and subspaces are in Rn. Justify each answer.(T/F) If the columns of an n × p matrix U are orthonormal, then UUᵀy is the orthogonal projection of y onto the column space of U.
All vectors are in Rn. Justify each answer.(T/F) A matrix with orthonormal columns is an orthogonal matrix.
Refer to vectors in Rn (or Rm) with the standard inner product. Justify each answer.(T/F) If two vectors are orthogonal, they are linearly independent.
Refer to V = C[0, 1], with the inner product given by an integral, as in Example 7. Compute ||f|| for f in Exercise 27. Data from Exercise 27: Refer to V = C[0, 1], with the inner product given by an
Use the Gram–Schmidt process as in Example 2 to produce an orthogonal basis for the column space of A = [−10 13 7 −11; 2 1 −5 3; −6 3 13 −3; 16 −16 −2 5; 2 1 −5 −7]
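For readers who want to experiment numerically, the Gram–Schmidt process itself can be sketched in a few lines of plain Python (the two 3-vectors below are illustrative data, not the columns from the exercise):

```python
def gram_schmidt(vectors):
    """Produce an orthogonal basis for the span of the given vectors."""
    basis = []
    for x in vectors:
        # Subtract the projection of x onto each earlier basis vector.
        for v in basis:
            coef = sum(a * b for a, b in zip(x, v)) / sum(b * b for b in v)
            x = [a - coef * b for a, b in zip(x, v)]
        if any(abs(a) > 1e-12 for a in x):  # skip (numerically) zero vectors
            basis.append(x)
    return basis

cols = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
basis = gram_schmidt(cols)
print(basis)  # -> [[1.0, 1.0, 0.0], [0.5, -0.5, 1.0]], an orthogonal pair
```

Applying the same function to the four columns of the exercise's matrix would yield the orthogonal basis the exercise asks for.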
If a, b, and c are distinct numbers, then the following system is inconsistent because the graphs of the equations are parallel planes. Show that the set of all least squares solutions of the system
Let u = (a, b) and v = (1, 1). Use the Cauchy–Schwarz inequality to show that ((a + b)/2)² ≤ (a² + b²)/2.
Refer to V = C[0, 1], with the inner product given by an integral, as in Example 7. Compute ⟨f, g⟩, where f(t) = 5t − 2 and g(t) = 7t³ − 6t². EXAMPLE 7 For f, g in C[a, b], set ⟨f, g⟩ = ∫ₐᵇ f(t)g(t) dt
All vectors and subspaces are in Rn. Justify each answer.(T/F) In the Orthogonal Decomposition Theorem, each term in formula (2) for ŷ is itself an orthogonal projection of y onto a subspace of W.
a. Let {v1, v2, ..., vp} be a linearly independent set of vectors in Rn that is not necessarily orthogonal. Describe how to find the best approximation to z in Rn by vectors in W = Span{v1,
Let A be an m × n matrix such that AᵀA is invertible. Show that the columns of A are linearly independent. [Careful: You may not assume that A is invertible; it may not even be square.]
Refer to V = C[0, 1], with the inner product given by an integral, as in Example 7. Compute ⟨f, g⟩, where f(t) = 1 − 3t² and g(t) = t − t³. EXAMPLE 7 For f, g in C[a, b], set ⟨f, g⟩ = ∫ₐᵇ f(t)g(t) dt
All vectors are in Rn. Justify each answer.(T/F) For an m × n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.
All vectors and subspaces are in Rn. Justify each answer.(T/F) If y = z1 + z2, where z1 is in a subspace W and z2 is in W⊥, then z1 must be the orthogonal projection of y onto W.
All vectors are in Rn. Justify each answer.(T/F) If the columns of an m × n matrix A are orthonormal, then the linear mapping x → Ax preserves lengths.
Justify the equation SS(T) = SS(R) + SS(E). This equation is extremely important in statistics, both in regression theory and in the analysis of variance.Involve a design matrix X with two or more
All vectors are in Rn. Justify each answer.(T/F) For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.
Let A be an m × n matrix such that the matrix AᵀA is invertible. Let x̂1 and x̂2 be the least-squares solutions of the equations Ax = b1 and Ax = b2, respectively. Show that c1x̂1 + c2x̂2 is a least-squares solution of Ax = c1b1 + c2b2.
Given A = QR as in Theorem 12, describe how to find an orthogonal m × m (square) matrix Q1 and an invertible n × n upper triangular matrix R such that ... The MATLAB qr command supplies this "full" QR factorization.
Suppose A = QR is a QR factorization of an m × n matrix A (with linearly independent columns). Partition A as [A1 A2], where A1 has p columns. Show how to obtain a QR factorization of A1.
Let A be an m × n matrix. Use the steps below to show that a vector x in Rn satisfies Ax = 0 if and only if AᵀAx = 0. This will show that Nul A = Nul AᵀA. a. Show that if Ax = 0, then AᵀAx = 0. b.
A is an m × n matrix and b is in Rm. Justify each answer.(T/F) If A has a QR factorization, say A = QR, then the best way to find the least-squares solution of Ax = b is to compute x̂ = R⁻¹Qᵀb.
All vectors are in Rn. Justify each answer.(T/F) If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.
Use a matrix inverse to solve the system of equations in (7) and thereby obtain formulas for β̂0 and β̂1 that appear in many statistics texts. nβ̂0 + β̂1Σx = Σy; β̂0Σx + β̂1Σx² = Σxy
Given a ≥ 0 and b ≥ 0, let u = (√a, √b) and v = (√b, √a). Use the Cauchy–Schwarz inequality to compare the geometric mean √(ab) with the arithmetic mean (a + b)/2.
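Assuming the usual reading of this exercise (u = (√a, √b), v = (√b, √a)), Cauchy–Schwarz gives |u · v| = 2√(ab) ≤ ||u|| ||v|| = a + b, which is exactly √(ab) ≤ (a + b)/2. A numeric spot-check in plain Python:

```python
from math import sqrt

def check(a, b):
    # u = (sqrt(a), sqrt(b)), v = (sqrt(b), sqrt(a))
    u = (sqrt(a), sqrt(b))
    v = (sqrt(b), sqrt(a))
    dot = u[0] * v[0] + u[1] * v[1]                # = 2*sqrt(a*b)
    norms = sqrt(u[0] ** 2 + u[1] ** 2) * sqrt(v[0] ** 2 + v[1] ** 2)  # = a + b
    assert dot <= norms + 1e-12                    # Cauchy-Schwarz holds
    return sqrt(a * b) <= (a + b) / 2 + 1e-12      # geometric <= arithmetic mean

print(all(check(a, b) for a, b in [(0, 5), (1, 1), (2, 8), (3.5, 9.25)]))  # -> True
```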
Let u1,...,up be an orthogonal basis for a subspace W of Rn, and let T : Rn → Rn be defined by T(x) = projw x. Show that T is a linear transformation.
Let T : Rn → Rn be a linear transformation that preserves lengths; that is, ||T(x)|| = ||x|| for all x in Rn. a. Show that T also preserves orthogonality; that is, T(x) · T(y) = 0 whenever x · y = 0.
a. Rewrite the data in Example 1 with new x-coordinates in mean deviation form. Let X be the associated design matrix. Why are the columns of X orthogonal?b. Write the normal equations for the data
All vectors are in Rn. Justify each answer.(T/F) For any scalar c, u • (cv) = c(u • v).
All vectors and subspaces are in Rn. Justify each answer.(T/F) If W is a subspace of Rn and if v is in both W and W⊥, then v must be the zero vector.
A is an m × n matrix and b is in Rm. Justify each answer.(T/F) The normal equations always provide a reliable method for computing least-squares solutions.
All vectors are in Rn. Justify each answer.(T/F) If a set S = {u1,..., up} has the property that ui • uj = 0 whenever i ≠ j, then S is an orthonormal set.
All vectors and subspaces are in Rn. Justify each answer.(T/F) The best approximation to y by elements of a subspace W is given by the vector y - projwy.
A Householder matrix, or an elementary reflector, has the form Q = I − 2uuᵀ, where u is a unit vector. Show that Q is an orthogonal matrix. Show that Qv = −v if v is in Span{u}, and Qv = v if v is orthogonal to u.
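The two claims in this exercise are easy to check numerically. A plain-Python sketch with the example unit vector u = (0.6, 0.8) (chosen so that ||u|| = 1 exactly):

```python
def householder(u):
    """Q = I - 2 u u^T for a unit vector u (plain-Python sketch)."""
    n = len(u)
    return [[(1.0 if i == j else 0.0) - 2.0 * u[i] * u[j] for j in range(n)]
            for i in range(n)]

def matvec(Q, v):
    return [sum(Q[i][j] * v[j] for j in range(len(v))) for i in range(len(Q))]

u = [0.6, 0.8]          # a unit vector: 0.36 + 0.64 = 1
Q = householder(u)
print(matvec(Q, u))     # Qu = -u, i.e. about (-0.6, -0.8), up to rounding
w = [-0.8, 0.6]         # w is orthogonal to u
print(matvec(Q, w))     # Qw = w, up to rounding: the reflector fixes u-perp
```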
All vectors are in Rn. Justify each answer.(T/F) If y is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.
All vectors are in Rn. Justify each answer.(T/F) For any scalar c, ||cv|| = c||v||.
Suppose A = QR, where R is an invertible matrix. Show that A and Q have the same column space.
Suppose the x-coordinates of the data (x1, y1), ..., (xn, yn) are in mean deviation form, so that Σxi = 0. Show that if X is the design matrix for the least-squares line in this case, then XᵀX is a diagonal matrix.
All vectors are in Rn. Justify each answer.(T/F) If x is orthogonal to every vector in a subspace W, then x is in W⊥.
All vectors and subspaces are in Rn. Justify each answer.(T/F) If y is in a subspace W, then the orthogonal projection of y onto W is y itself.
Let {v1, ..., vp} be an orthonormal set in Rn. Verify the following inequality, called Bessel’s inequality, which is true for each x in Rn: ||x||² ≥ |x·v1|² + |x·v2|² + ... + |x·vp|²
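As a sanity check, Bessel’s inequality can be verified numerically for a small orthonormal set; the set {v1, v2} and the vector x below are made-up example data:

```python
from math import sqrt

# {v1, v2} is an orthonormal set in R^3; x is an arbitrary test vector.
v1 = [1 / sqrt(2), 1 / sqrt(2), 0.0]
v2 = [0.0, 0.0, 1.0]
x = [3.0, -1.0, 2.0]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

lhs = dot(x, x)                          # ||x||^2 = 14
rhs = dot(x, v1) ** 2 + dot(x, v2) ** 2  # (x.v1)^2 + (x.v2)^2, about 2 + 4 = 6
print(lhs, rhs)                          # Bessel: lhs >= rhs, with slack
# because {v1, v2} does not span all of R^3.
```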
Determine which sets of vectors are orthonormal. If a set is only orthogonal, normalize the vectors to produce an orthonormal set. 1/√18, 4/√18, 1/√18, 1/2, −1/2, −2/3, 1/3, −2/3
All vectors are in Rn. Justify each answer.(T/F) Not every orthogonal set in Rn is linearly independent.
All vectors and subspaces are in Rn. Justify each answer.(T/F) The orthogonal projection ŷ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute ŷ.
A is an m × n matrix and b is in Rm. Justify each answer.(T/F) A least-squares solution of Ax = b is a list of weights that, when applied to the columns of A, produces the orthogonal projection of b onto Col A.
A is an m x n matrix and b is in Rm. Justify each answer.(T/F) The least-squares solution of Ax = b is the point in the column space of A closest to b.
Show that if U is an orthogonal matrix, then any real eigenvalue of U must be ±1.
All vectors are in Rn. Justify each answer.(T/F) If vectors v1, ..., vp span a subspace W and if x is orthogonal to each vj for j = 1, ..., p, then x is in W⊥.
Let u1 and u2 be as in Exercise 19, and let ... It can be shown that u4 is not in the subspace W spanned by u1 and u2. Use this fact to construct a nonzero vector v in R3 that is orthogonal to u1 and u2.
Suppose A = QR, where Q is m × n and R is n × n. Show that if the columns of A are linearly independent, then R must be invertible.
Show that if an n × n matrix U satisfies (Ux) · (Uy) = x · y for all x and y in Rn, then U is an orthogonal matrix.
All vectors and subspaces are in Rn. Justify each answer.(T/F) In a QR factorization, say A = QR (when A has linearly independent columns), the columns of Q form an orthonormal basis for the column space of A.
Determine which sets of vectors are orthonormal. If a set is only orthogonal, normalize the vectors to produce an orthonormal set. 4/3, 7/3, 7/3, 4/3, −4/3, 0
A is an m x n matrix and b is in Rm. Justify each answer.(T/F) If the columns of A are linearly independent, then the equation Ax = b has exactly one least-squares solution.
All vectors are in Rn. Justify each answer.(T/F) If ||u||² + ||v||² = ||u + v||², then u and v are orthogonal.
All vectors and subspaces are in Rn. Justify each answer.(T/F) For each y and each subspace W, the vector y - projwy is orthogonal to W.
All vectors are in Rn. Justify each answer.(T/F) Not every linearly independent set in Rn is an orthogonal set.
According to Kepler’s first law, a comet should have an elliptic, parabolic, or hyperbolic orbit (with gravitational attractions from the planets ignored). In suitable polar coordinates, the
Let ... Note that u1 and u2 are orthogonal but that u3 is not orthogonal to u1 or u2. It can be shown that u3 is not in the subspace W spanned by u1 and u2. Use this fact to construct a nonzero vector v
Let y = (7, 3) and u1 = (1/√10, −3/√10), and W = Span{u1}. a. Let U be the 2 × 1 matrix whose only column is u1. Compute UᵀU and UUᵀ. b. Compute projw y and (UUᵀ)y.
Let U be an n × n orthogonal matrix. Show that if {v1, ..., vn} is an orthonormal basis for Rn, then so is {Uv1, ..., Uvn}.
A is an m × n matrix and b is in Rm. Justify each answer.(T/F) Any solution of AᵀAx = Aᵀb is a least-squares solution of Ax = b.
To measure the takeoff performance of an airplane, the horizontal position of the plane was measured every second, from t = 0 to t = 12. The positions (in feet) were: 0, 8.8, 29.9, 62.0,
All vectors and subspaces are in Rn. Justify each answer.(T/F) If A = QR, where Q has orthonormal columns, then R = QᵀA.
All vectors are in Rn. Justify each answer.(T/F) If the distance from u to v equals the distance from u to −v, then u and v are orthogonal.
All vectors and subspaces are in Rn. Justify each answer.(T/F) If x is not in a subspace W, then x − projw x is not zero.
A healthy child’s systolic blood pressure p (in millimeters of mercury) and weight ω (in pounds) are approximately related by the equation ... Use the following experimental data to estimate the
Suppose radioactive substances A and B have decay constants of .02 and .07, respectively. If a mixture of these two substances at time t = 0 contains MA grams of A and MB grams of B, then a model for
All vectors and subspaces are in Rn. Justify each answer.(T/F) If z is orthogonal to u1 and to u2 and if W = Span {u1, u2}, then z must be in W⊥.
A is an m × n matrix and b is in Rm. Justify each answer.(T/F) A least-squares solution of Ax = b is a vector x̂ such that ||b − Ax|| ≤ ||b − Ax̂|| for all x in Rn.
All vectors are in Rn. Justify each answer.(T/F) u • v - v • u = 0.
Showing 700 - 800 of 2243