Linear Final True or False

  1. If a linear system has four equations and seven variables, then it must have infinitely many solutions.

    1. False. A system with more variables than equations can still be inconsistent.

  2. The plane 2x + y - z = 3 contains the point (1, 1, 1).

    1. False. Plugging in x = 1, y = 1, z = 1 gives 2(1) + 1 - 1 = 2 ≠ 3.
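
The substitution above takes two lines of code (plain Python; the point and plane are taken from the question):

```python
# Evaluate the left-hand side of 2x + y - z = 3 at the point (1, 1, 1).
x, y, z = 1, 1, 1
lhs = 2*x + y - z
print(lhs, lhs == 3)  # prints "2 False": the point is not on the plane
```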

  3. The set of all solutions (x, y, z) to the equation x - y - z = 0 is a line in R3.

    1. False. This would have one pivot and two free variable columns, resulting in a plane.

  4. The equation 3x + ln(2)y = π is a linear equation in x and y.

    1. True. ln(2) and π are both constants, so the equation is in the linear form
      ax + by = c, where a, b, and c are constants. 

  5. It is possible for a single linear equation with 2 unknowns to have exactly one solution. 

    1. False. Consider that the matrix will have 1 row and 2 columns. There must be a free variable column, and therefore there is not a unique solution. 

  6. A system of six equations with eight unknowns corresponds to a matrix in row echelon form. Therefore, the largest possible number of pivots is 8. 

    1. False. The augmented matrix is 6x9, with only 6 rows. Each row contains at most one pivot, so there are at most 6 pivots.

  7. A "tall matrix" (more rows than columns, m > n) can never have a pivot in every row. 

    1. True. Each column contains at most one pivot, so there are at most n pivots, which is fewer than the m rows.

  8. Two non-collinear vectors, whether in R2, R3, R4, R5, etc. will always span a plane. 

    1. True. Non-collinear vectors are linearly independent, so their span is two-dimensional, i.e. a plane, in any Rn.

  9. If the bottom row of an augmented matrix in reduced row echelon form is (0, 1, 3, 1), then the system has no solution.

    1. False. The row is not of the form (0, 0, 0 | nonzero), so the system is consistent; and since the third column contains no pivot, there is a free variable, so there are infinitely many solutions.

  10. The following matrices are in reduced row echelon form (RREF):

      1. False. Pivot columns should not contain entries other than the pivot entry.

      2. True.

      3. True.

      4. False. This matrix is in row echelon form, but not reduced row echelon form.

  11. If the solution to a system of linear equations is given by (4 - 2z, -3 + z, z), then (4, -3, 0) is a solution to the system. 

    1. True. Setting the free variable z = 0 in the parametric solution gives (4 - 0, -3 + 0, 0) = (4, -3, 0).
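
A minimal script makes the substitution explicit (plain Python, using the parametric solution from the question):

```python
# The general solution is (4 - 2z, -3 + z, z); the free variable
# z = 0 should produce the particular solution (4, -3, 0).
def solution(z):
    return (4 - 2*z, -3 + z, z)

print(solution(0))  # prints "(4, -3, 0)"
```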

  12. If the bottom row of a matrix in reduced row echelon form contains all 0s to the left of the vertical bar and a nonzero entry to the right, then the system has no solution.

    1. True. The equation corresponding to the bottom row is 0 = the nonzero entry. 

  13. Suppose a1, a2, and a3 are three different nonzero vectors for the following questions. 

    1. Span{a1, a2} contains only the line through a1 and the origin, and the line through a2 and the origin.

      1. False. Span{a1, a2} contains all linear combinations of a1 and a2. If the two vectors do not lie on the same line, then (for example) a1 + a2 lies on neither the line through a1 nor the line through a2, yet it is a linear combination.

    2. The solution set of the linear system whose augmented matrix is [a1 a2 a3 | b] is the same as the solution set of the equation x1a1 + x2a2 + x3a3 = b. 

      1. True. Both the matrix equation and the augmented matrix translate into the same system of linear equations.

    3. There are exactly three vectors in the set {a1, a2, a3}.

      1. True. Note this is a set, not a span. 

    4. Asking whether the linear system corresponding to an augmented matrix [a1 a2 a3 | b] has a solution amounts to asking whether b is in Span{a1, a2, a3}.

      1. True. The augmented matrix [a1 a2 a3 | b] and the vector equation x1a1 + x2a2 + x3a3 = b both translate into the same system of linear equations.

    5. There are exactly three vectors in Span{a1, a2, a3}.

      1. False. There are infinitely many vectors in Span{a1, a2, a3}, for instance, all multiples of a1.

  14. A =  . There is not a solution for every b in R3 given Ax = b.

    1. True. A has only 2 columns but 3 rows, so at most 2 of the 3 rows can contain a pivot; hence the columns of A cannot span R3.

  15. The solution set of a linear system whose augmented matrix is [a1 a2 a3 | b] is the same as the solution set of Ax = b, if A = [a1 a2 a3]. 

    1. True. 

  16. If the equation Ax = b is inconsistent, then b is not in the set spanned by the columns of A.

    1. True. The set of b for which Ax = b is consistent is exactly the span of the columns of A.

  17. The equation Ax = b is referred to as a vector equation.

    1. False. It is called a matrix equation.

  18. Every matrix equation Ax = b corresponds to a vector equation with the same solution set.

    1. True. The vectors are the columns of A, and the coefficients of the vector equation are the entries of x.

  19. If A is an mxn matrix and if the equation Ax = b is inconsistent for some b in Rm, then A cannot have a pivot in every row.

    1.  True. If A had a pivot position in every row, then its columns would span Rm. But the set of b for which Ax = b is consistent is exactly the span of the columns of A.

  20. The equation Ax = b is consistent if the augmented matrix [A | b] has a pivot position in every row.

    1. False. A pivot in the last column of the last row will make the system inconsistent. 

  21. The equation Ax = b is homogeneous if the zero vector is a solution.

    1. True. If x = 0 is a solution, then b = Ax = A0 = 0.

  22. The solution set of a consistent inhomogeneous system Ax = b is obtained by translating the solution set of Ax = 0.

    1. True. If p is any particular solution of Ax = b, then the full solution set is p plus the solution set of Ax = 0.

  23. There is a vector so that the set of solutions to = is the z-axis.

    1. False. 

  24. The homogeneous system Ax = 0 has the trivial solution if and only if the system has at least one free variable.

    1. False. The homogeneous system Ax = 0 always has the trivial solution 0, whether or not it has a free variable.

  25. A homogeneous linear system is always consistent.

    1. True. The zero vector is always a solution to a homogeneous system.

  26. If x is a nontrivial solution of Ax = 0, then every entry of x is nonzero.

    1. False. Only one entry of x need be nonzero.

  27. Let A be a matrix with more rows than columns. Then the columns of A must be linearly dependent. 

    1. False. The columns of A may be linearly independent or dependent, for example: has dependent columns and has independent columns.

  28. Let A be a matrix with linearly independent columns. Then the equation Ax = b has a solution for all b precisely when it is a square matrix.

    1. True. A matrix A has the property "Ax = b has a solution for all b" precisely when A has a pivot in each row. On the other hand, A has linearly independent columns precisely when it has a pivot in each column. The only way for a matrix which has a pivot in each column to also have a pivot in each row is when it is a square matrix.

  29. The columns of a matrix with dimensions m x n, where m < n, must be linearly dependent.

    1. True. It is impossible for such a matrix to have a pivot in each column. Alternatively, such a matrix must give rise to at least one free variable.

  30. The columns of a matrix A are linearly independent if the equation Ax = 0 has the trivial solution.

    1. False. The equation Ax = 0 always admits the trivial solution, whether or not the columns of A are linearly independent.

  31. If S is a set of linearly dependent vectors, then every vector in S can be written as a linear combination of the other vectors in S.

    1. False. In order for S to be linearly dependent, only one vector in S needs to be expressible as a linear combination of the others. The others may be independent. 

  32. Two vectors are linearly dependent if and only if they are collinear.

    1. True. If ax + by = 0, with a ≠ 0 (for instance), then x = -(b/a)y, which says that x and y lie on the same line. Conversely, if x and y lie on the same line, then there exists a ≠ 0 such that x = ay (unless y = 0, in which case swap x and y);
      then ay - x = 0 is an equation of linear dependence.

  33. If a set S of vectors contains fewer vectors than there are entries in the vectors, then the set must be linearly independent.

    1. False. For instance, take S = . 

  34. Let V be the subset of R3 consisting of the vectors with abc = 0.

    1. V contains the 0 vector. 

      1. True. The zero vector's entries are all zero so the product of its entries is 0.

    2. V is closed under vector addition, meaning that if u and v are in V then u + v is in V.

      1. False. Consider u = and v = . Then u + v = . This does not satisfy the requirement that the product of the entries is 0.

    3. V is closed under scalar multiplication, meaning that if u is in V and c is a real number then cu is in V.

      1. True. If u is in V, then the product of its entries is 0, so at least one entry of u is 0. The corresponding entry of cu is also 0, so cu still satisfies the requirement of vectors in V.

    4. V is a subspace of R3.

      1. False. Since V is not closed under addition, it is not a subspace.

  35. The set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of Rn.

    1. True. To solve a system of m equations in n unknowns, we can insert the equations as rows of an m x n matrix, denoted A, and solve the matrix equation Ax = 0. The solution set is the null space of A, which is a subspace of Rn.

  36. The column space of an m x n matrix is a subspace of Rm.

    1. True. By definition, the column space is the span of the columns, and any span is a subspace.

  37. If B is an echelon form of a matrix A, then the pivot columns of B form a basis for the column space of A.

    1. False. A basis of the column space of A consists of the columns of A that correspond to the pivot columns in B.

  38. The null space of an m x n matrix is a subspace of Rm.

    1. False. The null space of an m x n matrix A is the set of all solutions to the matrix equation Ax = 0. A solution of this equation must be a vector in Rn, so the null space is a subspace of Rn.

  39. Any set of n linearly independent vectors in Rn is a basis for Rn.

    1. True. Since Rn has dimension n, we know from the Basis Theorem that any set of n linearly independent vectors in Rn will form a basis of Rn.

  40. Let A be an 8 x 9 matrix, and define the transformation T : Rm → Rn by T(x) = Ax. Then m = 9 and n = 8. 

    1. True. A matrix with 8 rows and 9 columns can only be multiplied with vectors in R9, and the resulting vectors will be in R8.

  41. Let T be a one-to-one matrix transformation from Rn to Rm. Then n < m. 

    1. False. Let A be the matrix for T. If T is one-to-one, then Ax = 0 has only the trivial solution 0. This means that A has no free variables, so that each column of A has a pivot. This can only happen when there are at least as many rows as columns: n ≤ m. 

  42. Let T : R2 → R2 be the function given by = . 

    1. T(cv) = cT(v) for all vectors v and scalars c. 

      1. False. Use c = -1. 

    2. T(u + v) = T(u) + T(v) for all vectors u and v. 

      1. False. Consider u = and v = . 

    3. T is a linear transformation.

      1. False. T fails the additivity requirement from part 2, so it is not linear.

  43.   The following are linear transformations: 

    1.  The transformation T defined by T(x1, x2, x3) = (x1, x2, -x3).

      1. True.

    2.  The transformation T defined by T(x1, x2) = (4x1 — 2x2, 3|x2|).

      1. False. T(0, 1) + T(0, -1) ≠ T(0, 0) = 0.

    3.  The transformation T defined by T(x1, x2) = (2x1 — 3x2, x1 + 4, 5x2). 

      1. False. T(0, 0) = (0, 4, 0) ≠ (0, 0, 0).

    4.  The transformation T defined by T(x1, x2, x3) = (x1, 0, x3).

      1. True. 

    5.  The transformation T defined by T(x1, x2, x3) = (1, x2, x3).

      1. False. T(0, 0, 0) = (1, 0, 0) ≠ (0, 0, 0).
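
The counterexample in part 2 of the previous question (43.2) is easy to check numerically (plain Python; the formula is the one given in the question):

```python
# T(x1, x2) = (4*x1 - 2*x2, 3*|x2|) is not additive because of the
# absolute value in the second component.
def T(x1, x2):
    return (4*x1 - 2*x2, 3*abs(x2))

lhs = tuple(a + b for a, b in zip(T(0, 1), T(0, -1)))
print(lhs, T(0, 0))  # prints "(0, 6) (0, 0)": T(u) + T(v) != T(u + v)
```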

  44. For any matrix A, there exists a matrix B so that A + B = 0.

    1. True. B = (-1)A = -A.

  45. If A is a 5x4 matrix, and B is a 4x3 matrix, then the entry of AB in the 3rd row / 2nd column is obtained by multiplying the 3rd column of A by the 2nd row of B.

    1. False. It is obtained by multiplying the 3rd row of A by the 2nd column of B.
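
A quick NumPy check of the row-times-column rule (the matrix entries below are arbitrary examples, not from the original):

```python
import numpy as np

# Entry (3rd row, 2nd column) of AB is (3rd row of A) . (2nd column
# of B). Zero-based, that is index (2, 1).
A = np.arange(20).reshape(5, 4)   # a 5x4 matrix
B = np.arange(12).reshape(4, 3)   # a 4x3 matrix
entry = A[2, :] @ B[:, 1]
assert entry == (A @ B)[2, 1]
print(entry)  # prints 224
```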

  46.  For any matrix A, we have the equality 2A + 3A = 5A.

    1. True: 2A + 3A = (2 + 3)A = 5A.

  47. For any matrices A and B, if the product AB is defined, then BA is also defined.

    1. False. For instance, A could be a 5x4 matrix, and B could be a 4x3 matrix.

  48. If A is an mxn matrix and B is an nxm matrix then AB and BA are both defined.

    1. True. AB is an mxm matrix and BA is an nxn matrix.

  49. Suppose A and B are invertible nxn matrices. 

    1. (A+B)2 = A2 + B2 + 2AB

      1. False. The order in which the matrices are multiplied together must be respected. (A + B)2 = A(A + B) + B(A + B) = A2 + B2 + AB + BA

    2. A7 is invertible. 

      1. True. A power of an invertible matrix is invertible. 

    3. A + B is invertible. 

      1. False. For example, if B = -A, then B is invertible but A + B = 0, which is not invertible.

    4. (AB)-1 = A-1B-1

      1. False. (AB)-1 = B-1A-1

    5. (In - A)(In + A) = In - A2.

      1. True. Expanding gives In + A - A - A2 = In - A2.
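
Part 4 can be sanity-checked numerically (a NumPy sketch; the two invertible matrices are arbitrary examples):

```python
import numpy as np

# Confirm (AB)^-1 = B^-1 A^-1, and that A^-1 B^-1 differs in general.
A = np.array([[2.0, 1.0], [1.0, 1.0]])  # det = 1, invertible
B = np.array([[1.0, 3.0], [0.0, 1.0]])  # det = 1, invertible

inv_AB = np.linalg.inv(A @ B)
assert np.allclose(inv_AB, np.linalg.inv(B) @ np.linalg.inv(A))
assert not np.allclose(inv_AB, np.linalg.inv(A) @ np.linalg.inv(B))
print("(AB)^-1 = B^-1 A^-1")
```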

  50. A is an nxn matrix. 

    1. If the linear transformation T(x) = Ax is onto, then it is also one-to-one.

      1. True. If T is onto, then every row in A has a pivot. Since A is square, every column also has a pivot, so T is one-to-one.

    2. If the linear transformation T(x) = Ax is one-to-one, then the columns of A form a linearly dependent set.

      1. False. If the linear transformation T(x) = Ax is one-to-one, this means the only vector that can be mapped to the zero vector is the zero vector, meaning the columns of A satisfy the definition of linear independence.

    3. If the equation Ax = 0 has a nontrivial solution, then A has fewer than n pivots.

      1. True. When A has a nontrivial solution, this means there exists at least one free variable. This implies some column does not have a pivot, so there are fewer than n pivots.

    4. If AT is not invertible, then A is also not invertible.

      1. True. By the Invertible Matrix Theorem, AT being invertible is equivalent to A being invertible. Therefore, if AT is not invertible, then A cannot be invertible.

    5. If A is invertible, then the equation Ax = b has exactly one solution for all b in Rn

      1. True. The solution is x = A-1b.

    6. If A2 is row equivalent to the nxn identity matrix, then the columns of A span Rn.

      1. True. If A2 is row equivalent to the identity matrix, then A2 is invertible by the Invertible Matrix Theorem. Since A2 is invertible, A is also invertible, and so its columns span Rn (again by the Invertible Matrix Theorem).

    7. A square matrix with two identical columns can be invertible.

      1. False. If A is invertible, by the Invertible Matrix Theorem, the linear transformation T(x) = Ax is one-to-one. But if two of the columns of A are equal, say the ith column and the jth column, then T(ei) = T(ej) = that column, but ei ≠ ej. Hence if two of the columns are equal, then A is not invertible.

    8. The product of any two invertible matrices is invertible.

      1. True. Let A and B be two invertible nxn matrices. Then A-1 and B-1 exist. Since (AB)(B-1A-1) = A(BB-1)A-1 = AA-1 = In and (B-1A-1)(AB) = B-1(A-1A)B = B-1B = In, the inverse of AB is B-1A-1.

  51. The following linear transformations from R3 to R3 are invertible: true/false. 

    1. Projection onto the y-axis

      1. False. The transformation is not one-to-one (or onto). The points (1, 0, 1) and (2, 0, 2) both map to (0, 0, 0) under the transformation, but they are not equal.

    2. Identity transformation: T(v) = v for all v

      1. True. The inverse of the identity transformation is itself.

    3. Projection onto the xz-plane.

      1. False. The transformation is not one-to-one (or onto). The points (0, 1, 0) and (0, 2, 0) both map to (0, 0, 0) under the transformation, but they are not equal.

    4. Rotation about the z-axis by π.

      1. True.  The inverse is rotation about the z-axis by -π.

    5. Dilation by a factor of 8.

      1. True. The inverse is a dilation by a factor of ⅛.

    6.  Reflection in the origin

      1. True. The inverse of a reflection in the origin is itself.

  52. Suppose A is a 3x3 matrix and λ is a real number with the property that the equation
    Ax = λx is satisfied by some nonzero vector x. 

    1. A - λI is not invertible.

      1. True. 0 = Ax - λx = (A - λI)x and x is not the zero vector, so A - λI is not invertible.

    2. A - λ is invertible. 

      1. False. The expression A - λ does not make sense, since A is a 3x3 matrix and λ is a number.

    3. A is not invertible. 

      1. False. A might still be invertible. 

  53.  A is an nxn matrix. 

    1. The absolute value of the determinant of A equals the volume of the parallelepiped determined by the columns of A.

      1. True. 

    2. The determinant of a triangular matrix is the sum of the entries of the main diagonal.

      1. False. The determinant of a triangular matrix is the product, not the sum, of the entries on the main diagonal.

    3. A determinant of an nxn matrix can be defined as a sum of multiples of determinants of (n - 1) x (n - 1) submatrices.

      1. True. This is referring to the calculation of the determinant using cofactor expansion—choose one of the rows or columns of the matrix, multiply each entry (with its appropriate sign using the checkerboard patterns) of the row/column with the determinant of the corresponding (n - 1) x (n - 1) minor.

    4. The i,j minor of a matrix A is the matrix Aij obtained by deleting row i and column j from A.

      1. True. 

    5. The cofactor expansion of det A along the first row of A is equal to the cofactor expansion of det A along any other row.

      1. True. Cofactor expansion along any row or column always gives the same number.

    6. If the columns of A are linearly independent, then det A = 0. 

      1. False. If the columns of a matrix are linearly independent, then by the invertible matrix theorem, this means the matrix is invertible, which in turn implies that its determinant is non-zero.

    7. A row replacement operation does not affect the determinant of a matrix.

      1. True. 

    8. If det A is zero, then two columns of A must be the same, or all of the elements in a row or column of A are zero.

      1. False. The determinant is zero whenever the columns are linearly dependent; the columns need not be identical and no row or column need be all zeros (for example, one column could be twice another).

    9. If two columns of A are the same, then the determinant of that matrix is zero.

      1. True. If a matrix has two columns which are the same, then its columns are linearly dependent. By the invertible matrix theorem, the matrix is not invertible, and thus its determinant must be 0.

    10. det(A + B) = det(A) + det(B)

      1. False. The determinant is not additive.
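
A tiny counterexample to part 10 (a NumPy sketch; the matrices are arbitrary):

```python
import numpy as np

# det(I + I) = det(2I) = 4, but det(I) + det(I) = 2.
A = np.eye(2)
B = np.eye(2)
print(np.linalg.det(A + B))                 # prints 4.0
print(np.linalg.det(A) + np.linalg.det(B))  # prints 2.0
```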

  54. If a matrix has a determinant that is 0, it is invertible.

    1. False. Having a determinant of zero means it's not invertible.

  55. Row reduction does not change whether or not the determinant is zero.

    1. True. Row operations only multiply the determinant by -1 or by the (nonzero) row-scaling factors, so a zero determinant stays zero and a nonzero determinant stays nonzero.

  56. The determinant of a transposed matrix is the same as that of the original matrix.

    1. True. Think about the cofactor expansion. The fact that it can be done along a column or row makes this true.

  57. Applying a R2 → R2 transformation whose matrix has determinant 2 will double the area of any shape transformed.

    1. True. Determinants are the factors by which matrix transformations scale areas.

  58. Det(AB) = Det(A)・Det(B)

    1. False in general, because A and B might not be square matrices (the determinant is only defined for square matrices).

    2. True if A and B are square.
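
For square matrices the product rule is easy to spot-check (NumPy; the matrices are arbitrary examples):

```python
import numpy as np

# det(AB) should equal det(A) * det(B) for square A and B.
A = np.array([[1.0, 2.0], [3.0, 4.0]])  # det = -2
B = np.array([[0.0, 1.0], [1.0, 1.0]])  # det = -1
lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)
assert np.isclose(lhs, rhs)
print(lhs)  # prints 2.0 (up to floating-point error)
```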

  1. Eigenvectors with different corresponding eigenvalues are always linearly independent

    1. True. If two eigenvectors with different eigenvalues were linearly dependent, each would be a scalar multiple of the other; they would then lie in the same eigenspace and therefore have the same eigenvalue, a contradiction.

  2. An nxn matrix can have more than n eigenvalues 

    1. False, maximum of n.

  3. The zero vector can be an eigenvector sometimes

    1. False. An eigenvector is nonzero by definition.

  4. Every eigenvalue has an infinite number of eigenvectors

    1. True. Multiplying any eigenvector by a nonzero constant gives another eigenvector with the same eigenvalue.

  5. If v1 and v2 are linearly independent eigenvectors of an n × n matrix A, then they must correspond to different eigenvalues 

    1. False, the matrix could have an eigenspace which is 2-D.

  6. The entries on the main diagonal of A are the eigenvalues of A

    1. False. That is only guaranteed for triangular (including diagonal) matrices.

  7. The number λ is an eigenvalue of A if and only if there is a nonzero solution to the equation (A− λI)x = 0. 

    1. True, this comes almost straight from the definition of an eigenvalue

  8. If A is invertible and 2 is an eigenvalue of A, then 1/2 is an eigenvalue of A-1

    1. True. A-1 undoes whatever A does: if Av = 2v, then v = A-1(2v) = 2A-1v, so A-1v = (1/2)v.
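
The reciprocal relationship shows up directly in a computation (NumPy; the matrix is an arbitrary example with eigenvalues 2 and 3):

```python
import numpy as np

# The eigenvalues of A^-1 are the reciprocals of the eigenvalues of A.
A = np.array([[2.0, 0.0], [0.0, 3.0]])          # eigenvalues 2 and 3
vals_inv = np.linalg.eigvals(np.linalg.inv(A))  # expect 1/2 and 1/3
assert np.allclose(sorted(vals_inv), [1/3, 1/2])
print(sorted(vals_inv))
```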

  9. The vector (-1, 1) is an eigenvector of the transformation that reflects across the line y=x

    1. True. The transformation sends (-1, 1) to (1, -1), which is -1 * (-1, 1), so (-1, 1) is an eigenvector with eigenvalue -1.

  10. 0 can sometimes be an eigenvalue

    1. True. For example, any non-invertible matrix has 0 as an eigenvalue.

  11. The 0 vector can sometimes be an eigenvector

    1. False. An eigenvector is nonzero by definition.

  12. If a matrix has n distinct eigenvalues, it will be diagonalizable

    1. True, because it will be guaranteed to have n linearly independent eigenvectors 

  13. If a matrix is diagonalizable, it will have n distinct eigenvalues

    1. False, it could have a repeated eigenvalue. Example: the identity matrix

  14. All diagonalizable matrices are invertible

    1. False. For example, the zero matrix is diagonalizable (it is already diagonal) but not invertible.

  15. All invertible matrices are diagonalizable

    1. False. For example, a 2x2 shear matrix is invertible but not diagonalizable.

  16. If zero is an eigenvalue of a matrix, the matrix is not invertible

    1. True (invertible matrix theorem)

  17. If a matrix is similar (using the linear algebra definition of similar) to a diagonalizable matrix, then it is also diagonalizable

    1. True. Similarity is transitive, and a matrix is diagonalizable if it is similar to a diagonal matrix

  18. A 3x3 matrix which has 3 eigenvalues, and 2 linearly independent eigenvectors is diagonalizable 

    1. False, needs to have 3 linearly independent eigenvectors. In this question, the phrase “3 eigenvalues” accounts for the algebraic multiplicity of any eigenvalues.

  19. If A is a 3 × 3 matrix with characteristic polynomial −λ(λ − 5)^2, then the 5-eigenspace is 2-dimensional. 

    1. False. It could be one-dimensional: the geometric multiplicity of 5 can be 1 even though its algebraic multiplicity is 2.

  20. The characteristic polynomial of the zero matrix is 0.

    1. False. The nxn zero matrix has characteristic polynomial (-1)nλn.

  21. If A is a 4x4 matrix with characteristic polynomial λ4 + λ3 + λ2 + λ, then A is not invertible.

    1. True. We see λ = 0 is a root of the characteristic polynomial of A, so 0 is an eigenvalue of A, hence A is not invertible.

  22. Row operations on a matrix do not change its eigenvalues.

    1. False. Row equivalent matrices can have differing eigenvalues. 

  23. If the characteristic polynomial of a 2x2 matrix is λ2 - 5λ + 6, then the determinant is 6.

    1. True. For a 2x2 matrix, the characteristic polynomial is λ2 - (trace)λ + (determinant), so the constant term is always the determinant; here it is 6.
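
This can be verified with NumPy's np.poly, which returns characteristic polynomial coefficients (the matrix below is a hypothetical example chosen to have the polynomial from the question):

```python
import numpy as np

# Companion matrix of λ^2 - 5λ + 6; np.poly recovers the coefficients
# [1, -5, 6], and the constant term 6 matches det(A).
A = np.array([[5.0, -6.0], [1.0, 0.0]])
coeffs = np.poly(A)
assert np.allclose(coeffs, [1.0, -5.0, 6.0])
assert np.isclose(np.linalg.det(A), 6.0)
print(coeffs[-1])  # the constant term equals det(A)
```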

  24. It is possible for a real matrix to have an odd number of complex eigenvalues

    1. False. They come in conjugate pairs, so there will always be an even number.

  25. A is an nxn matrix. 

    1. If A is a positive stochastic matrix, then repeated multiplication by A pushes each vector toward the 1-eigenspace.

      1. True. 

    2. If A is a square matrix, then A and AT must have the same eigenvalues.

      1. True. They will have the same characteristic polynomial. 

    3. If A is a stochastic matrix, then 1 must be an eigenvalue of A.

      1. True. The columns of a stochastic matrix each sum to 1, so AT fixes the all-ones vector; since A and AT have the same eigenvalues, 1 is an eigenvalue of A.

    4. If A is a stochastic matrix, then its 1-eigenspace must be a line.

      1. False. For example, the identity matrix I2 is a stochastic matrix, but its 1-eigenspace is two-dimensional.

  26. If A is an invertible 2 × 2 matrix, then A is diagonalizable. 

    1. False. Counterexample: a shear matrix is invertible but not diagonalizable.

  27. A 3 × 3 matrix A can have a non-real complex eigenvalue with multiplicity 2. 

    1. False. That eigenvalue would need to come in a conjugate pair with another multiplicity-2 non-real, complex eigenvalue, and a 3x3 matrix cannot have 4 eigenvalues.

  28. A is a square matrix. A is invertible if and only if 0 is not an eigenvalue of A.

    1. True. Zero is an eigenvalue if and only if Ax = 0x has a nontrivial solution, if and only if Ax = 0 has a nontrivial solution. By the invertible matrix theorem, this is equivalent to the non-invertibility of A. 

  29. A system of equations with more variables than equations will always have more than 1 solution

    1. False, here's an example

      1. x=1

      2. x=0

      3. w+x+y+z = 0

  30. A homogeneous system of equations with more variables than equations will always have more than 1 solution

    1. True. Think about pivots: the matrix corresponding to this system has more columns than rows, so at least one column lacks a pivot. This guarantees at least one free variable, and a homogeneous system (which is always consistent) with a free variable has infinitely many solutions.

  31. The parametric equation of a plane will always involve 2 parameters

    1. True. A plane is two-dimensional, so its parametric description requires two parameters.

  32. It is possible for two row-equivalent matrices to have different solutions to Ax=0

    1. False, row equivalent matrices always have the same solution sets

  33. (5, 1, 3) is in the span of (1, 1, 1), (1, 6, -2), and (-1, 4, -4)

    1. False. Row reducing the augmented matrix with those three vectors as columns and (5, 1, 3) on the right produces an inconsistent row.
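
One way to check this by machine is to compare ranks (a NumPy sketch; the system is consistent exactly when the augmented matrix has the same rank as the coefficient matrix):

```python
import numpy as np

# Columns are the candidate spanning vectors; b is the target vector.
A = np.column_stack([[1, 1, 1], [1, 6, -2], [-1, 4, -4]]).astype(float)
b = np.array([5.0, 1.0, 3.0])
aug = np.column_stack([A, b])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(aug))  # prints "2 3"
# The ranks differ, so b is not in the span of the columns.
```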

  34. If x is in the span of the row vectors of A, then Ax=b has a solution

    1. False. Whether Ax = b has a solution depends on whether b is in the span of the columns of A; where x lies relative to the rows of A is irrelevant.

  35. If a matrix has a pivot position in every row, Ax=b will always have a solution

    1. True. A pivot in every row rules out a row of the form (0 ... 0 | nonzero) in the augmented matrix, so the system is consistent for every b.

  36. If v and w are solutions of Ax=b and b = (7, 5) then (v-w) is a solution to Ax=0

    1. True. Matrix multiplication is distributive: A(v - w) = Av - Aw = b - b = 0.

  37. The solution set of Ax=b and the span of the columns of A are the same thing

    1. False. The solution set is the set of vectors x satisfying the equation; the column span is the set of vectors b for which the equation has a solution.

  38. In a group of vectors, computing the dot product of any 2 gives the result 0. These vectors must be linearly independent. 

    1. False. The vectors are pairwise orthogonal, but the set could contain the zero vector, which is orthogonal to everything yet makes any set linearly dependent.

  39. It is possible for 2 eigenvectors with the same eigenvalue to be linearly independent 

    1. True. An eigenvalue which has an eigenspace that contains more than one vector will yield multiple linearly independent eigenvectors with the same eigenvalue.

  40. It is possible for 2 eigenvectors with different eigenvalues to be linearly dependent 

    1. False. Eigenvectors corresponding to distinct eigenvalues are linearly independent. 

  41. The vectors in a parametric vector form solution to Ax=0 can be linearly independent. 

    1. True. For example, if there are 2 free variable columns when A is row reduced, the null space will have two linearly independent vectors. 

  42. If the row vectors of A are linearly dependent, then Ax=b is always inconsistent.

    1. False. Not necessarily: if b = 0, for instance, the system is consistent no matter what the rows of A are.

  43. The set of all vectors in R3 whose entries are rational numbers is a subspace of R3.

    1. False. Fails the “closed under scalar multiplication” condition: multiplying the vector of all 1’s by an irrational scalar such as √2 produces a vector with irrational entries.

  44. The orthogonal complement of any subspace is also a subspace.

    1. True.

  45. The vectors (a, b, c) which satisfy a2 + b2 = c2 is a subspace. 

    1. False. Fails the “closed under addition” condition: (3, 4, 5) and (4, 3, 5) both satisfy the equation, but their sum (7, 7, 10) does not, since 49 + 49 ≠ 100.

  46. If a matrix is diagonalizable, and can be written in CDC-1 form, then there is more than one possible matrix C.

    1. True. C may have its column vectors listed in different orders, matching the order of the eigenvalues listed in D.
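
A NumPy sketch of this non-uniqueness (the matrix is an arbitrary diagonalizable example; swapping the eigenpair order gives a second valid C):

```python
import numpy as np

# Diagonalize A = C D C^-1, then swap the order of the eigenpairs:
# both choices of C reproduce A.
A = np.array([[4.0, 1.0], [2.0, 3.0]])  # eigenvalues 5 and 2
w, C = np.linalg.eig(A)
assert np.allclose(C @ np.diag(w) @ np.linalg.inv(C), A)

perm = [1, 0]  # reverse the eigenvalue/eigenvector ordering
C2 = C[:, perm]
assert np.allclose(C2 @ np.diag(w[perm]) @ np.linalg.inv(C2), A)
print("two different matrices C diagonalize A")
```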