If a matrix has 28 elements, what are the possible orders it can have? What if it has 13 elements?
28 × 1, 1 × 28, 4 × 7, 7 × 4, 14 × 2, 2 × 14. If a matrix has 13 elements, then its order will be either 13 × 1 or 1 × 13.
In the matrix
\( A = \begin{bmatrix} a & 1 & x \\ 2 & \sqrt{3} & x^2 - y \\ 0 & 5 & -\dfrac{2}{5} \end{bmatrix} \), write:
(i) the order of the matrix A, (ii) the number of elements, (iii) the elements \( a_{23}, a_{31}, a_{12} \).
(i) 3 × 3
(ii) 9
(iii) \( a_{23} = x^2 - y,\ a_{31} = 0,\ a_{12} = 1 \)
Construct a 2 × 2 matrix where (i) \( a_{ij} = \dfrac{(i - 2j)^2}{2} \) and (ii) \( a_{ij} = | -2i + 3j | \).
(i) \( \begin{bmatrix} \dfrac{1}{2} & \dfrac{9}{2} \\ 0 & 2 \end{bmatrix} \)
(ii) \( \begin{bmatrix} 1 & 4 \\ 1 & 2 \end{bmatrix} \)
Construct a 3 × 2 matrix whose elements are given by \( a_{ij} = e^{ix} \sin(jx) \).
\( \begin{bmatrix} e^x\sin x & e^x\sin 2x \\ e^{2x}\sin x & e^{2x}\sin 2x \\ e^{3x}\sin x & e^{3x}\sin 2x \end{bmatrix} \)
Find the values of \( a \) and \( b \) if matrices A and B are equal.
\( a = 2,\ b = 2 \)
If possible, find the sum of the matrices A and B.
Not possible
If \( X = \begin{bmatrix} 3 & 1 & -1 \\ 5 & -2 & -3 \end{bmatrix} \) and \( Y = \begin{bmatrix} 2 & 1 & -1 \\ 7 & 2 & 4 \end{bmatrix} \), find (i) \( X + Y \), (ii) \( 2X - 3Y \), (iii) a matrix \( Z \) such that \( X + Y + Z = 0 \).
(i) \( \begin{bmatrix} 5 & 2 & -2 \\ 12 & 0 & 1 \end{bmatrix} \)
(ii) \( \begin{bmatrix} 0 & -1 & 1 \\ -11 & -10 & -18 \end{bmatrix} \)
(iii) \( Z = \begin{bmatrix} -5 & -2 & 2 \\ -12 & 0 & -1 \end{bmatrix} \)
Find the non-zero value of \( x \) satisfying the matrix equation.
\( x = 4 \)
Show that \( (A + B)(A - B) \neq A^2 - B^2 \).
Find the value of \( x \) if
\( \begin{bmatrix} 1 & x & 1 \end{bmatrix} \begin{bmatrix} 1 & 3 & 2 \\ 2 & 5 & 1 \\ 15 & 3 & 2 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ x \end{bmatrix} = O \).
\( -2, -14 \)
Show that \( A = \begin{bmatrix} 5 & 3 \\ -1 & -2 \end{bmatrix} \) satisfies the equation \( A^2 - 3A - 7I = O \) and hence find \( A^{-1} \).
\( A^{-1} = -\dfrac{1}{7} \begin{bmatrix} -2 & -3 \\ 1 & 5 \end{bmatrix} \)
Find the matrix \( A \) satisfying the matrix equation:
\( \begin{bmatrix} 2 & 1 \\ 3 & 2 \end{bmatrix} A \begin{bmatrix} -3 & 2 \\ 5 & -3 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \).
\( A = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} \)
Find \( A \) if
\( \begin{bmatrix} 4 \\ 1 \\ 3 \end{bmatrix} A = \begin{bmatrix} -4 & 8 & 4 \\ -1 & 2 & 1 \\ -3 & 6 & 3 \end{bmatrix} \).
\( A = [-1\ 2\ 1] \)
If \( A = \begin{bmatrix} 3 & -4 \\ 1 & 1 \\ 2 & 0 \end{bmatrix} \) and \( B = \begin{bmatrix} 2 & 1 & 2 \\ 1 & 2 & 4 \end{bmatrix} \), verify that \( (BA)^2 \neq B^2 A^2 \).
If possible, find \( BA \) and \( AB \), where
\( A = \begin{bmatrix} 2 & 1 & 2 \\ 1 & 2 & 4 \end{bmatrix},\ B = \begin{bmatrix} 4 & 1 \\ 2 & 3 \\ 1 & 2 \end{bmatrix} \).
\( AB = \begin{bmatrix} 12 & 9 \\ 12 & 15 \end{bmatrix},\ BA = \begin{bmatrix} 9 & 6 & 12 \\ 7 & 8 & 16 \\ 4 & 5 & 10 \end{bmatrix} \)
Show by an example that for \( A \neq O \), \( B \neq O \), \( AB = O \).
Given \( A = \begin{bmatrix} 2 & 4 & 0 \\ 3 & 9 & 6 \end{bmatrix} \) and \( B = \begin{bmatrix} 1 & 4 \\ 2 & 8 \\ 1 & 3 \end{bmatrix} \), is \( (AB)' = B'A' \)?
Solve for \( x \) and \( y \):
\( x \begin{bmatrix} 2 \\ 1 \end{bmatrix} + y \begin{bmatrix} 3 \\ 5 \end{bmatrix} + \begin{bmatrix} -8 \\ -11 \end{bmatrix} = O \).
\( x = 1,\ y = 2 \)
If \( X \) and \( Y \) are \( 2 \times 2 \) matrices, solve the matrix equations:
\( 2X + 3Y = \begin{bmatrix} 2 & 3 \\ 4 & 0 \end{bmatrix},\ 3X + 2Y = \begin{bmatrix} -2 & 2 \\ 1 & -5 \end{bmatrix} \).
\( X = \begin{bmatrix} -2 & 0 \\ -1 & -3 \end{bmatrix},\ Y = \begin{bmatrix} 2 & 1 \\ 2 & 2 \end{bmatrix} \)
If \( A = [3\ 5] \) and \( B = [7\ 3] \), then find a non-zero matrix \( C \) such that \( AC = BC \).
\( \begin{bmatrix} k & k \\ 2k & 2k \end{bmatrix} \), where \( k \) is any non-zero real number
Give an example of matrices \( A, B, C \) such that \( AB = AC \), where \( A \) is non-zero, but \( B \neq C \).
If \( A = \begin{bmatrix} 1 & 2 \\ -2 & 1 \end{bmatrix}, B = \begin{bmatrix} 2 & 3 \\ 3 & -4 \end{bmatrix}, C = \begin{bmatrix} 1 & 0 \\ -1 & 0 \end{bmatrix} \), verify (i) \( (AB)C = A(BC) \), (ii) \( A(B + C) = AB + AC \).
If \( P = \begin{bmatrix} x & 0 & 0 \\ 0 & y & 0 \\ 0 & 0 & z \end{bmatrix} \) and \( Q = \begin{bmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{bmatrix} \), prove that \( PQ = QP \).
If \( \begin{bmatrix} 2 & 1 & 3 \end{bmatrix} \begin{bmatrix} -1 & 0 & -1 \\ -1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \\ -1 \end{bmatrix} = A \), find \( A \).
\( A = [-4] \)
If \( A = \begin{bmatrix} 2 & 1 \end{bmatrix}, B = \begin{bmatrix} 5 & 3 & 4 \\ 8 & 7 & 6 \end{bmatrix}, C = \begin{bmatrix} -1 & 2 & 1 \\ 1 & 0 & 2 \end{bmatrix} \), verify that \( A(B + C) = AB + AC \).
If \( A = \begin{bmatrix} 1 & 0 & -1 \\ 2 & 1 & 3 \\ 0 & 1 & 1 \end{bmatrix} \), then verify that \( A^2 + A = A(A + I) \), where \( I \) is the \( 3 \times 3 \) unit matrix.
If \( A = \begin{bmatrix} 0 & -1 & 2 \\ 4 & 3 & -4 \end{bmatrix} \) and \( B = \begin{bmatrix} 4 & 0 \\ 1 & 3 \\ 2 & 6 \end{bmatrix} \), verify that:
(i) \( (A')' = A \)
(ii) \( (AB)' = B'A' \)
(iii) \( (kA)' = kA' \)
If \( A = \begin{bmatrix} 1 & 2 \\ 4 & 1 \\ 5 & 6 \end{bmatrix} \) and \( B = \begin{bmatrix} 1 & 2 \\ 6 & 4 \\ 7 & 3 \end{bmatrix} \), then verify that:
(i) \( (2A + B)' = 2A' + B' \)
(ii) \( (A - B)' = A' - B' \)
Show that \( A'A \) and \( AA' \) are both symmetric matrices for any matrix \( A \).
Let \( A \) and \( B \) be square matrices of order \( 3 \times 3 \). Is \( (AB)' = A'B' \)? Give reasons.
Not necessarily; since \( (AB)' = B'A' \), the equality \( (AB)' = A'B' \) holds if and only if \( AB = BA \)
Show that if \( A \) and \( B \) are square matrices such that \( AB = BA \), then \( (A + B)^2 = A^2 + 2AB + B^2 \).
Let \( A = \begin{bmatrix} 1 & 2 \\ -1 & 3 \end{bmatrix} \), \( B = \begin{bmatrix} 4 & 0 \\ 1 & 5 \end{bmatrix} \), \( C = \begin{bmatrix} 2 & 0 \\ 1 & -2 \end{bmatrix} \) and \( a = 4, b = -2 \). Show that:
(a) \( A + (B + C) = (A + B) + C \)
(b) \( A(BC) = (AB)C \)
(c) \( (a + b)B = aB + bB \)
(d) \( a(C - A) = aC - aA \)
(e) \( (A')' = A \)
(f) \( (bA)' = bA' \)
(g) \( (AB)' = B'A' \)
(h) \( (A - B)C = AC - BC \)
(i) \( (A - B)' = A' - B' \)
If \( A = \begin{bmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{bmatrix} \), then show that \( A^2 = \begin{bmatrix} \cos 2\theta & \sin 2\theta \\ -\sin 2\theta & \cos 2\theta \end{bmatrix} \).
If \( A = \begin{bmatrix} 0 & -x \\ x & 0 \end{bmatrix} \), \( B = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \) and \( x^2 = -1 \), then show that \( (A + B)^2 = A^2 + B^2 \).
Verify that \( A^2 = I \) when \( A = \begin{bmatrix} 0 & 1 & -1 \\ 4 & -3 & 4 \\ 3 & -3 & 4 \end{bmatrix} \).
Prove by Mathematical Induction that \( (A^n)' = (A')^n \), where \( n \in \mathbb{N} \) for any square matrix \( A \).
Find inverse, by elementary row operations (if possible), of the following matrices:
(i) \( \begin{bmatrix} 1 & 3 \\ -5 & 7 \end{bmatrix} \)
(ii) \( \begin{bmatrix} 1 & -3 \\ -2 & 6 \end{bmatrix} \)
(i) \( \dfrac{1}{22} \begin{bmatrix} 7 & -3 \\ 5 & 1 \end{bmatrix} \)
(ii) not possible
If \( \begin{bmatrix} xy & 4 \\ z + 6 & x + y \end{bmatrix} = \begin{bmatrix} 8 & w \\ 0 & 6 \end{bmatrix} \), find values of \( x, y, z, w \).
\( x = 2,\ y = 4 \) (or \( x = 4,\ y = 2 \)), \( z = -6,\ w = 4 \)
If \( A = \begin{bmatrix} 1 & 5 \\ 7 & 12 \end{bmatrix} \) and \( B = \begin{bmatrix} 9 & 1 \\ 7 & 8 \end{bmatrix} \), find a matrix \( C \) such that \( 3A + 5B + 2C \) is a null matrix.
\( \begin{bmatrix} -24 & -10 \\ -28 & -38 \end{bmatrix} \)
If \( A = \begin{bmatrix} 3 & -5 \\ -4 & 2 \end{bmatrix} \), then find \( A^2 - 5A - 14I \). Hence, obtain \( A^3 \).
\( A^2 - 5A - 14I = O;\quad A^3 = \begin{bmatrix} 187 & -195 \\ -156 & 148 \end{bmatrix} \)
Find the values of \( a, b, c, d \) if
\( 3 \begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} a & 6 \\ -1 & 2d \end{bmatrix} + \begin{bmatrix} 4 & a + b \\ c + d & 3 \end{bmatrix} \).
\( a = 2,\ b = 4,\ c = 1,\ d = 3 \)
Find the matrix \( A \) such that
\( \begin{bmatrix} 2 & -1 \\ 1 & 0 \\ -3 & 4 \end{bmatrix} A = \begin{bmatrix} -1 & -8 & -10 \\ 1 & -2 & -5 \\ 9 & 22 & 15 \end{bmatrix} \).
\( \begin{bmatrix} 1 & -2 & -5 \\ 3 & 4 & 0 \end{bmatrix} \)
If \( A = \begin{bmatrix} 1 & 2 \\ 4 & 1 \end{bmatrix} \), find \( A^2 + 2A + 7I \).
\( \begin{bmatrix} 18 & 8 \\ 16 & 18 \end{bmatrix} \)
If \( A = \begin{bmatrix} \cos \alpha & \sin \alpha \\ -\sin \alpha & \cos \alpha \end{bmatrix} \) and \( A^{-1} = A' \), find value of \( \alpha \).
True for all real values of \( \alpha \)
If the matrix \( \begin{bmatrix} 0 & a & 3 \\ 2 & b & -1 \\ c & 1 & 0 \end{bmatrix} \) is a skew-symmetric matrix, find the values of \( a, b, c \).
\( a = -2,\ b = 0,\ c = -3 \)
If \( P(x) = \begin{bmatrix} \cos x & \sin x \\ -\sin x & \cos x \end{bmatrix} \), then show that
\( P(x) P(y) = P(x + y) = P(y) P(x) \).
If \( A \) is a square matrix such that \( A^2 = A \), show that \( (I + A)^3 = 7A + I \).
If \( A, B \) are square matrices of same order and \( B \) is skew-symmetric, show that \( A' B A \) is skew-symmetric.
If \( AB = BA \) for any two square matrices, prove by mathematical induction that \( (AB)^n = A^n B^n \).
Let \( A \) and \( B \) be square matrices of the same order such that \( AB = BA \). We prove by mathematical induction on positive integer \( n \) that \( (AB)^n = A^n B^n \).
Base case (n = 1): For \( n = 1 \), \( (AB)^1 = AB = A^1 B^1 \). Hence the result is true for \( n = 1 \).
Induction hypothesis: Assume that for some \( k \in \mathbb{N} \), the relation holds: \( (AB)^k = A^k B^k \).
Induction step (n = k + 1): Consider
\[ (AB)^{k+1} = (AB)^k (AB). \]
Using the induction hypothesis,
\[ (AB)^{k+1} = A^k B^k AB. \]
Since \( AB = BA \), matrix \( A \) commutes with \( B \). From this it follows that \( A \) also commutes with any power of \( B \): by a simple induction, \( AB^k = B^k A \) and \( A^k B = B A^k \). Hence
\[ A^k B^k A B = A^k A B^k B \]
because \( B^k A = A B^k \). Thus
\[ (AB)^{k+1} = A^{k+1} B^{k+1}. \]
This proves that if the statement holds for \( n = k \), it also holds for \( n = k + 1 \).
Conclusion: By the principle of mathematical induction, \( (AB)^n = A^n B^n \) for all positive integers \( n \), whenever \( AB = BA \).
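As an illustrative numerical check of this result (a sketch, not part of the proof), the identity can be tested for a particular commuting pair, for instance by taking \( B \) to be a polynomial in \( A \), which guarantees \( AB = BA \):

```python
import numpy as np

# Illustrative check of (AB)^n = A^n B^n for a commuting pair.
# B is chosen as a polynomial in A, so AB = BA automatically.
A = np.array([[1.0, 2.0],
              [-1.0, 3.0]])
B = 2 * np.linalg.matrix_power(A, 2) + 3 * A + np.eye(2)   # B = 2A^2 + 3A + I

assert np.allclose(A @ B, B @ A)   # the hypothesis AB = BA holds

for n in range(1, 6):
    lhs = np.linalg.matrix_power(A @ B, n)                              # (AB)^n
    rhs = np.linalg.matrix_power(A, n) @ np.linalg.matrix_power(B, n)   # A^n B^n
    assert np.allclose(lhs, rhs)

print("(AB)^n = A^n B^n holds for n = 1, ..., 5")
```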
Find \( x, y, z \) if
\[ A = \begin{bmatrix} 0 & 2y & z \\ x & y & -z \\ x & -y & z \end{bmatrix} \]
satisfies \( A' = A^{-1} \), where \( A' \) is the transpose of \( A \).
The condition \( A' = A^{-1} \) means that \( A \) is an orthogonal matrix, i.e. \( AA' = I = A'A \). Hence the column vectors of \( A \) form an orthonormal set.
The columns of \( A \) are
\[ C_1 = \begin{bmatrix} 0 \\ x \\ x \end{bmatrix}, \quad C_2 = \begin{bmatrix} 2y \\ y \\ -y \end{bmatrix}, \quad C_3 = \begin{bmatrix} z \\ -z \\ z \end{bmatrix}. \]
Unit length conditions:
\( C_1 \cdot C_1 = 0^2 + x^2 + x^2 = 2x^2 = 1 \Rightarrow x^2 = \dfrac{1}{2} \Rightarrow x = \pm \dfrac{1}{\sqrt{2}}. \)
\( C_2 \cdot C_2 = (2y)^2 + y^2 + (-y)^2 = 6y^2 = 1 \Rightarrow y^2 = \dfrac{1}{6} \Rightarrow y = \pm \dfrac{1}{\sqrt{6}}. \)
\( C_3 \cdot C_3 = z^2 + (-z)^2 + z^2 = 3z^2 = 1 \Rightarrow z^2 = \dfrac{1}{3} \Rightarrow z = \pm \dfrac{1}{\sqrt{3}}. \)
Orthogonality conditions: One may verify that for these values, \( C_1 \cdot C_2 = C_1 \cdot C_3 = C_2 \cdot C_3 = 0 \), so the columns are mutually perpendicular and of unit length, hence \( A' = A^{-1} \).
Therefore, the required values are
\[ x = \pm \dfrac{1}{\sqrt{2}}, \quad y = \pm \dfrac{1}{\sqrt{6}}, \quad z = \pm \dfrac{1}{\sqrt{3}}. \]
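A quick numerical check (a small Python/NumPy sketch, taking the positive square roots for definiteness) confirms that these values make \( A \) orthogonal:

```python
import numpy as np

# Verify that A with x = 1/sqrt(2), y = 1/sqrt(6), z = 1/sqrt(3) satisfies A'A = AA' = I.
x, y, z = 1 / np.sqrt(2), 1 / np.sqrt(6), 1 / np.sqrt(3)
A = np.array([[0,  2 * y,  z],
              [x,      y, -z],
              [x,     -y,  z]])

print(np.allclose(A.T @ A, np.eye(3)))   # True: the columns are orthonormal
print(np.allclose(A @ A.T, np.eye(3)))   # True: hence A' = A^(-1)
```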
If possible, using elementary row transformations, find the inverse of the following matrices:
(i) \( A_1 = \begin{bmatrix} 2 & -1 & 3 \\ -5 & 3 & 1 \\ -3 & 2 & 3 \end{bmatrix} \)
(ii) \( A_2 = \begin{bmatrix} 2 & 3 & -3 \\ -1 & -2 & 2 \\ 1 & 1 & -1 \end{bmatrix} \)
(iii) \( A_3 = \begin{bmatrix} 2 & 0 & -1 \\ 5 & 1 & 0 \\ 0 & 1 & 3 \end{bmatrix} \).
To find the inverse of a matrix by row transformations, form the augmented matrix \( [A \mid I] \) and perform elementary row operations until the left side becomes \( I \). The right side then gives \( A^{-1} \).
(i) Inverse of \( A_1 \)
Start with
\[ [A_1 \mid I] = \left[ \begin{array}{ccc|ccc} 2 & -1 & 3 & 1 & 0 & 0 \\ -5 & 3 & 1 & 0 & 1 & 0 \\ -3 & 2 & 3 & 0 & 0 & 1 \end{array} \right]. \]
By a sequence of row operations (such as \( R_2 \leftarrow 2R_2 + 5R_1 \), \( R_3 \leftarrow 2R_3 + 3R_1 \), and further simplifications), the left block can be reduced to the identity matrix. After complete reduction we obtain
\[ [I \mid A_1^{-1}] = \left[ \begin{array}{ccc|ccc} 1 & 0 & 0 & -7 & -9 & 10 \\ 0 & 1 & 0 & -12 & -15 & 17 \\ 0 & 0 & 1 & 1 & 1 & -1 \end{array} \right]. \]
Thus
\[ A_1^{-1} = \begin{bmatrix} -7 & -9 & 10 \\ -12 & -15 & 17 \\ 1 & 1 & -1 \end{bmatrix}. \]
(ii) Inverse of \( A_2 \)
Form \( [A_2 \mid I] \) and perform similar row operations. In the process the left block reduces to a matrix with a zero row, showing that \( \det(A_2) = 0 \). Hence \( A_2 \) is singular and its inverse does not exist.
(iii) Inverse of \( A_3 \)
Start with
\[ [A_3 \mid I] = \left[ \begin{array}{ccc|ccc} 2 & 0 & -1 & 1 & 0 & 0 \\ 5 & 1 & 0 & 0 & 1 & 0 \\ 0 & 1 & 3 & 0 & 0 & 1 \end{array} \right]. \]
Applying suitable row operations (for example, make the first entry in the first column 1 by \( R_1 \leftarrow \tfrac{1}{2} R_1 \), eliminate the other entries in the first column, and proceed similarly for other columns), we eventually obtain
\[ [I \mid A_3^{-1}] = \left[ \begin{array}{ccc|ccc} 1 & 0 & 0 & 3 & -1 & 1 \\ 0 & 1 & 0 & -15 & 6 & -5 \\ 0 & 0 & 1 & 5 & -2 & 2 \end{array} \right]. \]
Thus
\[ A_3^{-1} = \begin{bmatrix} 3 & -1 & 1 \\ -15 & 6 & -5 \\ 5 & -2 & 2 \end{bmatrix}. \]
Therefore:
\( A_1^{-1} = \begin{bmatrix} -7 & -9 & 10 \\ -12 & -15 & 17 \\ 1 & 1 & -1 \end{bmatrix}, \quad A_2^{-1} \) does not exist, and \( A_3^{-1} = \begin{bmatrix} 3 & -1 & 1 \\ -15 & 6 & -5 \\ 5 & -2 & 2 \end{bmatrix}. \)
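The row-reduction procedure described above can also be carried out mechanically; the following Python/NumPy sketch (an illustration, not part of the textbook solution) applies Gauss-Jordan elimination to the augmented matrix \( [A \mid I] \) and reproduces the three results, including the singularity of \( A_2 \):

```python
import numpy as np

def inverse_by_row_ops(A, tol=1e-12):
    """Gauss-Jordan elimination on the augmented matrix [A | I].

    Returns A^(-1) if the left block reduces to I, or None if a zero
    pivot column signals that A is singular (no inverse exists).
    """
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])        # augmented matrix [A | I]
    for col in range(n):
        # choose the pivot row with the largest entry in this column
        pivot = max(range(col, n), key=lambda r: abs(M[r, col]))
        if abs(M[pivot, col]) < tol:
            return None                                 # zero column: no inverse
        M[[col, pivot]] = M[[pivot, col]]               # row interchange
        M[col] /= M[col, col]                           # scale pivot row to 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]              # eliminate other entries
    return M[:, n:]                                     # right block is A^(-1)

A1 = np.array([[2, -1, 3], [-5, 3, 1], [-3, 2, 3]])
A2 = np.array([[2, 3, -3], [-1, -2, 2], [1, 1, -1]])
A3 = np.array([[2, 0, -1], [5, 1, 0], [0, 1, 3]])

for name, A in [("A1", A1), ("A2", A2), ("A3", A3)]:
    inv = inverse_by_row_ops(A)
    print(name, "->", "singular" if inv is None else np.round(inv, 6))
```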
Express the matrix
\[ A = \begin{bmatrix} 2 & 3 & 1 \\ 1 & -1 & 2 \\ 4 & 1 & 2 \end{bmatrix} \]
as the sum of a symmetric and a skew-symmetric matrix.
Any square matrix \( A \) can be expressed uniquely as the sum of a symmetric matrix \( S \) and a skew-symmetric matrix \( K \), given by the formulae
\[ S = \dfrac{1}{2}(A + A'), \qquad K = \dfrac{1}{2}(A - A'). \]
Here
\[ A = \begin{bmatrix} 2 & 3 & 1 \\ 1 & -1 & 2 \\ 4 & 1 & 2 \end{bmatrix}, \quad A' = \begin{bmatrix} 2 & 1 & 4 \\ 3 & -1 & 1 \\ 1 & 2 & 2 \end{bmatrix}. \]
Compute
\[ A + A' = \begin{bmatrix} 4 & 4 & 5 \\ 4 & -2 & 3 \\ 5 & 3 & 4 \end{bmatrix}, \quad S = \dfrac{1}{2}(A + A') = \begin{bmatrix} 2 & 2 & \tfrac{5}{2} \\ 2 & -1 & \tfrac{3}{2} \\ \tfrac{5}{2} & \tfrac{3}{2} & 2 \end{bmatrix}. \]
Similarly,
\[ A - A' = \begin{bmatrix} 0 & 2 & -3 \\ -2 & 0 & 1 \\ 3 & -1 & 0 \end{bmatrix}, \quad K = \dfrac{1}{2}(A - A') = \begin{bmatrix} 0 & 1 & -\tfrac{3}{2} \\ -1 & 0 & \tfrac{1}{2} \\ \tfrac{3}{2} & -\tfrac{1}{2} & 0 \end{bmatrix}. \]
Matrix \( S \) is symmetric since \( S' = S \), and matrix \( K \) is skew-symmetric since \( K' = -K \).
Finally,
\[ S + K = \begin{bmatrix} 2 & 2 & \tfrac{5}{2} \\ 2 & -1 & \tfrac{3}{2} \\ \tfrac{5}{2} & \tfrac{3}{2} & 2 \end{bmatrix} + \begin{bmatrix} 0 & 1 & -\tfrac{3}{2} \\ -1 & 0 & \tfrac{1}{2} \\ \tfrac{3}{2} & -\tfrac{1}{2} & 0 \end{bmatrix} = \begin{bmatrix} 2 & 3 & 1 \\ 1 & -1 & 2 \\ 4 & 1 & 2 \end{bmatrix} = A. \]
Thus \( A \) has been expressed as the sum of a symmetric matrix and a skew-symmetric matrix:
\[ A = \underbrace{\begin{bmatrix} 2 & 2 & \tfrac{5}{2} \\ 2 & -1 & \tfrac{3}{2} \\ \tfrac{5}{2} & \tfrac{3}{2} & 2 \end{bmatrix}}_{\text{symmetric}} + \underbrace{\begin{bmatrix} 0 & 1 & -\tfrac{3}{2} \\ -1 & 0 & \tfrac{1}{2} \\ \tfrac{3}{2} & -\tfrac{1}{2} & 0 \end{bmatrix}}_{\text{skew-symmetric}}. \]
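As an illustrative check (a small Python/NumPy sketch, not part of the solution), the formulae \( S = \tfrac{1}{2}(A + A') \) and \( K = \tfrac{1}{2}(A - A') \) can be verified numerically:

```python
import numpy as np

# Numerical check of the decomposition A = S + K with
# S = (A + A')/2 symmetric and K = (A - A')/2 skew-symmetric.
A = np.array([[2, 3, 1],
              [1, -1, 2],
              [4, 1, 2]], dtype=float)

S = (A + A.T) / 2        # symmetric part
K = (A - A.T) / 2        # skew-symmetric part

print(np.allclose(S, S.T))      # True: S is symmetric
print(np.allclose(K, -K.T))     # True: K is skew-symmetric
print(np.allclose(S + K, A))    # True: S + K recovers A
```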
The matrix \( P = \begin{bmatrix} 0 & 0 & 4 \\ 0 & 4 & 0 \\ 4 & 0 & 0 \end{bmatrix} \) is a
square matrix
diagonal matrix
unit matrix
none
Total number of possible matrices of order \( 3 \times 3 \) with each entry 2 or 0 is
9
27
81
512
If \( \begin{bmatrix} 2x + y & 4x \\ 5x - 7 & 4x \end{bmatrix} = \begin{bmatrix} 7 & 7y - 13 \\ y & x + 6 \end{bmatrix} \), then the value of \( x + y \) is
x = 3, y = 1
x = 2, y = 3
x = 2, y = 4
x = 3, y = 3
If
\( A = \dfrac{1}{\pi} \begin{bmatrix} \sin^{-1}(x \pi) & \tan^{-1}\left(\dfrac{x}{\pi}\right) \\ \sin^{-1}\left(\dfrac{x}{\pi}\right) & \cot^{-1}(x \pi) \end{bmatrix}, \quad B = \dfrac{1}{\pi} \begin{bmatrix} -\cos^{-1}(x \pi) & \tan^{-1}\left(\dfrac{x}{\pi}\right) \\ \sin^{-1}\left(\dfrac{x}{\pi}\right) & -\tan^{-1}(x \pi) \end{bmatrix} \)
then \( A - B \) is equal to
I
O
2I
\( \dfrac{1}{2} I \)
If A and B are two matrices of the order \( 3 \times m \) and \( 3 \times n \), respectively, and \( m = n \), then the order of matrix \( 5A - 2B \) is
m × 3
3 × 3
m × n
3 × n
If \( A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \), then \( A^2 \) is equal to
\( \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \)
\( \begin{bmatrix} 1 & 0 \\ 1 & 0 \end{bmatrix} \)
\( \begin{bmatrix} 0 & 1 \\ 0 & 1 \end{bmatrix} \)
\( \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \)
If matrix \( A = [a_{ij}]_{2 \times 2} \), where \( a_{ij} = 1 \) if \( i \neq j \) and \( a_{ij} = 0 \) if \( i = j \), then \( A^2 \) is equal to
I
A
0
None of these
The matrix \( \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 4 \end{bmatrix} \) is a
identity matrix
symmetric matrix
skew symmetric matrix
none of these
The matrix \( \begin{bmatrix} 0 & -5 & 8 \\ 5 & 0 & 12 \\ -8 & -12 & 0 \end{bmatrix} \) is a
diagonal matrix
symmetric matrix
skew symmetric matrix
scalar matrix
If A is a matrix of order m × n and B is a matrix such that AB' and B'A are both defined, then order of matrix B is
m × m
n × n
n × m
m × n
If A and B are matrices of same order, then (AB' − BA') is a
skew symmetric matrix
null matrix
symmetric matrix
unit matrix
If A is a square matrix such that \( A^2 = I \), then \( (A - I)^3 + (A + I)^3 - 7A \) is equal to
A
I − A
I + A
3A
For any two matrices A and B, we have
AB = BA
AB ≠ BA
AB = O
None of the above
On using elementary column operations \( C_2 \rightarrow C_2 - 2C_1 \) in the following matrix equation
\( \begin{bmatrix} 1 & -3 \\ 2 & 4 \end{bmatrix} = \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 3 & 1 \\ 2 & 4 \end{bmatrix} \), we have:
\( \begin{bmatrix} 1 & -5 \\ 0 & 4 \end{bmatrix} = \begin{bmatrix} 1 & -1 \\ -2 & 2 \end{bmatrix} \begin{bmatrix} 3 & -5 \\ 2 & 0 \end{bmatrix} \)
\( \begin{bmatrix} 1 & -5 \\ 0 & 4 \end{bmatrix} = \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 3 & -5 \\ -0 & 2 \end{bmatrix} \)
\( \begin{bmatrix} 1 & -5 \\ 2 & 0 \end{bmatrix} = \begin{bmatrix} 1 & -3 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 3 & 1 \\ -2 & 4 \end{bmatrix} \)
\( \begin{bmatrix} 1 & -5 \\ 2 & 0 \end{bmatrix} = \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 3 & -5 \\ 2 & 0 \end{bmatrix} \)
On using elementary row operation \( R_1 \rightarrow R_1 - 3R_2 \) in the following matrix equation:
\( \begin{bmatrix} 4 & 2 \\ 3 & 3 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 1 & 1 \end{bmatrix} \), we have:
\( \begin{bmatrix} -5 & -7 \\ 3 & 3 \end{bmatrix} = \begin{bmatrix} 1 & -7 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 1 & 1 \end{bmatrix} \)
\( \begin{bmatrix} -5 & -7 \\ 3 & 3 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} -1 & -3 \\ 1 & 1 \end{bmatrix} \)
\( \begin{bmatrix} -5 & -7 \\ 3 & 3 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 1 & -7 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 1 & 1 \end{bmatrix} \)
\( \begin{bmatrix} 4 & 2 \\ -5 & -7 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ -3 & -3 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 1 & 1 \end{bmatrix} \)
_____ matrix is both symmetric and skew symmetric matrix.
Null matrix
Sum of two skew symmetric matrices is always _____ matrix.
Skew symmetric matrix
The negative of a matrix is obtained by multiplying it by _____.
-1
The product of any matrix by the scalar _____ is the null matrix.
0
A matrix which is not a square matrix is called a _____ matrix.
Rectangular matrix
Matrix multiplication is _____ over addition.
Distributive
If A is a symmetric matrix, then A\(^3\) is a _____ matrix.
Symmetric matrix
If A is a skew symmetric matrix, then A\(^2\) is a ____.
Symmetric matrix
If A is skew symmetric, then kA is a _____ (k is any scalar).
Skew symmetric matrix
If A is symmetric matrix, then B'AB is ____.
Symmetric matrix
If A and B are symmetric matrices of same order, then AB is symmetric if and only if ____.
AB = BA
In applying one or more row operations while finding A\(^{-1}\) by elementary row operations, if we obtain all zeros in one or more rows, then A\(^{-1}\) ____.
does not exist
If A and B are square matrices of the same order, then (i) (AB)' = ____.
B' A'
(ii) (kA)' = ____ (k is any scalar)
kA'
(iii) [k(A − B)]' = ____.
k(A' − B')
If A and B are symmetric matrices, then (i) AB − BA is a ____.
Skew symmetric matrix
(ii) BA − 2AB is a ____.
neither symmetric nor skew symmetric matrix
A matrix denotes a number.
False
Matrices of any order can be added.
False
Two matrices are equal if they have same number of rows and same number of columns.
False
Matrices of different order can not be subtracted.
True
Matrix addition is associative as well as commutative.
True
Matrix multiplication is commutative.
False
A square matrix where every element is unity is called an identity matrix.
False
If A and B are two square matrices of the same order, then A + B = B + A.
True
If A and B are two matrices of the same order, then A − B = B − A.
False
If matrix AB = O, then A = O or B = O or both A and B are null matrices.
False
Transpose of a column matrix is a column matrix.
False
If A and B are two square matrices of the same order, then AB = BA.
False
If each of the three matrices of the same order are symmetric, then their sum is a symmetric matrix.
True
If A and B are any two matrices of the same order, then (AB)' = A'B'.
False
If (AB)' = B'A', where A and B are not square matrices, then number of rows in A is equal to number of columns in B and number of columns in A is equal to number of rows in B.
True
If A, B and C are square matrices of same order, then AB = AC always implies that B = C.
False
AA' is always a symmetric matrix for any matrix A.
True
If A = \( \begin{bmatrix} 2 & 3 & -1 \\ 1 & 4 & 2 \end{bmatrix} \) and B = \( \begin{bmatrix} 2 & 3 \\ 4 & 5 \\ 2 & 1 \end{bmatrix} \), then AB and BA are defined and equal.
False
If A is skew symmetric matrix, then A\(^2\) is a symmetric matrix.
True
\( (AB)^{-1} = A^{-1} B^{-1} \), where A and B are invertible matrices satisfying the commutative property with respect to multiplication.
True