#### Question

Show that the equation 2(a^{2} + b^{2})x^{2} + 2(a + b)x + 1 = 0 has no real roots, when a ≠ b.

#### Solution

The quadratic equation is 2(a^{2} + b^{2})x^{2} + 2(a + b)x + 1 = 0

Comparing with the standard form Ax^{2} + Bx + C = 0 (using A, B, C to avoid clashing with the a and b in the equation), we have

A = 2(a^{2} + b^{2}), B = 2(a + b) and C = 1

As we know, the discriminant is D = B^{2} - 4AC

Putting A = 2(a^{2} + b^{2}), B = 2(a + b) and C = 1:

D = {2(a + b)}^{2} - 4 × 2(a^{2} + b^{2}) × 1

= 4(a^{2} + 2ab + b^{2}) - 8(a^{2} + b^{2})

= 4a^{2} + 8ab + 4b^{2} - 8a^{2} - 8b^{2}

= 8ab - 4a^{2} - 4b^{2}

= -4(a^{2} - 2ab + b^{2})

= -4(a - b)^{2}

We have a ≠ b, so a - b ≠ 0, and hence (a - b)^{2} > 0.

Thus, D = -4(a - b)^{2} < 0

Therefore, the roots of the given equation are not real.

Hence proved.
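The algebra above can be sanity-checked numerically. This is a minimal illustrative sketch; the sample (a, b) pairs are arbitrary choices, not part of the original solution:

```python
# For any a != b, the discriminant D = B^2 - 4AC of
# 2(a^2 + b^2)x^2 + 2(a + b)x + 1 = 0 should equal -4(a - b)^2 < 0.
for a, b in [(1, 2), (-3, 5), (0.5, -0.5), (7, -7)]:
    A = 2 * (a**2 + b**2)
    B = 2 * (a + b)
    C = 1
    D = B**2 - 4 * A * C
    assert D == -4 * (a - b) ** 2  # matches the algebraic simplification
    assert D < 0                   # negative discriminant: no real roots
```

Since D < 0 for every pair tried, the check agrees with the proof that the equation has no real roots whenever a ≠ b.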

