Question
If the roots of the equation (c² – ab)x² – 2(a² – bc)x + (b² – ac) = 0 are real and equal, show that either a = 0 or a³ + b³ + c³ = 3abc.
[Hint: D = 4a(a³ + b³ + c³ – 3abc). So, D = 0 ⇒ a = 0 or a³ + b³ + c³ = 3abc.]
Answer
Given: The quadratic (c² – ab)x² – 2(a² – bc)x + (b² – ac) = 0 has real and equal roots.
Step-wise calculation:
1. For a quadratic Ax² + Bx + C = 0, equal roots ⇒ discriminant D = B² – 4AC = 0.
2. Here A = c² – ab
B = –2(a² – bc)
C = b² – ac
3. Compute B²:
B² = 4(a² – bc)²
= 4(a⁴ + b²c² – 2a²bc)
4. Compute 4AC:
4AC = 4(c² – ab)(b² – ac)
= 4(b²c² – ac³ – ab³ + a²bc)
5. Subtract: D = B² – 4AC
= 4[a⁴ + b²c² – 2a²bc – (b²c² – ac³ – ab³ + a²bc)]
= 4[a⁴ + ac³ + ab³ – 3a²bc]
= 4a(a³ + b³ + c³ – 3abc)
Thus D = 4a(a³ + b³ + c³ – 3abc).
6. Since the roots are equal, D = 0
⇒ 4a(a³ + b³ + c³ – 3abc) = 0
Hence either a = 0 or a³ + b³ + c³ = 3abc, as required (a quick symbolic check of the identity from step 5 is sketched below).
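As a sanity check, the discriminant identity derived in step 5 can be verified symbolically. The following is a minimal sketch using the SymPy library; the symbol names and the check itself are illustrative additions, not part of the original solution.

# Minimal sketch: verify D = 4a(a³ + b³ + c³ – 3abc) with SymPy.
# The symbols a, b, c mirror the solution; this check is an illustrative
# addition, not part of the original answer.
from sympy import symbols, expand, factor

a, b, c = symbols('a b c')

A = c**2 - a*b          # coefficient of x²
B = -2*(a**2 - b*c)     # coefficient of x
C = b**2 - a*c          # constant term

D = expand(B**2 - 4*A*C)                                 # discriminant
assert D == expand(4*a*(a**3 + b**3 + c**3 - 3*a*b*c))   # matches step 5

print(factor(D))
# -> 4*a*(a + b + c)*(a**2 - a*b - a*c + b**2 - b*c + c**2)
# SymPy factors one step further, via the classical identity
# a³ + b³ + c³ – 3abc = (a + b + c)(a² + b² + c² – ab – bc – ca).

The fuller factorization printed above also explains when the second alternative occurs: a³ + b³ + c³ = 3abc holds exactly when a + b + c = 0 or a = b = c.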
