Question
If y = a cos mx + b sin mx, then show that `y_2 + "m"^2y = 0`.
Solution
y = a cos mx + b sin mx
y1 = a `"d"/"dx"` (cos mx) + b `"d"/"dx"` (sin mx)
`[ ∵ "d"/"dx" (sin "m"x) = cos "m"x "d"/"dx" ("m"x) = (cos "m"x) . "m"]`
= a(-sin mx) . m + b(cos mx) . m
= -am sin mx + bm cos mx
y2 = -am(cos mx) . m + bm(-sin mx) . m
= `-"am"^2 cos "m"x - "bm"^2 sin "m"x`
= `-"m"^2 [a cos "m"x + b sin "m"x]`
= `-"m"^2y`
∴ `y_2 + "m"^2y = 0`
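The identity above can be checked symbolically. Below is a minimal sketch using SymPy (the library choice and symbol names `a`, `b`, `m` are my own, not from the source): it forms y = a cos mx + b sin mx, takes the second derivative, and confirms that y₂ + m²y simplifies to zero.

```python
# Verify y'' + m^2 y = 0 for y = a cos(mx) + b sin(mx) with SymPy.
# (Illustrative sketch; symbol names are assumptions, not from the source.)
import sympy as sp

x, a, b, m = sp.symbols('x a b m')
y = a * sp.cos(m * x) + b * sp.sin(m * x)

y2 = sp.diff(y, x, 2)            # second derivative y_2
result = sp.simplify(y2 + m**2 * y)
print(result)                    # 0
```

Running this prints `0`, matching the hand derivation term by term.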
RELATED QUESTIONS
Differentiate the following with respect to x.
`5/x^4 - 2/x^3 + 5/x`
Differentiate the following with respect to x.
`x^3 e^x`
Differentiate the following with respect to x.
`(sqrtx + 1/sqrtx)^2`
Differentiate the following with respect to x.
`e^x/(1 + x)`
Find `"dy"/"dx"` for the following function
xy = tan(xy)
Differentiate the following with respect to x.
`x^(sin x)`
Differentiate the following with respect to x.
`(sin x)^(tan x)`
Find y2 for the following function:
y = `e^(3x + 2)`
If y = `(x + sqrt(1 + x^2))^m`, then show that `(1 + x^2)y_2 + xy_1 - "m"^2y = 0`
