An immersion heater rated 1000 W, 220 V is used to heat 0.01 m^{3} of water. Assuming that the power is supplied at 220 V and 60% of the power supplied is used to heat the water, how long will it take to increase the temperature of the water from 15°C to 40°C?

#### Solution

Given the operating voltage V and power consumed P, the resistance of the immersion heater,

\[R = \frac{V^2}{P} = \frac{(220)^2}{1000} = 48.4\ \Omega\]

Mass of water, \[m = \rho V = 1000 \times 0.01 = 10\text{ kg}\]

Specific heat of water, s = 4200 J kg^{-1} K^{-1}

Rise in temperature, θ = 40°C − 15°C = 25°C

Heat required to raise the temperature of the given mass of water,

Q = msθ = 10 × 4200 × 25 = 1050000 J

Let t be the time taken to increase the temperature of the water. Only 60% of the electrical power supplied goes into heating the water, so

\[\left( \frac{V^2}{R} \right) \times t \times \frac{60}{100} = 1050000\text{ J}\]

\[\Rightarrow \frac{(220)^2}{48.4} \times t \times \frac{60}{100} = 1050000\]

⇒ t = 1750 s ≈ 29.17 minutes
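The arithmetic above can be checked with a short script; the variable names below are just labels for the quantities given in the problem statement:

```python
# Verify the worked solution using the values from the problem statement.
P = 1000.0        # heater power rating, W
V = 220.0         # supply voltage, V
volume = 0.01     # volume of water, m^3
rho = 1000.0      # density of water, kg/m^3
s = 4200.0        # specific heat of water, J kg^-1 K^-1
dT = 40.0 - 15.0  # temperature rise, K
eff = 0.60        # fraction of supplied power that heats the water

R = V**2 / P                 # heater resistance, ohms (48.4)
m = volume * rho             # mass of water, kg (10)
Q = m * s * dT               # heat required, J (1,050,000)
t = Q / (eff * V**2 / R)     # time, s

print(round(R, 1))           # 48.4
print(round(t))              # 1750 (seconds)
print(round(t / 60, 2))      # 29.17 (minutes)
```

Since the useful power is 0.60 × 1000 W = 600 W, the time is simply 1,050,000 J ÷ 600 W = 1750 s, matching the result above.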