Correct option is B
Concept:
When the load factor decreases from 100% to 50%, the fixed charges per kilowatt-hour (kWh) increase.
Load factor refers to the ratio of average power demand to the peak power demand over a specific period, usually a month or a year.
When the load factor decreases, the average power demand falls further below the peak power demand.
The utility must still build and maintain enough generation, transmission, and distribution capacity to serve that peak, even though this infrastructure now delivers less energy on average.
As a result, the fixed costs of maintaining the infrastructure are spread over fewer kWh of energy sold, leading to an increase in fixed charges per kWh.
In contrast, when the load factor is high (close to 100%), the infrastructure costs are spread over a larger number of kWh, resulting in lower fixed charges per kWh.
Therefore, decreasing the load factor from 100% to 50% would likely lead to an increase in fixed charges per kWh.
Load factor = Average load / Peak load
Its value is at most one, because the maximum (peak) demand is never lower than the average demand; in practice it is less than one, since a real load is never perfectly constant.
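As a quick numeric check, here is a minimal Python sketch of the load factor formula above; the 50 MW average and 100 MW peak are illustrative figures, not values from the question:

```python
def load_factor(average_load_kw: float, peak_load_kw: float) -> float:
    """Load factor = average load / peak load over the same period."""
    return average_load_kw / peak_load_kw

# Illustrative figures: a system averaging 50 MW against a 100 MW peak
print(load_factor(50_000, 100_000))  # 0.5, i.e. a 50% load factor
```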
Fixed Charges: Fixed charges are costs that do not change with the level of electricity consumption. These include costs associated with infrastructure, maintenance, and other fixed expenses.
When we distribute these fixed charges over the energy consumed (kWh), a higher load factor (indicating higher utilization) will spread the fixed costs over more kWh, thus reducing the fixed charges per kWh. Conversely, a lower load factor will spread the same fixed costs over fewer kWh, thus increasing the fixed charges per kWh.
Impact of Decreasing Load Factor: If the load factor decreases from 100% to 50%, the energy consumed over a given period decreases while the fixed charges remain the same. Therefore, the fixed charges per kWh will increase because they are spread over a smaller amount of energy consumption.
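The arithmetic behind this can be sketched in a few lines of Python. The peak load (100 MW) and annual fixed charges ($10 million) below are assumed purely for illustration and do not come from the question:

```python
HOURS_PER_YEAR = 8760

def fixed_charge_per_kwh(fixed_charges: float,
                         peak_load_kw: float,
                         load_factor: float) -> float:
    """Spread the period's fixed charges over the energy actually delivered.

    Energy delivered (kWh) = peak load * load factor * hours in the period.
    """
    energy_kwh = peak_load_kw * load_factor * HOURS_PER_YEAR
    return fixed_charges / energy_kwh

# Assumed figures: 100 MW peak, $10M/year in fixed charges
peak_kw = 100_000
annual_fixed = 10_000_000

at_100 = fixed_charge_per_kwh(annual_fixed, peak_kw, 1.0)  # ~$0.0114/kWh
at_50 = fixed_charge_per_kwh(annual_fixed, peak_kw, 0.5)   # ~$0.0228/kWh
print(f"LF 100%: {at_100:.4f} $/kWh, LF 50%: {at_50:.4f} $/kWh, "
      f"ratio: {at_50 / at_100:.1f}x")
```

Halving the load factor halves the energy delivered while the fixed charges stay constant, so the fixed charge per kWh exactly doubles, which matches the conclusion above.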