Submitted by jjb94941 on Sat, 2013-10-05 10:41
Imagine the following two scenarios:
1. Accelerate HARD to 60 mph, drive for a short while, then decelerate to a stop, for an average speed of (say) 40 mph.
2. Accelerate very gently to slightly above 60 mph such that the average speed for the entire run is also 40 mph, then decelerate to a stop as in scenario 1.
So, apart from the (slight?) difference due to additional tire distortion in 1 vs. 2, the total energy consumed should be the same, right? But I'd bet that the car's energy consumption in 1 is likely to be greater than in 2. If I'm right, where did the energy go?
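To make the puzzle concrete: the kinetic energy the car ends up with depends only on its mass and speed, not on how quickly that speed was reached. A quick sketch (with an assumed 1500 kg car and an assumed 62 mph peak for scenario 2) shows the peak kinetic energies are nearly identical, so any large gap in fuel burned would have to come from how efficiently the engine delivered that energy, plus losses like drag and tire distortion:

```python
# Illustrative check with assumed numbers: kinetic energy is 1/2 m v^2,
# so it depends only on speed, not on the rate of acceleration.
MPH_TO_MS = 0.44704  # metres per second per mph

def kinetic_energy_j(mass_kg, speed_mph):
    """Kinetic energy in joules for a given mass and speed in mph."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v * v

mass = 1500.0  # kg -- a typical mid-size car (assumed, not from the question)

hard = kinetic_energy_j(mass, 60.0)    # scenario 1: hard launch to 60 mph
gentle = kinetic_energy_j(mass, 62.0)  # scenario 2: gentle climb slightly above 60

print(f"Scenario 1 peak KE: {hard / 1000:.0f} kJ")
print(f"Scenario 2 peak KE: {gentle / 1000:.0f} kJ")
# The two values differ by only a few percent, so the physics of the
# speed profile alone cannot explain a big difference in fuel used.
```

This is only a back-of-the-envelope sketch; the mass and the 62 mph figure are placeholders, and it deliberately ignores drag, rolling resistance, and engine efficiency, which is exactly where the answer to the question is likely to lie.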