Quote:
Originally Posted by serialk11r
Perhaps BMEP would be the best way to see how the losses compare...
Throttling losses are less than 100% of the work it takes to create the vacuum, since the vacuum also helps pull the piston back up a little during compression. At idle I believe a typical figure for manifold pressure is a bit over 0.3 bar (absolute). Let's say 1/10 of that work is recovered during the compression stroke; that puts the throttling loss at about 60 kPa at most. At idle the engine burns off the rest of the energy through overexpansion, I believe, but I'm not sure about this.
I can't find very good figures for friction, but for an F1 engine spinning at 18,000 rpm, the FMEP was about 1/10 of the BMEP, which is on the order of 1.3 MPa (typical good naturally aspirated engines produce a bit over 100 N·m/L of specific torque). And I imagine that even with their super long rods, advanced coatings, and fewer accessories, 18k rpm is not going to be friendly to friction at all. I saw some chart that indicated around 50 kPa FMEP at idle, though, so I'd say 50-150 kPa seems like a pretty good range for specific friction.
So pumping losses are at worst around the same as friction losses.
If you run your engine at, say, 20% load, you're producing perhaps 250 kPa of useful BMEP. The air theoretically needed would be 20% of peak, but you obviously need quite a bit more than that, because you have to overcome friction and pumping, and the lower effective compression sends more heat to waste. So say you need to throttle the engine down to 0.4 bar manifold pressure. Then you're wasting about 50 kPa on pumping and about 100 kPa on friction while producing 250 kPa of useful BMEP. That is, 3/8 of the gross indicated work (150 of 400 kPa) is wasted. And the useful work was produced at a pathetic effective compression, where the ideal efficiency is significantly lower.
On high-compression engines this is something like going from 12:1 to 5:1, which in the ideal case (Otto efficiency is 1 - r^(1-gamma)) is a loss of about 25% of the efficiency. But on your run-of-the-mill 10:1 compression ratio engine, the relative difference is bigger. I believe this is why manufacturers keep trying to bump the compression ratio up despite only very small gains at full power.
So the two biggest culprits are reduced compression and friction, with throttling a close third. Diesels have more respectable efficiency at low loads since they have no throttling losses and maintain good charge compression. At extremely low loads (such as idle), though, they still do horribly, since overcoming friction is significant.
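Before replying, a quick numerical check of the figures quoted above. This is a rough Python sketch; the 0.3 bar idle MAP, the 1/10 recovery fraction, the 20% load split, and gamma = 1.4 are the assumptions from the quote, not measurements:

import math

ATM = 101.3  # ambient pressure, kPa

# Pumping loss at idle: work to hold ~0.3 bar absolute in the manifold,
# with ~1/10 recovered as the vacuum helps the piston back up.
map_idle = 30.0                      # kPa absolute
pmep = (ATM - map_idle) * (1 - 0.10)
print(f"idle pumping loss ~ {pmep:.0f} kPa")          # ~64 kPa, i.e. 'about 60 kPa'

# BMEP from specific torque (4-stroke: BMEP = 4*pi*T/V).
bmep = 4 * math.pi * 100.0 / 1e-3 / 1e3               # 100 N*m per litre -> kPa
print(f"peak BMEP ~ {bmep:.0f} kPa")                  # ~1257 kPa, 'on the order of 1.3 MPa'

# 20% load bookkeeping: 250 kPa useful + 100 friction + 50 pumping.
useful, friction, pumping = 250.0, 100.0, 50.0        # kPa
gross = useful + friction + pumping
print(f"wasted fraction = {(friction + pumping) / gross}")  # 0.375, i.e. 3/8 of gross

# Ideal Otto efficiency, eta = 1 - r**(1 - gamma), with gamma = 1.4.
def eta(r):
    return 1 - r ** (1 - 1.4)

for r in (5, 10, 12):
    print(f"r = {r:2d}: eta = {eta(r):.3f}")
print(f"relative loss, 12:1 -> 5:1: {1 - eta(5) / eta(12):.1%}")  # ~25%

The numbers come out consistent with the quote: about 64 kPa of idle pumping loss, about 1.26 MPa of peak BMEP, 3/8 of gross work wasted in the 20% load example, and roughly a 25% relative efficiency drop from 12:1 to 5:1.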
I have been thinking about the first principles behind cycle efficiency for a while in another context: compression ignition vs. spark ignition. Carnot efficiency is related to the ratio of the temperature at which heat is accepted to the temperature at which it is rejected. The heat added varies with the amount of fuel in the cylinder when it fires, which depends on the throttle setting; the temperature rise depends on this heat and on the thermal capacity of the charge, which also varies with throttle setting. Since both the heat added and the charge's thermal capacity scale with the mass trapped in the cylinder, the temperature rise is constant with varying throttle, all else being equal! In a diesel, the cylinder still has a full charge at low load but less fuel, so the temperature rise is smaller. I think if it were not for this, the difference in part-load efficiency between a diesel and a spark ignition engine would be greater. Anyway, doesn't this mean that efficiency is not actually lost because of a reduced effective compression ratio?
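To make that cancellation concrete, here is a minimal sketch in Python, assuming a stoichiometric gasoline charge and idealized constant-volume heat release. The LHV, AFR, and cv figures are textbook-style values I've assumed, and a fixed cv wildly overstates the absolute temperature rise; the point is only that the charge mass drops out:

LHV = 44e6   # lower heating value of gasoline, J/kg (typical figure)
AFR = 14.7   # stoichiometric air-fuel ratio for gasoline
CV  = 718.0  # cv of air, J/(kg*K), held constant for simplicity

def delta_t(m_air):
    """Temperature rise for a trapped air mass m_air (kg) at stoichiometry."""
    m_fuel = m_air / AFR          # fuel scales with air, hence with throttle
    q = m_fuel * LHV              # heat released
    c = m_air * CV                # thermal capacity of the charge
    return q / c                  # m_air cancels: dT = LHV / (AFR * cv)

for m_air in (5e-4, 2.5e-4, 1.5e-4):   # full, half, ~30% throttle
    print(f"m_air = {m_air:.1e} kg -> dT = {delta_t(m_air):.0f} K")

# Same dT every time: in an SI engine both the heat released and the
# charge's heat capacity scale with throttle, so their ratio does not.
# In a diesel, m_air stays near full charge while m_fuel falls with
# load, so dT falls with load.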