Bumping an old post after I thought about it some more recently.
I'm starting to like increasing EGR more than lean burn. Wideband O2 controllers aren't all that expensive, but you'd need to swap them out for emissions testing, you have to richen a bunch of cells in the fuel map, and you'll only get maybe 10-15% dilution before ignition problems set in.
The beauty of EGR is that it's basically a better warm air intake. The hot exhaust raises the intake charge temperature, cutting its density considerably, without any of the WAI hardware. And because the exhaust's CO2 and water vapor have a higher specific heat capacity than air, the diluted charge is less knock-prone than a WAI charge at the same temperature, so you can go hotter. At 20% exhaust dilution (roughly the limit for a conventional spark-ignition engine, per Toyota's work as well as some academic research), you're looking at on the order of a 100 K rise in charge temperature, which works out to about a 25% drop in air density.
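Quick sanity check on that last claim (my assumed numbers, not measured from any engine): at constant manifold pressure, ideal-gas density goes as 1/T.

```python
# Back-of-envelope: ideal-gas density change at fixed manifold pressure
# when the charge heats up. Temperatures are assumptions for illustration.
T_ambient = 300.0   # K, assumed intake air temperature
delta_T = 100.0     # K, rough rise quoted for ~20% exhaust dilution

# At constant pressure, rho ~ 1/T (ideal gas law), so:
density_ratio = T_ambient / (T_ambient + delta_T)
print(f"density retained: {density_ratio:.2f}")   # -> density retained: 0.75
```

300 K to 400 K keeps 75% of the density, i.e. the ~25% drop mentioned above.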
So if we think about some older engines (intake-only VVT, or non-i VTEC) that don't make much use of internal EGR, boosting EGR dilution by 15% via an external tube could bring the MAP up from, say, 40 kPa to 60 kPa, or 50 kPa to 75 kPa, which is a massive difference in throttling loss.
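Those MAP figures can be roughly reconstructed like this (all inputs are my assumptions, including scaling the temperature rise down from the ~100 K at 20% dilution): with x% of the charge replaced by EGR and the charge hotter, the throttle has to open further to trap the same air mass.

```python
# Sketch of why MAP climbs with EGR: same trapped air mass needs more
# manifold pressure once part of the charge is inert and everything is hotter.
T0 = 300.0          # K, baseline charge temperature (assumed)
egr_frac = 0.15     # 15% exhaust dilution
delta_T = 75.0      # K, assumed rise at 15% EGR (scaled from ~100 K at 20%)

map_old = 40.0      # kPa, baseline part-throttle MAP
# Same air mass -> MAP scales by 1 / (air fraction * density factor)
map_new = map_old / ((1.0 - egr_frac) * (T0 / (T0 + delta_T)))
print(f"estimated MAP: {map_new:.0f} kPa")   # -> estimated MAP: 59 kPa
```

That lands right around the 40 to 60 kPa jump in the post; the 50 to 75 kPa case scales the same way.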
After some thinking, I suspect manufacturers avoided large-volume EGR to prevent heat soak of aluminum intake manifolds, which could be mitigated by coating the inside of the manifold with paint as thermal insulation.
If you have a car with external EGR controlled by the ECU, this is definitely the first place to look for mpg, IMO. Ignition advance does need to be increased quite a bit, which takes some time to dial in, but a rough starting point is to copy the value from the cell that's x% lower in load when you're running x% exhaust dilution.
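That guideline can be sketched as a table lookup. The load breakpoints and advance values below are made up for illustration; only the "look up stock timing at load * (1 - dilution)" rule comes from the post.

```python
# Hypothetical timing row at one fixed rpm (values invented for illustration)
load_axis = [20.0, 30.0, 40.0, 50.0, 60.0]   # % load breakpoints
advance   = [38.0, 34.0, 30.0, 26.0, 22.0]   # degrees BTDC

def egr_advance(load, dilution):
    """Starting-point advance at `load` with `dilution` exhaust fraction:
    the stock value from the cell that's dilution% lower in load."""
    eff = load * (1.0 - dilution)
    if eff <= load_axis[0]:
        return advance[0]
    if eff >= load_axis[-1]:
        return advance[-1]
    # linear interpolation between the bracketing breakpoints
    for i in range(len(load_axis) - 1):
        if load_axis[i] <= eff <= load_axis[i + 1]:
            t = (eff - load_axis[i]) / (load_axis[i + 1] - load_axis[i])
            return advance[i] + t * (advance[i + 1] - advance[i])

print(egr_advance(50.0, 0.15))   # 50% load at 15% EGR -> stock cell at 42.5% load -> 29.0
```

Obviously this is just a first pass; you'd still verify against knock on a datalog before leaning on it.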