06-05-2007, 10:55 AM
Join Date: Jul 2006
Total Cats: 2
They are rated at 100% duty cycle most of the time. So take 305 cc/min times your max DC, say 90% being aggressive. Your FP would be the normal 50 psi at 0 vac plus the boost in psi, so say 13 psi of boost, making 63 psi FP. All together that would give 10% mass a 110 cc/min requirement. It isn't that hard. There was a person on .net who worked it through calories as well; that was real interesting.
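The duty-cycle and fuel-pressure arithmetic above can be sketched in a few lines of Python. This is just a sketch of the usual sizing math, not something from the post: the square-root flow scaling with pressure differential and the 1:1 boost-referenced regulator behavior are my assumptions.

```python
import math

# Injector sizing sketch. Numbers marked "from the post" come from the text
# above; the regulator behavior and sqrt scaling are my assumptions.
rated_flow = 305.0   # cc/min (from the post)
max_duty   = 0.90    # "90% being aggressive" (from the post)
base_fp    = 50.0    # psi at 0 vacuum (from the post)
boost      = 13.0    # psi of boost (from the post)

# With a 1:1 boost-referenced regulator, rail pressure rises with boost...
rail_pressure = base_fp + boost   # 63 psi, matching the post

# ...so the pressure differential across the injector stays at base_fp,
# and the injector still flows roughly its rated amount.
usable_flow = rated_flow * max_duty   # flow available at max duty cycle

def flow_at_dp(rated_cc_min: float, rated_dp: float, actual_dp: float) -> float:
    """Injector flow vs. pressure differential (square-root scaling)."""
    return rated_cc_min * math.sqrt(actual_dp / rated_dp)

print(f"rail pressure at {boost:.0f} psi boost: {rail_pressure:.0f} psi")
print(f"usable flow at {max_duty:.0%} duty cycle: {usable_flow:.1f} cc/min")
```

If you instead raised the differential itself from 50 to 63 psi (no boost reference), `flow_at_dp(305, 50, 63)` gives roughly 342 cc/min.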
Originally Posted by FormerDatsun510Man
Interesting, Corky. Based on what you have said, I know my calculations aren't far off. Here is the longhand.
Taking for example a Coldside MP62 running a 105/65 pulley at 7000rpm:
cfm = CID(blower) * VE * RPM(blower) / 12^3
cfm = 62 * 75% * (7000*105/65) / 12^3
cfm = 304.3
m^3/min = 304.3 * .3048^3
m^3/min = 8.62
airflow gram/min = 8.62 * airdensity
airflow gram/min = 8.62 * 1200 g/m^3
airflow gram/min = 10340.2
For a 12:1 a/f ratio this means we would inject a total of 10340.2/12 grams of fuel per min.
fuelflow gram/min = 10340.2/12
fuelflow gram/min = 861.7
It takes about 1 Joule to raise the temp of 1 gram of air 1 deg C. Likewise, removing 1 Joule of energy from 1 gram of air reduces its temperature 1 deg C. So for the air massflow above, a total of 10340.2 Joules of heat energy needs to be removed per minute to lower the temperature 1 deg C.
The latent heat of vaporization of gasoline is 349 Joules/gram, meaning 1 gram of gasoline absorbs 349 Joules of heat energy in going from a liquid to a gas. So for the total fuel flow of 861.7 gram/min this comes out to:
Joules absorbed = 861.7 gram/min * 349 Joules/gram = 300733.3 Joules/min
Dividing this by the Joules per deg C per minute for the air massflow comes out to:
Temp drop Deg C = 300733.3 Joules/min / 10340.2 Joules/min-deg C
Temp drop Deg C = 29.1
Converting to Deg F:
Temp drop Deg F = 29.1 deg C * 9/5
Temp drop Deg F = 52.4 deg F
So with a 12:1 a/f ratio it looks like the total amount of fuel going into the vapor state would cause an air temp drop of around 52 deg F.
Now the thing I am interested in is what temp drop E-Cool (aka the extra injector) would cause in the intake manifold. I figure the extra injector would inject at most 1/3 of the total. So at most, in the intake manifold, the temp drop because of E-Cool would be around 52.4 deg F / 3 = 17.5 deg F.
So, something else must be the cause of an observed 100 deg F temp drop from E-Cool?
In comparison to gasoline at 349 Joules/gram, water has a latent heat of vaporization of 2260 Joules/gram... about 6.5 times as much! Running water injection for the above calculation with a typical 15% water-to-fuel ratio yields:
fuelflow gram/min = 861.7
waterflow gram/min = 861.7 * 15%
waterflow gram/min = 129.3
Joules absorbed = 129.3 gram/min * 2260 Joules/gram
Joules absorbed = 292218 Joules/min
Temp drop Deg C = 292218 Joules/min / 10340.2 Joules/min-deg C
Temp drop Deg C = 28.3
Temp drop Deg F = 28.3 deg C * 9/5
Temp drop Deg F = 50.9 deg F
Interestingly, the total temp drop from injecting water at a 15% water-to-fuel ratio is slightly less than the total temp drop from injecting the fuel... however, you inject both, so the temp drops add. With E-Cool it looks like you would get an 18 deg F temp drop in the intake manifold and then the remaining 34 deg F in the combustion chamber (before ignition), for a total of 52 deg F. With WI you would see a 51 deg F temp drop in the intake manifold and then a 52 deg F temp drop in the combustion chamber from the fuel injected (before ignition), for a total of 103 deg F. The thing that confuses me is that from this analysis the temp drop would be the same for a larger-injector setup vs. an E-Cool setup if they are running the same a/f ratio. Unless not all of the fuel injected in the combustion chamber is vaporized?
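One thing worth noticing in the longhand above: the airflow and fuel flow cancel out of the temp-drop ratios, so the evaporative cooling only depends on the a/f ratio, the water-to-fuel ratio, and the latent heats. A quick Python check of the figures, using the same 1 J/(g·deg C) approximation for air that the calculation uses:

```python
# Evaporative cooling per gram of air; constants taken from the post above.
CP_AIR = 1.0         # J/(g*degC), approximation used in the calculation
L_FUEL = 349.0       # J/g, latent heat of vaporization of gasoline
L_WATER = 2260.0     # J/g, latent heat of vaporization of water
AFR = 12.0           # air/fuel ratio
WATER_RATIO = 0.15   # water-to-fuel ratio

# Per gram of air there is 1/AFR gram of fuel and WATER_RATIO/AFR gram of
# water, so the mass flows cancel and only the ratios remain.
dT_fuel_C = (L_FUEL / AFR) / CP_AIR                   # about 29 degC
dT_water_C = (WATER_RATIO * L_WATER / AFR) / CP_AIR   # about 28 degC

def c_to_f_delta(dc: float) -> float:
    """Convert a temperature *difference* from degC to degF (factor 9/5)."""
    return dc * 9.0 / 5.0

print(f"fuel:  {dT_fuel_C:.1f} degC = {c_to_f_delta(dT_fuel_C):.1f} degF")
print(f"water: {dT_water_C:.1f} degC = {c_to_f_delta(dT_water_C):.1f} degF")
print(f"E-Cool manifold share (1/3 of fuel drop): "
      f"{c_to_f_delta(dT_fuel_C) / 3:.1f} degF")
```

This is why the larger-injector vs. E-Cool comparison comes out equal on paper: at the same a/f ratio the total fuel evaporated per gram of air is the same, regardless of where it is injected.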