I picked this segment up on Google ... bit techie in places but gets the point across:
"Cold Air:
Feeding cool air to an engine will increase the power output. Increasing inlet pressure will also increase the power (hence forced induction such as superchargers and turbos). The reason for this is that the engine power depends upon the mass flow rate of air into the engine. Higher pressure and cooler air will be more dense and will be pushed into the engine harder.
The air mass flow rate is proportional to pressure/square root of temperature
This means that doubling the inlet pressure from 1 bar (atmospheric pressure) to 2 bar would theoretically double the power.
Reducing the air temperature from t1 (in degrees C) to t2 (C) would increase power by the square root of ( (273+t1)/(273+t2) ). The 273 is used to convert from deg C to deg Kelvin (absolute temperature units, 0 Kelvin which is -273 deg C is absolute zero, below which you can't go). Eg: going from 35C to 15C, power increase is square root of ( (273+35)/(273+15) ), ie square root of (308/288) = 1.034, or 3.4%.
A rule of thumb is 1% power and torque increase for each 5.5C of temperature drop.
This is roughly 1% power for each 10F drop, since a change of 1C is 1.8F, so 5.5C works out to about 10F (5.5 x 1.8 = 9.9).
If you want to estimate the power increase for pressure boost then things are a bit more tricky. The theory so far would give a 100% rise per 1 bar, which is 6.9% per psi, since there are 14.5 psi per bar. However, the compression process will raise the air temperature. This will make the air less dense and thus reduce power. The risk of detonation (pinking/knocking) will also be increased. Cooling the high pressure air with an intercooler or charge cooler is the way round this. There will be a pressure loss across the cooler though."
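The numbers in that segment all fall out of the one relation it quotes (power ~ pressure / square root of absolute temperature), so here's a quick Python sketch checking them. The function name and the 20C baseline are just my choices for illustration:

```python
import math

def power_ratio(p1_bar, t1_c, p2_bar, t2_c):
    """Relative power going from condition 1 to condition 2, using the
    quoted relation: mass flow ~ pressure / sqrt(absolute temperature)."""
    return (p2_bar / p1_bar) * math.sqrt((273.0 + t1_c) / (273.0 + t2_c))

# The 35C -> 15C example at constant pressure: the quoted 3.4% figure
print(power_ratio(1, 35, 1, 15))          # ~1.034

# A 5.5C drop (20C -> 14.5C): close to the 1% rule of thumb
print(power_ratio(1, 20, 1, 14.5))        # ~1.010

# 1 psi of boost at constant temperature (14.5 psi per bar): ~6.9%
print(power_ratio(1, 20, 1 + 1 / 14.5, 20))  # ~1.069
```

Note the boost figure ignores the compression heating the segment warns about, so it's the best case before intercooling losses.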
This segment raised another question for me. If ambient air temperature drops as you climb above sea level, does this mean BHP increases with height? The answer's in there somewhere ...
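You can work it out from the same relation. A rough sketch, assuming standard-atmosphere values (temperature falls about 6.5C per km, and pressure falls with it; real weather will vary):

```python
import math

def isa(h_m):
    """Standard-atmosphere temperature (K) and pressure (bar) at h_m metres.
    Lapse rate 6.5 C/km, sea-level 15 C and 1.01325 bar (ISA assumptions)."""
    t_k = 288.15 - 0.0065 * h_m
    p_bar = 1.01325 * (t_k / 288.15) ** 5.256
    return t_k, p_bar

def relative_power(h_m):
    """Naturally-aspirated power at altitude relative to sea level,
    using flow ~ pressure / sqrt(absolute temperature)."""
    t0, p0 = isa(0)
    t, p = isa(h_m)
    return (p / p0) * math.sqrt(t0 / t)

for h in (0, 1000, 2000, 3000):
    print(h, round(relative_power(h), 3))
```

By this estimate the pressure loss swamps the temperature gain: at 1000 m the cooler air is worth about +1%, but the thinner air costs about -11%, so power drops roughly 10% per 1000 m rather than rising.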