moleculo said:
Good point - forgot about absolute pressure. It's a 37 degree difference to lose 2 PSI. I suppose it's conceivable to see a 37 degree difference between inside and outside, or at least close enough for rounding/gauge errors to creep in.

However, thinking this through further - I doubt that when the NFL investigated and measured the pressure of the game balls, they did so in 40 degree weather. The balls were most likely measured in a league office at standard room temperature, which would make the whole temperature/pressure drop irrelevant.

I've seen this posted in a few places and everyone makes the same mistakes: you have to convert temperature to an absolute scale (Kelvin), and people use the gauge (pump) pressure instead of the absolute pressure. Taken from another board:

"I just ran some numbers w/ the ideal gas law. To get from 12.5 PSI to 10.5 PSI takes a drop in temperature of 87 degrees F, so we rule out natural depressurization."
Your math is off - remember that you have to use the absolute pressure for these calculations (a 12.5 PSI ball has about 27.2 PSI of total pressure in it once you add the ~14.7 PSI of atmosphere). If you scale the nominal gauge PSI instead, you get results saying the ball only loses about half a PSI from the temperature change; that's wrong. It's roughly 1.5 PSI lost going from 70F to 40F, or just over 1 PSI going from 70F to 50F.
If they were 12.5 to begin with, the temperature difference could take them to about 10.9 PSI (going from 70F to 40F). Still not enough to explain the whole drop, but perhaps approaching the point where repeated measurements letting air out, game abuse, slow leaks over time, and the precision of the gauge could account for the rest.
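If anyone wants to check the arithmetic themselves, here's a quick Python sketch of the calculation being argued about above. The 14.7 PSI atmosphere and the 70F/40F/50F temperatures are just the assumptions used in this thread, not anything measured:

```
# Ideal gas law at fixed volume: P/T is constant, where P is ABSOLUTE pressure
# and T is an absolute temperature scale (Rankine here, since the thread works
# in Fahrenheit). The common mistake is scaling the gauge reading directly.

ATM_PSI = 14.7  # assumed standard atmospheric pressure, psi

def to_rankine(temp_f: float) -> float:
    """Convert Fahrenheit to the absolute Rankine scale."""
    return temp_f + 459.67

def cooled_gauge_psi(gauge_psi: float, t_start_f: float, t_end_f: float) -> float:
    """Gauge pressure after cooling a sealed ball from t_start_f to t_end_f."""
    absolute = gauge_psi + ATM_PSI
    cooled = absolute * to_rankine(t_end_f) / to_rankine(t_start_f)
    return cooled - ATM_PSI

def nominal_gauge_psi(gauge_psi: float, t_start_f: float, t_end_f: float) -> float:
    """The mistaken version: scales the gauge reading without adding atmosphere."""
    return gauge_psi * to_rankine(t_end_f) / to_rankine(t_start_f)

print(cooled_gauge_psi(12.5, 70, 40))   # ~10.96 PSI -> roughly 1.5 PSI lost
print(cooled_gauge_psi(12.5, 70, 50))   # ~11.47 PSI -> just over 1 PSI lost
print(nominal_gauge_psi(12.5, 70, 40))  # ~11.79 PSI -> only ~0.7 PSI "lost"
```

The gap between the last line and the first two is exactly the disagreement in the posts above.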
			
  Believe what you need to believe, I guess.