Yesterday's post about Citi Field versus Shea Stadium got me thinking about homerun parks versus hitter's parks and what kind of correlation exists from one to the other. Unfortunately, my high-level math skills are, much like the clutch hitter, virtually non-existent. I barely know what a regression analysis is let alone how to run one, so if anyone out there wants to do the dirty work and report back here, we'd all be much obliged.
Anyway, the best I can offer is a relative +/- list of all big league parks in terms of run scoring versus homerun hitting. I took those park factors, courtesy of ESPN.com, and calculated the raw difference and percentage difference between the runs and homeruns factors. For ESPN's park factors, a factor of 1.000 is neutral, and a delta of 0 means the park is exactly as conducive to run scoring as to homerun hitting. The higher the delta (and delta %), the more prone the ballpark is to allowing runs versus homeruns. The lower the delta (and delta %), the more prone the ballpark is to allowing homeruns versus runs.
| Park Name | Runs | HR | Delta | Delta % |
| --- | --- | --- | --- | --- |
| Citizens Bank Park | 1.029 | 1.022 | 0.007 | 0.007 |
| Minute Maid Park | 1.036 | 1.155 | -0.119 | -0.115 |
| U.S. Cellular Field | 1.122 | 1.353 | -0.231 | -0.206 |
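For anyone who wants to run this themselves, here's a quick sketch of the arithmetic using the three parks shown above. I'm assuming the "delta %" column divides the raw delta by the runs factor, since that reproduces the table values:

```python
# ESPN park factors (1.000 = neutral): runs factor, homerun factor.
# Values taken from the table above.
park_factors = {
    "Citizens Bank Park": (1.029, 1.022),
    "Minute Maid Park": (1.036, 1.155),
    "U.S. Cellular Field": (1.122, 1.353),
}

for park, (runs, hr) in park_factors.items():
    delta = runs - hr          # raw difference: positive = run-friendly relative to HRs
    delta_pct = delta / runs   # percentage difference, relative to the runs factor
    print(f"{park}: delta={delta:+.3f}, delta%={delta_pct:+.3f}")
```

A negative delta (Minute Maid, U.S. Cellular) flags a park that inflates homeruns more than it inflates overall scoring; a positive delta (Citizens Bank) flags the reverse.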
Fenway Park was the fifth-best park in which to score runs last season but the fifth-worst park out of which to hit homeruns. Shea was the fifth-worst run-scoring park but the ninth-best homerun park, the latter of which is a little bit surprising. Camden Yards was slightly above average for runs scored but was the most homerun-friendly park in baseball. Petco Park was the toughest place to score any type of run, homerun or otherwise.