The run expectancy chart, built from data on thousands of games, estimates the average number of runs that score from a given base-out situation. In real life, using a very large database, a walk is on average worth about 1/4 of the run value per PA of a HR, meaning roughly four walks in a game equal one home run per game. I understand there are some gaps in sabermetrics, but I don't see how you can argue with real empirical data.
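Just to put rough numbers on that 4-to-1 claim, here's a minimal sketch. The run values below are ballpark linear-weight figures (roughly 0.3 runs for a walk, 1.4 for a home run); published values vary by season and data source, so treat them as assumptions, not gospel.

```python
# Rough linear-weight comparison of a walk vs. a home run.
# These run values are approximate assumptions; real linear weights
# vary by era and by the dataset used to compute them.
RUN_VALUE_BB = 0.32   # approximate average runs added by a walk
RUN_VALUE_HR = 1.40   # approximate average runs added by a home run

ratio = RUN_VALUE_HR / RUN_VALUE_BB
print(f"One HR is worth about {ratio:.1f} walks")                       # ~4.4
print(f"Four walks ~= {4 * RUN_VALUE_BB:.2f} runs; one HR = {RUN_VALUE_HR:.2f} runs")
```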
I don't see how you can argue with a guaranteed minimum of one run per game over the mere opportunity to score a run each game. You're talking probability data, and I'm telling you I'll take the numbers that have nothing to do with probabilities... because they're already tallied on the scoreboard.
It's all about the probabilities. If you take your one guaranteed run a game and my guy scores two runs, my team wins. You have to figure out how often Player A is going to score more runs than Player B. How do you do that? By figuring the probabilities.
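For what it's worth, here's one way to put a number on "how often A outscores B," purely as an illustration. The Poisson model and the 0.9 runs-per-game rate for Player A are made-up assumptions, and Player B is modeled as literally scoring exactly one run every game, which is the "guaranteed" scenario being argued above.

```python
import numpy as np

# Monte Carlo sketch: how often does Player A outscore Player B?
# Assumptions (not real data): Player A's runs scored per game follow a
# Poisson distribution with mean 0.9; Player B scores exactly 1 run per game.
rng = np.random.default_rng(0)
GAMES = 100_000

runs_a = rng.poisson(lam=0.9, size=GAMES)   # Player A: variable output
runs_b = np.ones(GAMES, dtype=int)          # Player B: the "guaranteed" run

print("P(A scores more than B) ~", (runs_a > runs_b).mean())
print("P(tie)                  ~", (runs_a == runs_b).mean())
print("P(B scores more than A) ~", (runs_a < runs_b).mean())
```

The point of the simulation isn't the specific numbers; it's that "who helps my team win more often" is a question about distributions, not just season totals.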
In a game where we take career batting averages and players' averages with RISP as dogma for what "should" happen, favoring probabilities isn't irresponsible at all. A player never making an out could be more unlikely than 162 HRs in a season.
Fine. Good luck with your team! I'm taking the guy who I KNOW scores me a run each game. No matter what anyone else does, I know I got at least one run a game out of this guy. At least. As much as cumulative stats get dismissed... it's ultimately cumulative stats that are kept on the scoreboard in each individual game.