I thought I'd do some back-of-the-envelope calculations (seriously, I found a spare envelope and sketched this out) to illustrate the point. Let's assume that any given fielder has a certain throwing accuracy, which we can measure in degrees. To visualize this, put your nose up to the circle at the center of this protractor:
The longer the throw, however, the more a small angular error in the trajectory will carry the ball away from his target. If we take an outfielder who is 270 feet from home plate, lobbing a baseball at a runner heading home, small angular errors can result in the ball missing the plate by quite a bit:
**At 270 Feet**

| Angular Error (± Degrees) | Misses Home Plate By (Feet) |
|---|---|
| 1 | 4.7 |
| 2 | 9.4 |
| 3 | 14.2 |
| 4 | 18.9 |
| 5 | 23.6 |
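If you want to check these numbers on your own envelope, the geometry is just one trig function: a throw that leaves the hand off-line by some angle drifts sideways by roughly the throw distance times the tangent of that angle. Here's a minimal sketch (flat-plane approximation, ignoring air resistance and arc):

```python
import math

def miss_distance(throw_distance_ft, angular_error_deg):
    """Lateral miss at the target for a throw that is off-line by a given angle."""
    return throw_distance_ft * math.tan(math.radians(angular_error_deg))

for err in (1, 2, 3, 4, 5):
    print(f"+/-{err} deg at 270 ft -> misses by about {miss_distance(270, err):.1f} ft")
```

Note that at these small angles the relationship is nearly linear: each extra degree of error costs the outfielder roughly another 4.7 feet at the plate.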
This is where the cutoff man comes into play. Let's assume (as David did in his article) that the cutoff man is 90 feet from home plate, rather than 270 feet. Let's further assume that the average MLB fielder's throwing accuracy, under pressure, is ±2 degrees. This is a guess, but it seems sort of reasonable. Here's what happens (the graphic shows a blimp's-eye view of the field):
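The payoff of the shorter throw falls straight out of the same trig: because the miss distance scales with the length of the throw, cutting the distance from 270 feet to 90 feet shrinks the error zone by exactly a factor of three. A quick sketch, using the same ±2 degree guess:

```python
import math

def miss_distance(throw_distance_ft, angular_error_deg):
    """Lateral miss at the target for a throw that is off-line by a given angle."""
    return throw_distance_ft * math.tan(math.radians(angular_error_deg))

# Same +/-2 degree wobble, very different outcomes:
outfield = miss_distance(270, 2)  # airmailing it all the way home
cutoff = miss_distance(90, 2)     # the cutoff man's throw
print(f"270 ft throw: off by up to {outfield:.1f} ft")
print(f" 90 ft throw: off by up to {cutoff:.1f} ft")
```

A ball that arrives within about three feet of the plate is one the catcher can handle while holding his ground; one that's nine or ten feet wide usually is not.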
Therefore, not only does hitting the cutoff man allow a more direct route (the lower launch angles help offset the time it takes the infielder to catch, transfer, and throw again), and not only does doing so help prevent the hitter from advancing to second base on a throw home, but it also means that the ball will usually arrive in a better location for the catcher to make a play on the runner than if it were thrown all the way from the outfield.
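The launch-angle point can also be sketched with simple no-drag projectile math. The numbers below are my own assumptions, not anything from David's article: an ~85 mph (125 ft/s) throw, no air resistance, and the low-arc solution to the range equation. The idea is just that a 270-foot throw has to leave the hand at a much steeper angle, and steeper throws spend more time in the air per foot of ground covered:

```python
import math

G = 32.17  # gravitational acceleration, ft/s^2

def flight_time_and_angle(range_ft, speed_fps):
    """Flight time and launch angle for the low-arc, no-drag throw covering range_ft.
    Uses R = v^2 * sin(2*theta) / g, solved for theta."""
    s = range_ft * G / speed_fps**2
    theta = 0.5 * math.asin(min(s, 1.0))
    t = 2 * speed_fps * math.sin(theta) / G
    return t, math.degrees(theta)

v = 125.0  # assumed throw speed, roughly 85 mph
t_long, angle_long = flight_time_and_angle(270, v)    # one throw all the way home
t_short, angle_short = flight_time_and_angle(135, v)  # one leg of a two-hop relay
print(f"270 ft throw: launch angle {angle_long:.1f} deg, flight time {t_long:.2f} s")
print(f"135 ft throw: launch angle {angle_short:.1f} deg, flight time {t_short:.2f} s")
```

Under these assumptions, two flat relay legs take only slightly longer in the air than one lofted heave, which is why the catch-and-transfer time costs less than you'd naively expect.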