Predicting how much grain a farmer will have at the end of the season informs lending decisions, the logistics of transporting grain off farms, crop insurance, and a farmer’s basic economic well-being. The current “gold standard,” according to the University of Illinois’s press release, is the USDA’s World Agricultural Supply and Demand Estimates, or WASDE.
But those estimates are hardly ever right on the money. This is fine and to be expected; it’s incredibly difficult to look at a mid-season crop and guess exactly how many bushels of grain that crop will produce. But the University of Illinois thinks it has found a better way.
WASDE’s corn predictions are based on a combination of factors, especially surveys with farmers. Researchers at Illinois, though, added in much more data, especially seasonal climate data and, notably, data from satellite imagery. That satellite imagery reveals the patterns and speed of crop growth, and when combined with climate data, the researchers say, gives a more accurate prediction of the final tally.
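The article doesn’t spell out the study’s actual model, but the basic idea of folding satellite-derived growth signals and climate data into a yield prediction can be sketched with a toy regression. Everything below is synthetic and hypothetical: the feature names, the coefficients, and the data are stand-ins for illustration, not the researchers’ method.

```python
import numpy as np

# Toy sketch: predict end-of-season yield from several feature sources,
# mimicking the idea of combining survey, climate, and satellite data.
# All values are synthetic; the real study's features are not specified here.
rng = np.random.default_rng(0)
n = 120  # hypothetical county-season observations

ndvi_slope = rng.normal(0.5, 0.1, n)  # stand-in for satellite-derived crop growth rate
rainfall = rng.normal(20, 4, n)       # stand-in for seasonal climate data (inches)
survey = rng.normal(170, 10, n)       # stand-in for a survey-based estimate (bushels/acre)

# Synthetic "true" yield loosely tied to all three sources, plus noise
yield_bpa = 60 * ndvi_slope + 1.5 * rainfall + 0.5 * survey + rng.normal(0, 5, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), ndvi_slope, rainfall, survey])
coef, *_ = np.linalg.lstsq(X, yield_bpa, rcond=None)

pred = X @ coef
mae = np.abs(pred - yield_bpa).mean()
print(f"In-sample mean absolute error: {mae:.2f} bushels/acre")
```

The point of the sketch is only structural: each added data source is another column in the design matrix, and richer mid-season signals (like the growth patterns visible from orbit) give the model more to work with than surveys alone.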
Just as an example: between 2010 and 2016, WASDE’s June estimates were off by an average of 17.66 bushels per acre. Not bad, considering an acre of land can produce nearly 200 bushels of corn. But the new system’s average error was 12.75 bushels per acre, roughly a 28 percent reduction.
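Those two figures are worth putting side by side. A quick calculation (using only the numbers reported above, plus the article’s rough 200-bushel-per-acre benchmark) shows what each error means relative to a typical acre’s yield:

```python
# Error figures reported for June estimates, 2010-2016 averages
wasde_error = 17.66      # bushels per acre (WASDE)
satellite_error = 12.75  # bushels per acre (new satellite + climate model)
typical_yield = 200      # rough benchmark: bushels of corn per acre

# Each error as a share of a typical acre's yield
wasde_pct = wasde_error / typical_yield * 100
satellite_pct = satellite_error / typical_yield * 100

# Relative improvement of the new model over WASDE
improvement = (wasde_error - satellite_error) / wasde_error * 100

print(f"WASDE error: {wasde_pct:.1f}% of a typical acre's yield")
print(f"New model error: {satellite_pct:.1f}%")
print(f"Relative reduction in error: {improvement:.1f}%")
# → Relative reduction in error: 27.8%
```

So WASDE’s June miss is on the order of 9 percent of a typical acre’s output, while the new approach cuts that to about 6 percent.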
There’s nothing in particular stopping the USDA from incorporating satellite data, and perhaps even the research from the Illinois study, into its next prediction models. And that’s good for everyone: farmers, buyers, processors, and market-watchers.