MADISON, WI, JULY 7, 2008 -- The use of on-the-go crop and soil sensors has greatly increased the precision with which farmers can manage their crops. Recently published research in Agronomy Journal asks whether more precise management is necessarily more efficient. The authors found that the law of diminishing returns applies to precision agriculture, and they calculated the optimal size of the application area for precision management techniques. According to the authors, managing at that optimal scale could deliver significant cost savings for farmers.
In their article, "Spatial Analysis of Early Wheat Canopy Normalized Difference Vegetative Index: Determining Appropriate Observation Scale," E.M. Pena-Yewtukhiw, West Virginia University; G.J. Schwab and J.H. Grove, University of Kentucky; L.W. Murdock, University of Kentucky and the West Kentucky Research and Education Center; and J.T. Johnson, Clark County Cooperative Extension Center, examine how precise sensor and application grids should be for optimal efficiency.
To determine the ideal amount of data needed for precision management, the researchers calculated the optimal combination of physical sensor density (number of sensors along the applicator apparatus) and sensor output density (sensor readings per unit distance along the travel path).
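For illustration, the way these two densities combine to set the size of each observation cell can be sketched as follows; the boom width, sensor count, and reading interval below are hypothetical values chosen for the example, not figures reported from the study.

```python
# Hypothetical values for illustration only; the study's actual sensor
# spacing and reading interval are not reported in this release.
boom_width_m = 9.0        # assumed width of the applicator boom
n_sensors = 18            # assumed number of sensors across the boom
reading_interval_m = 1.0  # assumed travel distance between sensor readings

# Physical sensor density sets the across-track spacing; output density
# (readings per unit distance) sets the along-track spacing.
across_track_spacing_m = boom_width_m / n_sensors      # 0.5 m between sensors
cell_area_m2 = across_track_spacing_m * reading_interval_m

print(f"each observation covers about {cell_area_m2:.2f} square meters")
```

With these assumed numbers, each reading covers about 0.5 square meters, comparable to the smallest grid size discussed below.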
The researchers found that the sensor grid size can be increased from the current smallest size of 0.5 square meters to 5.1 square meters with no significant impact on the overall mapping of crop canopy or field variation. The larger grid requires fewer sensors and makes fertilizer application easier and more cost-efficient. This roughly tenfold increase in grid size could translate into significant cost savings for farmers using precision management techniques.
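As a rough sketch of what aggregating to a coarser grid involves (not the authors' spatial analysis), the snippet below block-averages a simulated fine-grid NDVI map into cells roughly ten times larger and compares the field variation captured at each scale; the simulated field, noise model, and grid dimensions are assumptions for demonstration only.

```python
import numpy as np

# Illustrative sketch: simulate a fine-grid NDVI map, aggregate it to a
# coarser grid, and compare how much of the field's spatial variation is
# preserved. All parameters here are assumed, not taken from the study.

rng = np.random.default_rng(0)

# Fine grid: 0.5 m x 1.0 m cells (~0.5 m^2) over a 100 m x 100 m field
fine = rng.normal(0.6, 0.05, size=(200, 100))        # baseline NDVI with noise
rows = np.linspace(0, 3 * np.pi, 200)[:, None]
cols = np.linspace(0, 2 * np.pi, 100)[None, :]
fine += 0.1 * np.sin(rows) * np.cos(cols)             # smooth spatial trend

def block_average(grid, block_rows, block_cols):
    """Average readings within non-overlapping blocks (coarser grid cells)."""
    r, c = grid.shape
    r_trim, c_trim = r - r % block_rows, c - c % block_cols
    trimmed = grid[:r_trim, :c_trim]
    return trimmed.reshape(r_trim // block_rows, block_rows,
                           c_trim // block_cols, block_cols).mean(axis=(1, 3))

# Coarse grid: 10 fine cells per block along the travel path -> ~5 m^2 cells
coarse = block_average(fine, 10, 1)

# Compare the field-scale variation captured at each observation scale
print("fine-grid std of NDVI:  ", fine.std().round(4))
print("coarse-grid std of NDVI:", coarse.std().round(4))
```

If the coarse-grid map retains most of the fine-grid variation, as the study reports for real wheat canopy data, fewer sensors and fewer readings are needed to describe the field.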
Source: American Society of Agronomy