Is there a reliable and inexpensive way to use image-sensing technology to forecast block-level crop yields?
While digital imaging technology can be used to predict grapevine yield, new research suggests that the methods used need further development to provide a substantial improvement to current manual estimation techniques.
A recently completed three-year project, funded by Wine Australia, led by the NSW Department of Primary Industries, and with research conducted by Dr Mark Whitty and his team at the University of NSW, aimed to find a reliable and inexpensive technological replacement for manual estimation.
The current ‘best practice’ manual estimation method is time-consuming and error-prone, with an accepted estimation error rate of ±15 per cent at best, even at harvest. This creates problems in planning harvesting and intake at wineries and has flow-on effects for both growers and wineries.
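To make the scale of that uncertainty concrete, manual estimation conventionally scales per-vine bunch counts and an average bunch weight up to the block. The sketch below uses entirely hypothetical figures (vine counts, bunch counts, and weights are illustrative, not from the project) to show how a ±15 per cent error band translates into tonnage.

```python
# Illustrative sketch of conventional manual yield estimation.
# All figures are hypothetical examples, not drawn from the project.

def manual_yield_estimate(vines_per_block, bunches_per_vine, bunch_weight_kg):
    """Block yield in tonnes = vines x bunches/vine x average bunch weight."""
    return vines_per_block * bunches_per_vine * bunch_weight_kg / 1000.0

estimate_t = manual_yield_estimate(2000, 40, 0.12)   # 9.6 tonnes
# A +/-15 per cent error band around that estimate:
low_t, high_t = estimate_t * 0.85, estimate_t * 1.15
print(f"estimate: {estimate_t:.1f} t, range: {low_t:.2f}-{high_t:.2f} t")
```

On these example numbers the band spans almost three tonnes, which is the planning problem the article describes.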
Dr Whitty’s project showed that long-term averages, combined with growers’ knowledge of their vineyards, are often more accurate than the manual method.
The researchers assessed a range of digital options, with the best results reported from GoPro cameras mounted to farm vehicles to video an entire block at the shoot stage (E-L 9) in combination with smartphone photos of marked bunches at flowering, pea-sized and harvest stages.
[Image: result of the flower detection algorithm applied to a single image – fine red spots show detected florets.]
At the flowering stage, an impressive average error of just 5.5 per cent against tonnage delivered to the winery was achieved across four trial blocks – one each of Chardonnay and Shiraz in Orange and Clare.
‘Image processing can count flowers and berries on a single inflorescence or bunch and measure berry diameters, hence fruit-set ratios can be quickly determined non-destructively’, said Dr Whitty.
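The fruit-set ratio Dr Whitty refers to is simply the fraction of florets on an inflorescence that go on to set as berries, so image-derived counts at flowering and pea-size are enough to compute it. A minimal sketch, with hypothetical counts:

```python
# Fruit-set ratio from image-derived counts: berries per bunch at
# pea-size divided by florets per inflorescence at flowering.
# The counts below are hypothetical examples.

def fruit_set_ratio(flower_count, berry_count):
    """Fraction of florets that set into berries (0-1)."""
    if flower_count <= 0:
        raise ValueError("flower_count must be positive")
    return berry_count / flower_count

ratio = fruit_set_ratio(flower_count=320, berry_count=112)
print(f"fruit set: {ratio:.0%}")  # 35%
```

Because both counts come from images of the same marked bunches, the ratio is obtained without destructive sampling.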
‘Simple cameras such as GoPros can give relative potential yield maps very early in the season, providing farmers with the tools to deal with variability within a block in the current season.’
However, when the measurements were repeated later in the season, the error margin increased to 14 per cent at pea-sized and 12 per cent at harvest.
The sensing method investigated by the project relies on visual imaging, and in a typical sprawl canopy, bunches hidden within the canopy cannot be readily seen. Later-season estimates therefore depend on historical occlusion factors or manual calibration. However, the researchers say the technology could be more accurate on cane-pruned vines, where shoots and fruit are more visible.
The image processing components of the method provided an unexpected bonus: the development of an improved, automated flower counting system. Using images captured with a smartphone camera, counting accuracy was 84 per cent per image across 12 Australian datasets. This underpinned the yield estimation and reduced the need for arduous manual counting of flowers.
Wine Australia is currently assessing what the next steps might be. Providing access to this new image processing technology would allow growers to estimate yield accurately early in the season, but the inability to ground-truth those early values later in the season with the same level of accuracy limits its usefulness.
Ideally, producers need a sensor technology that is able to see through a canopy and is capable of discerning fruit from shoots, wires, posts, etc. LiDAR has already been shown not to be particularly useful for this purpose, as it relies on being able to ‘see’ the target (i.e. the fruit). A current project with CSIRO (CSA 1602) is trialling ultrawideband radar, which can see through the canopy. It is too early to know how feasible this technology might be for accurately detecting fruit, but initial results are promising. Microwave-based technology is being trialled in New Zealand to estimate yields of Sauvignon Blanc. An alternative approach, taken in another Wine Australia project (MQ 1401), is to use machine learning and historical data sets. This project is further advanced and the results look promising.
Watch this space!