Journal of Zhejiang University (Agriculture and Life Sciences), 2017, Vol. 43, Issue (5): 649-659
Image segmentation method of plastic-film corn canopy
ZHANG Wanhong, LIU Wenzhao
Institute of Soil and Water Conservation, Northwest A & F University, Yangling 712100, Shaanxi, China
Abstract: Under weak light conditions, K-means clustering analysis of the hue (H) and saturation (S) color components, combined with a corresponding color-difference operation, was used to segment images of plastic-film maize canopies. The binary images obtained with this method were compared with the segmentation results of the excess green (ExG), excess red (ExR), and excess green minus excess red (ExG-ExR) algorithms. The results showed that the proposed method reflected the maize canopy shape more accurately. The canopy cover values calculated from the segmented images were also compared with those obtained with the SamplePoint software: the root mean square error was small, only 0.004 2, and the segmentation error rate was as low as 3.37%, indicating high segmentation accuracy. Overall, under weak light, the segmentation method based on K-means clustering of the H and S color components combined with the color-difference operation produces accurate and reliable segmentation of plastic-film maize canopies.
Keywords: hue; saturation; K-means clustering; image segmentation; corn canopy
Summary: Percent ground cover of vegetation is an important parameter that has received the attention of both agronomists and ecologists. Not only does it reflect the growth dynamics of plants over time, but it is also associated with the absorption of photosynthetically active radiation (APAR) by plants. As far as maize crop cover is concerned, research so far has mainly focused on calculating percent ground cover of maize grown on bare ground. Plastic film mulching, however, has been widely adopted for maize planting because it reduces water loss, regulates soil temperature, improves the infiltration of rainwater into the soil, enhances soil water retention, accelerates crop growth, and significantly increases crop yield. In addition, recent advances in image analysis software offer the potential to analyze digital camera images of the habitat and objectively quantify ground cover of vegetation in a repeatable and timely manner. Here we evaluated the use of Matlab software for analyzing digital photographs of plastic-film maize to quantify percent ground cover. In this study, images of plastic-film maize were first taken with a smartphone under weak light conditions, in JPEG (Joint Photographic Experts Group) format and at a resolution of 1 358×1 314 pixels. A method combining K-means clustering analysis of the hue (H) and saturation (S) color components with a corresponding mathematical operation was then proposed to discriminate maize from the background. The proposed method comprised three main steps. First, the color images, consisting of red (R), green (G), and blue (B) subimages, were mathematically transformed to hue (H), saturation (S), and intensity (I) subimages; the images were also segmented using the excess green (ExG), excess red (ExR), and excess green minus excess red (ExG-ExR) indices and Otsu thresholding of ExG, ExR, and ExG-ExR. Second, K-means clustering analysis of the H and S color components was carried out. Finally, the color-difference operation between the K-means clustering results of the H and S color components was performed to segment the target object. The image processing results indicated that the images segmented by ExG, ExR, ExG-ExR, and Otsu thresholding of ExG, ExR, and ExG-ExR showed incomplete structures of maize and plastic film, whereas relatively satisfactory results were achieved by clustering analysis of the H and S color components. Specifically, K-means clustering of the H color component clearly delineated the leaf edges of maize, and K-means clustering of the S color component produced a complete plastic film structure. The maize plants were successfully separated from the plastic film, soil, and other background elements by applying the color-difference operation between the K-means clustering results of the H and S color components. The root mean square error (RMSE) and the error rate were calculated to verify the reliability of the proposed segmentation method; they were 0.004 2 and 3.37%, respectively. The low RMSE and error rate further confirmed the soundness of the method. In conclusion, the method presented in this paper for image segmentation of the plastic-film corn canopy is reliable under weak light conditions.

Canopy cover of vegetation is the percentage of the statistical area occupied by the vertical projection of vegetation onto the ground [1]. It reflects the dynamic changes of plants during the growing season and also indirectly indicates plant transpiration and photosynthesis [2]. Accurately estimating canopy cover under field conditions is therefore important for monitoring crop growth status and predicting crop yield [3-4].

Traditional methods for estimating crop canopy cover include visual estimation, ruler measurement, and point-by-point visual interpretation and counting of vertically acquired photographs [2, 5-6]. Although these methods are simple, visual estimation is highly subjective and results may differ considerably among observers; ruler measurement is strongly affected by weather, yields direction-dependent estimates, and is labor intensive; and point-by-point visual interpretation of photographs is accurate but also laborious and time consuming [7].

At present, a suitable approach is to photograph the crop canopy with a digital camera, separate the image into crop and non-crop (soil, crop residue, etc.) classes on a computer, and calculate crop cover from the binary image; this approach is simple to operate and gives accurate results [5]. For field crop images with a simple background, for example maize images in which the background contains only soil and a little crop residue, a single-threshold method is usually sufficient for fast recognition and segmentation of the target. For images with multiple background components, however, single-threshold segmentation is inaccurate and often leads to over-segmentation. To segment targets from complex backgrounds accurately, the RGB color space is usually converted to color spaces such as HSI [7-11], HSV [12-13], or Lab [14-16], combined with algorithms such as Otsu thresholding (maximum between-class variance) [14], K-means clustering [15, 17], or fuzzy C-means clustering (FCM) [18]. Although these methods can segment target images accurately, there is still no universal algorithm that can segment images of all specific crops under all environmental conditions [19].

Plastic film mulching improves the soil water and heat regime of farmland and increases nutrient availability and water use efficiency, and this planting practice has been widely adopted in maize fields [20-21]. However, methods for separating maize plants from plastic film, soil, and other background elements and for obtaining maize canopy cover have rarely been reported. This study therefore used image processing to separate maize plants from plastic film, soil, and other backgrounds and, ultimately, to calculate maize canopy cover accurately. The method first converts the plastic-film maize image from the RGB color space to the HSI color space and extracts the H and S color components; K-means clustering analysis [22] is then applied to the H and S components, suitable cluster images of the two components are selected, and a corresponding color-difference operation is performed to segment the plastic-film maize canopy image. The segmentation results were compared with those of the excess green (ExG) [23], excess red (ExR) [24], and excess green minus excess red (ExG-ExR) [25] algorithms, and the most reasonable segmented images were then used to calculate maize canopy cover.

1 Materials and methods
1.1 Field maize image acquisition

The experiment was conducted from June 4 to 6, 2016 (maize seedling stage) in a plastic-film maize field at the Changwu Agro-Ecological Experimental Station on the Loess Plateau, Chinese Academy of Sciences. Maize plants were first enclosed with a 1 m² quadrat frame, and photographs were then taken with a Huawei Honor 7 smartphone in natural exposure mode, vertically above each quadrat at a height of 2 m above the ground, during periods in the morning and afternoon when sunlight was relatively weak; images with uniform illumination and few shadows were collected for use. Before segmentation, to facilitate image processing and without affecting the shape or color of the target and background, all images were resized to 1 358×1 314 pixels and imported into the computer in JPEG format, as shown in Fig. 1.
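For reference, the resizing step can be reproduced in Matlab; this is a minimal sketch, not the authors' code, in which the file name is a placeholder and the row/column order of the target size is an assumption, since the paper only reports 1 358×1 314 pixels.

% Read a quadrat photograph and resize it to the working resolution.
I_rgb = imread('maize_quadrat.jpg');      % placeholder file name
I_rgb = imresize(I_rgb, [1314 1358]);     % [rows cols]; assumed orientation of 1 358x1 314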

Fig. 1 Original image of plastic-film corn
1.2 Image segmentation of maize plants

The workflow for segmenting plastic-film maize canopy images is shown in Fig. 2.

Fig. 2 Flow chart for image segmentation method of plastic-film corn canopy

1) The R, G, and B color components in the RGB color space are obtained, and the ExG, ExR, and ExG-ExR index images are calculated; the ExG, ExR, and ExG-ExR results are then combined with the Otsu thresholding algorithm [26] to segment the image. The ExG and ExR indices are given by equations (1) and (2) [23-24].

$ {\rm{ExG}} = 2 \times G-R-B. $ (1)
$ {\rm{ExR}} = 1.4 \times R-G. $ (2)

where R, G, and B denote the red, green, and blue color components, respectively.
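A minimal Matlab sketch of this step is given below (not the authors' code; the file name and variable names are illustrative). The indices of Eqs. (1)-(2) are computed on the normalized RGB channels and then binarized with Otsu's threshold.

% Color indices and Otsu thresholding, following Eqs. (1)-(2).
I_rgb = im2double(imread('maize_quadrat.jpg'));    % placeholder file name
R = I_rgb(:,:,1);  G = I_rgb(:,:,2);  B = I_rgb(:,:,3);

ExG  = 2*G - R - B;     % excess green, Eq. (1)
ExR  = 1.4*R - G;       % excess red, Eq. (2)
ExGR = ExG - ExR;       % excess green minus excess red

% Otsu thresholding; each index is rescaled to [0, 1] because
% graythresh expects values in that range.
bwExG  = imbinarize(mat2gray(ExG),  graythresh(mat2gray(ExG)));
bwExR  = imbinarize(mat2gray(ExR),  graythresh(mat2gray(ExR)));
bwExGR = imbinarize(mat2gray(ExGR), graythresh(mat2gray(ExGR)));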

2) The RGB color space is converted to the HSI color space [equations (3)-(6) [10]], and the H and S color components are extracted. The K-means clustering algorithm is applied separately to the H and S components (with three clusters) to obtain binary images of the two components; the binary images are denoised and processed with a morphological opening to break small adhesions and remove burrs, making the images smoother. Corresponding mathematical operations are then applied to the denoised, morphologically processed H and S binary images to obtain the target image.

$ \theta = \arccos\left\{ \frac{\frac{1}{2}\left[(R-G)+(R-B)\right]}{\sqrt{(R-G)^2+(R-B)(G-B)}} \right\}, $ (3)
$ H = \left\{ \begin{array}{l} \theta, B \le G\\ 360-\theta, B > G \end{array} \right., $ (4)
$ S = 1- \frac{3}{{\left( {R + G + B} \right)}}\left[{\min \left( {R, G, B} \right)} \right], $ (5)
$ I = \frac{1}{3}\left( {R + G + B} \right). $ (6)
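The conversion of Eqs. (3)-(6) can be written directly in Matlab; the sketch below is an illustration (not the authors' code) that continues from the previous one, with R, G, and B taken as double-valued channels in [0, 1].

% RGB to HSI conversion following Eqs. (3)-(6).
num   = 0.5*((R - G) + (R - B));
den   = sqrt((R - G).^2 + (R - B).*(G - B)) + eps;  % eps avoids division by zero
theta = acosd(min(max(num./den, -1), 1));           % hue angle in degrees, Eq. (3)

H = theta;
H(B > G) = 360 - theta(B > G);                              % Eq. (4)
S = 1 - 3*min(cat(3, R, G, B), [], 3)./(R + G + B + eps);   % Eq. (5)
I = (R + G + B)/3;                                          % Eq. (6)

Only the H and S channels are carried forward to the clustering step.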
1.3 Data processing

Data processing and calculations were performed with Excel 2013.

2 Results and analysis
2.1 Segmentation by the ExG, ExR, and ExG-ExR algorithms combined with Otsu thresholding

Fig. 3 shows the segmentation result of the ExG algorithm. This method accurately identified background objects such as soil, plastic film, and crop residue, which appear black in the segmented image, but it did not identify the foreground target (maize) accurately. In Fig. 3, some leaf veins, leaf tips, and the margins of drooping leaves appear black, indicating that the ExG algorithm over-segmented the foreground. Fig. 4 is the binary image obtained by applying Otsu thresholding to the ExG image: the foreground appears white and the background black, and the scattered black patches within the foreground show that ExG combined with Otsu thresholding also over-segmented the image. Fig. 5 shows the image obtained with the ExR algorithm; except for a few small leaf regions whose color is essentially the same as the background, most leaves differ clearly in color from the background, and this color difference favors separating target from background. Fig. 6 is the binary image obtained by applying Otsu adaptive thresholding to the ExR image; parts of the soil and plastic film in the background appear white like the foreground, indicating poor separation of foreground and background. Figs. 7 and 8 show the images segmented by the ExG-ExR algorithm and by ExG-ExR combined with Otsu thresholding, respectively; their results are largely consistent with those of ExG and of ExG combined with Otsu thresholding.

Fig. 3 Image segmented by ExG method

Fig. 4 Image segmented by ExG and Otsu methods

Fig. 5 Image segmented by ExR method

Fig. 6 Image segmented by ExR and Otsu methods

Fig. 7 Image segmented by ExG-ExR method

Fig. 8 Image segmented by ExG-ExR and Otsu methods
2.2 K-means clustering segmentation based on the H and S color components
2.2.1 K-means clustering analysis of the H and S color components

After the H and S color components were obtained (Figs. 9-10), K-means clustering analysis was applied to each component, and the cluster images suitable for segmentation were selected for further processing. The K-means clustering result for the H component is shown in Fig. 11, and that for the S component in Fig. 12. Fig. 11 accurately shows the shape of the foreground target; however, in the plastic-film region of the background (the middle part of Fig. 1), differences in how closely the film contacts the soil and in film thickness cause the dew condensed under the film to appear as interwoven white patches and spots, the areas in close contact with the soil to appear close to the color of dry soil, and the film edges to appear close to the color of bare soil. After clustering, this produces excessive noise along parts of the leaf margins of the foreground and adhesion across small gaps between leaves, which makes segmentation more difficult. In contrast, K-means clustering of the S component represented the film-mulched background region almost exactly (the black area in the middle of Fig. 12): the film region appears as clean black with very little noise, and, more importantly, compared with the maize leaves in the film region of Fig. 11, Fig. 12 accurately shows the leaf shapes against the complex film background.
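A sketch of this clustering step in Matlab is shown below, continuing from the HSI sketch above and using the kmeans function of the Statistics and Machine Learning Toolbox. Which of the three clusters corresponds to the plant (for H) or to the plastic film (for S) is chosen by inspecting the cluster images, so the cluster indices below are only illustrative.

% K-means clustering (k = 3) of the H and S color components.
k = 3;
[rows, cols] = size(H);

idxH = kmeans(H(:), k, 'Replicates', 3);   % cluster labels for hue
idxS = kmeans(S(:), k, 'Replicates', 3);   % cluster labels for saturation

labelH = reshape(idxH, rows, cols);
labelS = reshape(idxS, rows, cols);

% Binary cluster images; the chosen cluster numbers are placeholders
% selected after visual inspection, as in Figs. 11-12.
bwH = (labelH == 1);   % cluster assumed to contain the maize plants
bwS = (labelS == 2);   % cluster assumed to contain the plastic film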

Fig. 9 Image of hue

Fig. 10 Image of saturation

Fig. 11 Clustering analysis for image of hue

Fig. 12 Clustering analysis for image of saturation
2.2.2 Image segmentation using the "bwareaopen" function and corresponding mathematical operations

Based on the above analysis, the "bwareaopen" command in Matlab was used to remove the noise in Fig. 11. After removal (Fig. 13), the noise was cleared thoroughly, but some gaps between the leaves of the foreground target remained bridged and excessive white material adhered tightly to parts of the leaf margins. To remove this marginal material and restore the leaf gaps, a subtraction was performed between Fig. 12 and Fig. 13, and the result was then subtracted from Fig. 13 again; after these subtraction operations and denoising, the adhesions between leaves and the material on the leaf margins disappeared, revealing a clear outline of the maize canopy (Fig. 14).
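These clean-up and difference operations can be sketched in Matlab as follows, continuing from the clustering sketch above; the structuring element, the minimum object size, and the direction of the subtractions are assumptions made for illustration, since the paper does not report these values.

% Morphological opening, small-object removal, and difference operations.
bwH = imopen(bwH, strel('disk', 2));    % break thin adhesions, remove burrs (assumed radius)
bwH = bwareaopen(bwH, 50);              % remove small noise objects (assumed size threshold)

d1    = double(bwS) - double(bwH);      % difference between Figs. 12 and 13 (assumed direction)
maize = double(bwH) - max(d1, 0);       % subtract the result again, cf. Fig. 14
maize = bwareaopen(maize > 0, 50);      % final denoising of the canopy mask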

Fig. 13 Image treated by the bwareaopen program

Fig. 14 Segmented image
2.3 Experimental results and analysis

To verify the accuracy of the algorithm, the segmentation procedure described above was applied to 20 collected plastic-film maize images (each representing an actual ground area of 1 m²), and the segmentation error rate and the root mean square error (RMSE) of maize canopy cover were calculated with equations (7) [27] and (8), respectively.

$ E = \frac{\bar{C}_1-\bar{C}_2}{\bar{C}_1} \times 100\%, $ (7)
$ {\rm RMSE} = \sqrt{\frac{\sum\limits_{i=1}^{n}\left(C_{1i}-C_{2i}\right)^2}{n}}. $ (8)

where E is the error rate; C1 is the maize canopy cover measured with the SamplePoint software (which identifies soil, plants, rocks, and other objects interactively) [28]; C2 is the maize canopy cover calculated from the images segmented by K-means clustering of the H and S color components combined with the color-difference operation; and n is the number of maize canopy images.
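These statistics reduce to a few lines of Matlab. In the sketch below, segmentCanopy is a hypothetical wrapper around the steps described above, imgFiles and the SamplePoint file name are placeholders, and n = 20 images are assumed as in this study.

% Canopy cover from the binary masks and accuracy statistics, Eqs. (7)-(8).
% imgFiles: cell array with the 20 image file names (placeholder).
n  = 20;
c2 = zeros(n, 1);                           % cover from the proposed segmentation
for i = 1:n
    maize = segmentCanopy(imgFiles{i});     % hypothetical wrapper for the steps above
    c2(i) = sum(maize(:)) / numel(maize);   % fraction of plant pixels = canopy cover
end

c1 = readmatrix('samplepoint_cover.csv');   % SamplePoint reference cover (placeholder file)

E    = (mean(c1) - mean(c2)) / mean(c1) * 100;  % error rate, Eq. (7), in percent
RMSE = sqrt(mean((c1 - c2).^2));                % root mean square error, Eq. (8)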

The calculations showed that the RMSE was small, only 0.004 2, and that the error rate was as low as 3.37%, indicating that the maize canopy cover calculated from the images segmented by the proposed algorithm was very close to the SamplePoint measurements and that the segmentation results are reliable.

3 Discussion and conclusions

Accurate segmentation of maize canopy images is important for studying maize physiology and ecology. To segment maize canopy images under plastic-film mulching accurately, this paper proposed an algorithm based on K-means clustering of the H and S color components: cluster images of the H and S components were obtained by clustering analysis, and, based on the differences between foreground and background reflected in the two cluster images, the maize canopy image was segmented through a color-difference operation. To demonstrate the effectiveness of this method for maize canopy images under film mulching, the plastic-film maize images were also segmented with the ExG, ExR, and ExG-ExR algorithms combined with Otsu thresholding, and the results were compared with those of the color-difference method based on clustering of the H and S components. The comparison showed that the color-difference segmentation based on K-means clustering of the H and S components outperformed the other algorithms. Statistical analysis of its results gave a segmentation error rate as low as 3.37% for plastic-film maize, indicating high segmentation accuracy. In summary, K-means clustering of the H and S color components combined with the corresponding color-difference operation is suitable for segmenting plastic-film maize images under weak light, and the results are reliable. The method is also of reference value for calculating the canopy cover of other short-stature crops grown under plastic film.

This study was carried out under weak light (e.g., on overcast days or in the early morning or evening), when image illumination was relatively uniform. Clustering of the S component compensated for the inability of H-component clustering to identify the white film-mulched regions accurately, while clustering of the H component compensated for the inability of S-component clustering to identify the green plant parts accurately; combining the two through the color-difference operation achieved accurate segmentation of plastic-film maize. Under strong light, however, some film-mulched areas produce shadows or specular reflections, leaf surfaces exposed to strong light reflect it, and the soil shows alternating bright and dark patches; all of these disturbances increase the difficulty and uncertainty of segmentation. Therefore, the segmentation method proposed here, based on K-means clustering of the H and S color components combined with the color-difference operation, is limited to image segmentation under weak light, such as on overcast days or in the early morning or evening.

References
[1] GUEVARA-ESCOBAR A, TELLEZ J, GONZALEZ-SOSA E. Use of digital photography for analysis of canopy closure. Agroforestry Systems, 2005, 65(3): 175-185. DOI:10.1007/s10457-005-0504-y
[2] ADAMS J E, ARKIN G F. A light interception method for measuring row crop ground cover. Soil Science Society of America Journal, 1977, 41(4): 789-792. DOI:10.2136/sssaj1977.03615995004100040037x
[3] ZHANG X Y, GUO J M, HAN Y J, et al. LAI model of spring wheat in Ningxia irrigated area based on MODIS-VI. Chinese Journal of Agrometeorology, 2011, 32(2): 279-282. (in Chinese with English abstract)
[4] QU Y, LIU S H, XIE Y. Computer simulation model of fractional vegetation cover and its parameters sensitivity. Acta Agronomica Sinica, 2008, 34(11): 1964-1969. (in Chinese with English abstract)
[5] ARMBRUST D V. Rapid measurement of crop canopy cover. Agronomy Journal, 1990, 82(6): 1170-1171. DOI:10.2134/agronj1990.00021962008200060030x
[6] EWING R P, HORTON R. Quantitative color image analysis of agronomic images. Agronomy Journal, 1999, 91(1): 148-153. DOI:10.2134/agronj1999.00021962009100010023x
[7] LI C J, WANG J H, LIU L Y, et al. Automated digital image analyses for estimating percent ground cover of winter wheat based on object features. Journal of Zhejiang University (Agriculture and Life Sciences), 2004, 30(6): 650-656. (in Chinese with English abstract)
[8] HUANG F, YU Q, YAO X, et al. K-means clustering segmentation for H weight of wheat canopy image. Computer Engineering and Applications, 2014, 50(3): 129-134. (in Chinese with English abstract)
[9] LI X B, WANG Y S, FU L H. Monitoring lettuce growth using K-means color image segmentation and principal component analysis method. Transactions of the Chinese Society of Agricultural Engineering, 2016, 32(12): 179-186. (in Chinese with English abstract) DOI:10.11975/j.issn.1002-6819.2016.12.026
[10] ZHANG H C, HOU D W. An image segmentation algorithm with complex background based on HIS color space. Journal of Shandong Normal University (Natural Science), 2015, 30(3): 49-52. (in Chinese with English abstract)
[11] TONG F, WU P, HAN D, et al. Study on the image segmentation of forage using color features. Journal of Agricultural Mechanization Research, 2014(5): 43-47. (in Chinese with English abstract)
[12] CHEN Y, LIU X Y, JIANG Z, et al. Capsule Yinang defect recognition based on RGB and HSV color space. Computer Engineering and Design, 2014, 35(11): 3888-3892. (in Chinese with English abstract) DOI:10.3969/j.issn.1000-7024.2014.11.035
[13] MORA M, AVILA F, CARRASCO-BENAVIDES M, et al. Automated computation of leaf area index from fruit trees using improved image processing algorithms applied to canopy cover digital photograpies. Computers and Electronics in Agriculture, 2016, 123: 195-202. DOI:10.1016/j.compag.2016.02.011
[14] ZHANG W, HUANG S, WANG J J, et al. A segmentation method for wheat leaf images with disease in complex background. Computer Engineering & Science, 2015, 37(7): 1349-1354. (in Chinese with English abstract)
[15] CHITADE A Z, KATIYAR D S K. Colour based image segmentation using K-means clustering. International Journal of Engineering Science and Technology, 2010, 2(10): 5319-5325.
[16] BARBEDO J G A. A novel algorithm for semi-automatic segmentation of plant leaf disease symptoms using digital image processing. Tropical Plant Pathology, 2016, 41(4): 210-224. DOI:10.1007/s40858-016-0090-8
[17] LU H, CAO Z G, XIAO Y, et al. Region-based colour modelling for joint crop and maize tassel segmentation. Biosystems Engineering, 2016, 147: 139-150. DOI:10.1016/j.biosystemseng.2016.04.007
[18] PREETI, AHUJA K. Colour image segmentation using K-means, fuzzy C-means and density based clustering. International Journal for Research in Applied Science and Engineering Technology (Ijraset), 2014, 2(Ⅵ): 31-35.
[19] LI X Q. Research on image threshold segmentation algorithm based on Matlab. Software Guide, 2014, 13(12): 76-78. (in Chinese with English abstract) DOI:10.11907/rjdk.143592
[20] LIU Y, HAN J, LIU D D, et al. Effect of plastic film mulching on the grain filling and hormonal changes of maize under different irrigation conditions. PLoS One, 2015, 10(4): 1-17.
[21] ZHANG X M, HUANG G B, LI L L, et al. Effects of mulching patterns on spatio-temporal variation of soil nitrate and nitrogen utilization efficiency of maize on dry land. Agricultural Research in the Arid Areas, 2011, 29(5): 26-32. (in Chinese with English abstract)
[22] SELIM S Z, ISMAIL M A. K-means-type algorithms: A generalized convergence theorem and characterization of local optimality. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1984, 6(1): 81-87.
[23] WOEBBECKE D M, MEYER G E, VON BARGEN K, et al. Color indices for weed identification under various soil, residue, and lighting conditions. Transactions of the ASAE, 1995, 38(1): 259-269. DOI:10.13031/2013.27838
[24] PÉREZ A J, LÓPEZ F, BENLLOCH J V, et al. Colour and shape analysis techniques for weed detection in cereal fields. Computers and Electronics in Agriculture, 2000, 25(3): 197-212. DOI:10.1016/S0168-1699(99)00068-X
[25] MEYER G E, NETO J C. Verification of color vegetation indices for automated crop imaging applications. Computers and Electronics in Agriculture, 2008, 63(2): 282-293. DOI:10.1016/j.compag.2008.03.009
[26] OTSU N. A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man, and Cybernetics, 1979, 9(1): 62-66. DOI:10.1109/TSMC.1979.4310076
[27] ABEDINPOUR M, SARANGI A, RAJPUT T B S, et al. Performance evaluation of AquaCrop model for maize crop in a semi-arid environment. Agricultural Water Management, 2012, 110(3): 55-66.
[28] BOOTH D T, COX S E, BERRYMAN R D. Point sampling digital imagery with "Samplepoint". Environmental Monitoring and Assessment, 2006, 123(1/2/3): 97-108.