ZHANG Jian, XIE Tianjin, YANG Wanneng, ZHOU Guangsheng
Abstract: Plant height is a key indicator for dynamically assessing crop health and overall growth status, and is widely used to estimate the biological yield and final grain yield of crops. Traditional manual measurement is small in scale, inefficient, and time-consuming. Over the past decade, near-field remote sensing has developed rapidly in agriculture, making high-accuracy, high-frequency, and high-efficiency acquisition of crop plant height possible. This paper first reviews the publication trends of studies, both in China and abroad, on acquiring plant height by remote sensing. It then introduces and evaluates the basic principles, advantages, and limitations of the platforms and sensors used for height acquisition, focusing on the measurement workflows and key techniques of two sensors: LiDAR and visible-light cameras. On this basis, it summarizes research progress in applying plant height to crop biomass estimation, lodging monitoring, yield prediction, and breeding support. Finally, it discusses the remaining problems of near-field remote sensing for plant height acquisition and offers prospects in four directions: measurement platforms and sensors, bare-soil detection and interpolation algorithms, plant height applications, and the discrepancy between agronomic and remote-sensing height measurement. This review can serve as a reference for future research on and application of near-field remote-sensing height measurement.
Keywords: plant height; near-field remote sensing; crop; unmanned aerial vehicle; visible-light camera; LiDAR
CLC number: S127; TP79    Document code: A    Article ID: 202102-SA033
Citation: ZHANG Jian, XIE Tianjin, YANG Wanneng, ZHOU Guangsheng. Research status and prospect on height estimation of field crop using near-field remote sensing technology[J]. Smart Agriculture, 2021, 3(1): 1-15.
1  Introduction
Plant height is an important growth indicator of crops, and a reasonable plant height is fundamental to stable, high yields. During the first Green Revolution, wheat and rice varieties carrying semi-dwarf genes lowered plant height while improving lodging resistance and yield potential [1,2]. However, excessively short plants also carry a risk of yield loss [3]; studying the genetic mechanisms of plant height and designing targeted breeding programs is therefore of practical importance for increasing yields. In addition, plant height is closely related to crop biomass and yield [4,5]. Monitoring changes in crop height to assess health and growth status provides an important reference for managing and regulating production practices such as fertilization, weeding, and harvesting.
In agronomy, plant height is generally measured manually with a ruler, in one of three ways: natural plant height, physiological plant height, and ligule height [6]. The most common is natural plant height, the vertical distance from the ground to the top of the main stem in its natural state. Because of environment, genotype, or management, plants can differ considerably in growth form. For erect-type crops such as wheat, the upper leaves may droop [7]; in that case the leaves are straightened out, or erect-growing plants are selected, and the vertical distance from base to top is measured as the physiological plant height. Ligule height, the distance from the ground to the uppermost ligule, is also widely used. Manual measurement, however, requires extensive field work, is inefficient, and its accuracy is subject to operator bias, and height data obtained by sampling cannot represent the whole field. The development of remote sensing offers a new solution for crop height measurement research and application. This paper comprehensively reviews research on remote-sensing-based crop height extraction: starting from sensor types and platforms, it summarizes height acquisition methods and their shortcomings, surveys applications of plant height in phenotyping, lodging monitoring, yield estimation, and breeding, and discusses the trends and future challenges of near-field remote sensing for crop height acquisition.
2  Global research trends in remote-sensing height measurement
To understand progress in this field and identify the mainstream sensors, platforms, and target crops, we searched the Web of Science and ScienceDirect platforms for papers on remote-sensing height measurement published worldwide over the last decade (2010–2019), using the title keyword rule: "canopy or crop or plant or vegetation or wheat or maize or corn or rice or barley or soybean or sorghum or rapeseed" and "height or lodging or biomass or yield or lai". The results are shown in Fig. 1. Fig. 1(a) counts papers using the four main methods, namely unmanned aerial vehicle (UAV) platforms carrying visible-light cameras, LiDAR (Light Detection and Ranging), synthetic aperture radar, and ultrasonic sensors, plus other methods; the overall trend is increasing. Thanks to rapid advances in low-altitude remote sensing and computer vision, UAVs carrying visible-light cameras have become the most common means of acquiring crop height, followed by LiDAR. Fig. 1(b) shows that most height-measurement papers target staple crops such as wheat, maize, and rice. Moreover, crop height measurement relies mainly on near-field platforms such as ground and UAV platforms (Fig. 1(c)), since near-field observation is better suited to short, densely planted crops.
3  Progress in near-field remote-sensing height measurement
By sensor operating mode, near-field remote-sensing height measurement can be divided into active and passive remote sensing. Active sensors carry a radiation source that emits signals such as electromagnetic or acoustic waves, and simultaneously receive and record the signal reflected by the target; they are therefore little affected by illumination and can operate day and night. Passive sensors detect targets by directly receiving and recording electromagnetic waves reflected from a natural radiation source or emitted by the target itself; such sensors are usually cheaper, but they are more susceptible to illumination conditions and cannot penetrate the canopy.
3.1 Active remote-sensing height measurement and its characteristics
LiDAR records the time of flight (ToF) of a laser pulse to locate precisely the spot where the beam strikes an object. Owing to its strong penetration, the multiple return echoes of a pulse can usually record point clouds of both the canopy and the soil simultaneously; after classification and filtering, object height can be derived. In the 1990s, LiDAR systems mounted on helicopters or fixed-wing aircraft developed rapidly and were widely applied in precision forestry to measure canopy height, mean stand height, above-ground biomass, and so on [8,9]. However, airborne LiDAR typically flies at altitudes of hundreds of meters to kilometers, and its limited ranging accuracy makes it hard to separate soil from short crops [10,11]. Data acquisition cost and processing complexity also hamper multi-temporal crop height monitoring with LiDAR [12]. Terrestrial LiDAR systems can acquire three-dimensional point clouds at millimeter accuracy and are better suited to crop height extraction; the ground platforms include fixed stations [13-15], robots [16], and vehicles [17,18]. High-accuracy height estimates have been obtained for maize [19], wheat [20,21], rice [22,23], sorghum [24], and other crops. Fixed ground platforms, however, suffer from low acquisition efficiency and severe occlusion, while mobile ground platforms are constrained by row spacing, plant spacing, and crop height, and driving in the field can compact the soil [25]. As LiDAR becomes lighter and smaller, mounting it on UAVs has become feasible, raising observation frequency and efficiency while reducing disturbance to the crop, and filling the gap between terrestrial LiDAR (accurate but inefficient) and manned airborne LiDAR (large coverage but less detail) [26].
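The classify-and-filter step described above can be sketched in a few lines. The example below is illustrative, with synthetic data rather than code from any cited system: it separates ground and canopy returns by z-percentiles, which damps the effect of stray points compared with a plain max-minus-min.

```python
import numpy as np

def plant_height_from_cloud(points, ground_pct=2, canopy_pct=98):
    """Estimate plant height from an (N, 3) point cloud.

    The ground elevation is taken as a low z-percentile and the canopy
    top as a high percentile, so isolated noise points do not dominate.
    """
    z = np.asarray(points, float)[:, 2]
    ground = np.percentile(z, ground_pct)
    canopy = np.percentile(z, canopy_pct)
    return canopy - ground

# Synthetic cloud: ground returns near z = 0 m, canopy returns near z = 0.8 m.
rng = np.random.default_rng(0)
ground_pts = np.column_stack([rng.random((200, 2)), rng.normal(0.0, 0.01, 200)])
canopy_pts = np.column_stack([rng.random((300, 2)), rng.normal(0.8, 0.03, 300)])
cloud = np.vstack([ground_pts, canopy_pts])
h = plant_height_from_cloud(cloud)
```

In practice the percentile thresholds would be tuned per crop and sensor, and real pipelines typically run a dedicated ground-filtering algorithm before this step.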
Ultrasonic sensors emit pulses at frequencies above 20 kHz and compute the sensor-to-object distance from the time difference between emission and return. They are easy to deploy, simple to process, and inexpensive, and with fine tuning can reach centimeter-level height accuracy [24,27,28]. As a proximal sensing technique, however, the ultrasonic signal attenuates quickly, and accuracy degrades as the sensor-to-target distance grows. Measurement distances are therefore mostly kept within 10 m [29], and the sensors are usually mounted on mobile ground platforms [30,31].
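The time-of-flight principle behind ultrasonic ranging reduces to a one-line formula. The sketch below uses synthetic numbers and a standard temperature correction for the speed of sound; the mounting height and echo time are hypothetical.

```python
def ultrasonic_distance(echo_time_s, temperature_c=20.0):
    """Sensor-to-target distance from an ultrasonic echo (time of flight).

    The speed of sound in air varies with temperature (about
    331.3 + 0.606*T m/s); the pulse travels out and back, hence the /2.
    """
    speed = 331.3 + 0.606 * temperature_c
    return speed * echo_time_s / 2.0

# Sensor mounted 2.0 m above the ground; canopy echo arrives after ~7 ms.
sensor_height = 2.0
d = ultrasonic_distance(0.007)      # distance from sensor down to canopy top
plant_height = sensor_height - d    # canopy height above the ground
```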
3.2 Passive remote-sensing height measurement and its characteristics
Microsoft's Kinect v2 captures 512 × 424 px depth images at 30 f/s, with a depth range of 0.5–4.5 m [32,33]. Its imaging speed is fast enough for near-real-time measurement. However, depth cameras have a limited measurement range and low image resolution, so they are usually mounted on ground platforms [34,35] and used mainly for monitoring potted crops in greenhouses [36].
Compared with depth cameras, visible-light cameras capture higher-resolution images [37]. Some studies place a reference object of known height beside the crop and compute plant height from a single visible-light image containing both [38,39]. With advances in computer vision, methods have emerged that generate multi-view images by deploying several cameras on a ground platform or by moving a single camera, and measure height via binocular stereo matching [40] or multi-view stereo [41,42]; these are inefficient and limited in coverage. In the past decade, UAVs carrying high-resolution visible-light cameras have become the most widely used near-field method for acquiring crop height, owing to their low cost, high resolution, and easy deployment [43-46]. Overlapping images are collected with the camera, features are detected and matched across the overlapping images by Structure from Motion (SfM) [47], a sparse point cloud is built from the matches and densified by multi-view stereo, and finally point-cloud interpolation yields rasters of crop or soil elevation. The technique has been applied to maize [48-51], rice [52], sorghum [53,54], wheat [5,46,55], soybean [56], cotton [57,58], and other crops; compared with ground measurements, the coefficient of determination R2 can be kept above 0.80 for most crops, with a root mean square error (RMSE) below 10 cm.
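The final step, deriving crop height from the interpolated elevation rasters, is usually a per-pixel subtraction of the terrain model from the surface model (a canopy height model, CHM). A minimal sketch with synthetic co-registered rasters:

```python
import numpy as np

def canopy_height_model(dsm, dtm):
    """Crop height as the per-pixel difference DSM - DTM.

    Both rasters are assumed co-registered on the same grid; small
    negative values (matching noise) are clipped to zero.
    """
    chm = np.asarray(dsm, float) - np.asarray(dtm, float)
    return np.clip(chm, 0.0, None)

dtm = np.full((4, 4), 100.0)                       # flat terrain at 100 m
dsm = dtm + np.array([[0.8] * 4] * 2 + [[0.0] * 4] * 2)  # crop rows over bare soil
chm = canopy_height_model(dsm, dtm)
plot_height = chm.max()   # one common per-plot statistic; percentiles also used
```

Which per-plot statistic to report (maximum, mean, or a high percentile of the CHM) varies between studies and affects the comparison against ruler measurements.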
In short, LiDAR, ultrasonic sensors, depth cameras, and visible-light cameras can all achieve good field-crop height estimates from ground platforms, but the travel efficiency and flexibility of ground platforms limit their scope of application. UAV platforms can largely make up for these shortcomings. UAVs with visible-light cameras are already widely used in crop height research, and as LiDAR continues to shrink in size and weight, the UAV-borne LiDAR mode has become feasible and is gradually being applied to crop height measurement as well.
3.3 Workflow and key techniques of near-field remote-sensing height measurement
Taking the two mainstream methods, terrestrial LiDAR and UAV with visible-light camera, as examples, this section introduces the main workflow of plant height acquisition (Fig. 2) and the key techniques involved.
3.3.1 Terrestrial LiDAR height measurement
To mitigate occlusion, terrestrial platforms usually set up multiple scan stations at different heights and angles. Before height extraction, the multi-station data must therefore be registered, using the Iterative Closest Point (ICP) algorithm [59,60] or external reference objects such as target spheres [13,14]. After preprocessing steps such as registration and denoising, accurately extracting the crop top and the soil region is one of the keys to precise height measurement [61]. Cheng et al. [62] fitted polynomial curves to peanut canopy profiles from laser point clouds and found that a 5th-order curve fit best; the maxima and minima of the fitted curve delimit the canopy profile and yield peanut plant height, as shown in Fig. 3. Polynomial fitting suits uniform canopies with rounded leaves; for uneven, spiky canopies it tends to underestimate height and to require excessively high fitting orders. Su et al. [19] separated individual maize plants from the group point cloud and traversed each plant's points for their spatial coordinates and spacing; plant height is then the Euclidean distance between the points with the maximum and minimum height coordinates. In addition, acquiring the soil-baseline point cloud right after sowing effectively reduces the influence of plant occlusion on soil point extraction. Mounting a visible-light camera on the LiDAR system records the color and texture of the target along with the coordinates of each point; the resulting colorized point cloud helps classify crop versus soil and enables precise extraction of soil and canopy points.
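The polynomial profile-fitting idea can be illustrated compactly. The transect below is synthetic, but the procedure (5th-order least-squares fit along a transect, then canopy height from the fitted curve's maximum over the soil baseline) mirrors the approach described.

```python
import numpy as np

# Synthetic canopy transect: x is horizontal position (m), z is point
# elevation (m) over a soil baseline, with small measurement noise.
x = np.linspace(0.0, 1.0, 80)
soil = 0.02
z = soil + 0.5 * np.sin(np.pi * x) ** 2 \
    + np.random.default_rng(1).normal(0.0, 0.01, x.size)

# Fit a 5th-order polynomial to the profile and evaluate it densely;
# the curve smooths over individual noisy returns.
coeffs = np.polyfit(x, z, deg=5)
z_fit = np.polyval(coeffs, np.linspace(0.0, 1.0, 400))
canopy_height = z_fit.max() - soil   # smoothed canopy height over the soil
```

As the text notes, such a fit behaves well for smooth, rounded canopies but underestimates sharp canopy peaks unless the order is raised.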
3.3.2 UAV with visible-light camera
Passive remote-sensing sensors cannot penetrate the canopy to reach the soil base. Early in the season, before canopy closure, or when planting gaps are inherently large, soil regions can be extracted and a complete, accurate digital terrain model (DTM) obtained by kriging [63,64], inverse distance weighting [53,65], natural neighbor interpolation [45], and similar methods (Fig. 4). This approach acquires the DTM and the crop surface elevation (digital surface model, DSM) simultaneously, reducing cost while avoiding soil elevation changes caused by human activity or bad weather. In agricultural production, however, cash crops are often sown densely so that the canopy closes quickly and suppresses weeds [66,67], which makes deriving the DTM from the DSM harder. For densely planted, uniform-canopy, or ridge-grown crops with strongly undulating bases, such as sorghum, rapeseed, potato, and sugar beet, most experiments therefore fly one extra mission before emergence or after harvest to acquire the DTM [5,18,68,69].
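Of the interpolation methods listed, inverse distance weighting is the simplest to sketch. The example below fills one DTM query point from four hypothetical bare-soil elevation samples; the coordinates and elevations are made up for illustration.

```python
import numpy as np

def idw_interpolate(known_xy, known_z, query_xy, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of sparse bare-soil elevations.

    known_xy: (N, 2) sample positions; known_z: (N,) elevations;
    query_xy: (M, 2) positions to fill. Nearer samples get larger weights.
    """
    known_xy = np.asarray(known_xy, float)
    query_xy = np.asarray(query_xy, float)
    known_z = np.asarray(known_z, float)
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)          # eps guards queries on a sample point
    return (w * known_z).sum(axis=1) / w.sum(axis=1)

# Four bare-soil samples at the plot corners; query the plot centre.
soil_xy = [(0, 0), (0, 1), (1, 0), (1, 1)]
soil_z = [100.0, 100.2, 100.1, 100.3]
centre_z = idw_interpolate(soil_xy, soil_z, [(0.5, 0.5)])[0]
```

With all four samples equidistant from the centre, the result is simply their mean; in real fields the sample density and the `power` parameter control how faithfully terrain undulation is reproduced.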
SfM-based height extraction requires accurate detection and matching of a large number of image features to achieve a high-quality canopy reconstruction [70]. The Scale-Invariant Feature Transform (SIFT) algorithm is typically used to find corresponding points. Compared with buildings or woodland, however, field crops exhibit heavy self-occlusion and uniform texture, making precise matching of leaf targets across multiple views difficult; this increases feature-matching error and loses fine shape and texture detail. For fields with high canopy closure, setting higher image overlap and a higher flight altitude are common remedies. Hasheminasab et al. [71] used a high-accuracy GPS/IMU to shrink the feature-matching search space, replacing traditional exhaustive search and alleviating the matching ambiguity caused by repetitive texture. In addition, downwash generated at low flight altitudes, or windy conditions, moves the canopy, shifting the positions of leaves and ears between images and further degrading feature matching [72].
4  Agricultural applications of near-field remote-sensing height measurement
Because plant height can be measured directly, non-destructively, and accurately by remote sensing, it is often used as a model variable in retrieving physiological and biochemical indicators, identifying lodging, predicting yield, and supporting breeding (Table 1).
(1) Biomass estimation. Current studies predict above-ground biomass from plant height, alone or combined with phenotypic parameters such as spectral indices and canopy cover, using linear regression [87,88], exponential regression [69,73,89], partial least squares regression [90], random forests [76,91], support vector machines [92], and other modeling methods. Accumulating plant height over a given area to build a crop volume model also yields accurate biomass predictions [93,94]. Compared with spectral indicators, morphological indicators are less affected by illumination, and spectral indices saturate late in the growing season [95]; biomass derived from plant height is therefore more accurate and stable.
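As a minimal illustration of the height-based regression approach, a linear biomass model can be fitted by least squares. The plot heights and biomass values below are made up for illustration, not taken from any cited study.

```python
import numpy as np

# Hypothetical calibration data: per-plot mean height (m) vs. measured
# above-ground biomass (t/ha).
height = np.array([0.3, 0.5, 0.7, 0.9, 1.1])
biomass = np.array([1.1, 2.0, 2.8, 3.9, 4.8])

# Ordinary least-squares line: biomass = slope * height + intercept.
slope, intercept = np.polyfit(height, biomass, 1)

# Predict biomass for a new plot with 0.8 m mean height.
predicted = slope * 0.8 + intercept
```

Exponential or machine-learning models replace the `polyfit` line when the height-biomass relationship is nonlinear, as several of the cited studies report.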
(2) Lodging monitoring. Lodging is the permanent displacement of the upright parts of a crop [96], and lodging resistance is an important genetic trait and selection criterion in breeding [97]. Lodged area and severity are usually determined from spectral features, texture information, or the height change before and after lodging [98,99]. Singh et al. [96] subtracted the post-lodging DSM from a pre-lodging DSM of wheat to obtain a differential DSM and extracted the mean elevation of each plot from it; correlations with manually scored lodging incidence, severity, and index ranged from 0.77 to 0.93. Su et al. [97] extracted texture features from visible-light images of maize before and after lodging using gray-level co-occurrence matrices, and also derived the lodged area by DSM differencing; the estimation errors were 10.00% and 0.85%, respectively, showing that plant height measures lodging severity and area more accurately than texture indicators.
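The DSM-differencing idea used by these studies can be sketched with synthetic rasters: a pixel counts as lodged when its canopy elevation drops by more than a chosen threshold between two flights. The threshold and field layout below are hypothetical.

```python
import numpy as np

def lodged_fraction(dsm_before, dsm_after, drop_threshold=0.3):
    """Flag lodged pixels from the canopy elevation drop between two dates.

    drop_threshold is the height loss (m) above which a pixel is counted
    as lodged; returns the lodged fraction and the differential DSM.
    """
    diff = np.asarray(dsm_before, float) - np.asarray(dsm_after, float)
    lodged = diff > drop_threshold
    return lodged.mean(), diff

before = np.full((10, 10), 101.0)   # canopy ~1 m above 100 m terrain
after = before.copy()
after[:, :4] -= 0.6                 # 40 % of the field drops by 0.6 m
frac, diff = lodged_fraction(before, after)
```

Per-plot means of `diff`, rather than a per-pixel threshold, give severity scores comparable to the manual ratings used by Singh et al. [96].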
(3) Yield prediction. Plant height is also one of the important predictors of yield. Li et al. [100] extracted plant height and multiple vegetation indices from UAV visible-light and multispectral data to predict wheat yield, and found that in both LASSO and random forest models, plant height at the grain-filling stage ranked first in importance for yield estimation. For models built on plant height, estimation accuracy is generally considered to increase as harvest approaches [63,101].
(4) Breeding support. Plant height is a quantitative trait controlled by many genes [102] and is sensitive to environment, genotype, and their interaction; the dynamics of height help elucidate the genetic mechanisms of crop growth [85,103,104]. Hassan et al. [85] performed genome-wide and quantitative trait locus (QTL) analysis on the height trait and found that genomic values predicted from UAV-estimated wheat height correlated with actual values at 0.47–0.53, showing genomic prediction ability similar to that of ground measurements. A growing number of studies acquire plant height by remote sensing for breeding [105,106]; remote sensing has been extensively demonstrated to deliver frequent, accurate, repeatable, continuous canopy height data, which is of real significance to crop breeding [97].
In summary, in agricultural applications of near-field height measurement, crop growth parameters are mostly estimated with empirical statistical regression (Table 1), which has a low technical threshold, few parameters to invert, and simple, effective procedures [107]. Such models, however, need large amounts of field data for inversion and lack clear physical meaning. Some applications build machine learning models on plant height and other traits to improve growth retrieval, but in practice separate models are usually needed to accommodate differences in variety, growth stage, and other factors. Yu et al. [82] coupled the sugarcane height trait with a field hydrological model to build a new data assimilation system, helping to improve yield estimation for gramineous crops. Assimilating height data into crop models can improve the retrieval accuracy of crop traits and extend models in time and space. Future research may therefore integrate plant height into crop models to improve the retrieval of growth parameters, addressing the weak generality and poor stability of empirical and conventional machine learning models.
5  Problems and prospects
5.1 Balancing measurement accuracy and cost
UAVs carrying visible-light cameras can improve height accuracy with various auxiliary spatial data: DTM, DSM, ground control points (GCPs), field-measured heights, and so on. Different fields, such as scientific research and agricultural production, trade these data off against their accuracy and cost requirements. Reference [108] systematically evaluated the accuracy and cost of UAV visible-light height measurement under various combinations of auxiliary spatial data. Building on that discussion, this paper takes the coefficient of determination R2 and RMSE as accuracy metrics, and labor, time, and operating cost as cost metrics, to jointly evaluate measurement results under different combinations of four data types (DTM, GCP, measured height, and canopy density), as shown in Table 2.
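The two accuracy metrics used throughout this kind of evaluation, R2 and RMSE, are computed as follows; the measured and estimated heights here are illustrative.

```python
import numpy as np

def r2_rmse(measured, estimated):
    """R^2 and RMSE between field-measured and remotely estimated heights."""
    m = np.asarray(measured, float)
    e = np.asarray(estimated, float)
    ss_res = ((m - e) ** 2).sum()            # residual sum of squares
    ss_tot = ((m - m.mean()) ** 2).sum()     # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(((m - e) ** 2).mean())
    return r2, rmse

measured = [0.52, 0.60, 0.75, 0.81, 0.95]    # ruler heights, m (illustrative)
estimated = [0.50, 0.63, 0.72, 0.85, 0.92]   # CHM-derived heights, m
r2, rmse = r2_rmse(measured, estimated)
```

R2 captures how much height variation the estimates explain, while RMSE reports the absolute error in the same units as height; both are needed, since a biased estimator can score a high R2 yet a large RMSE.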
Category 1 collects both DTM and GCP data; combined with the DSM, this achieves high-accuracy height measurement. Adding field measurements to build a linear regression model further reduces the absolute height error. Using complete auxiliary spatial data raises acquisition cost, but it is necessary for fine-grained height extraction. For example, gene mapping of height traits is of important theoretical and applied value in breeding [109], and breeders track height changes across the whole growing season; such cases require complete auxiliary spatial data to achieve high-accuracy estimates [110].
DTM data are generally collected before crop emergence or after harvest to avoid occlusion of the soil by the crop. When the canopy is sparse, however, bare-soil elevations can be extracted from the DSM as the baseline; this lowers acquisition cost, and with ample exposed soil it approaches the accuracy of the complete-data case (see Section 3.3.2). When the canopy has closed and the terrain undulates strongly, interpolating from bare patches can no longer build an accurate DTM, and a separate DTM acquisition flight becomes necessary.
During UAV image mosaicking, importing the same set of GCPs registers multiple images and improves image quality, enabling observation of height dynamics for specific plants or populations. However, GCP deployment is difficult in rugged, fragmented, or irrigated fields [75]. Measuring GCP positions also requires instruments such as real-time kinematic (RTK) receivers, whose signal strength is easily affected by the surroundings, such as high-voltage lines, transformers, or terrain. From the standpoint of agricultural production, GCP deployment and surveying are therefore laborious and costly. When field deployment is impractical, extracting GCPs directly from the imagery helps register multi-date data, although accuracy is usually lower than with complete data.
By evaluating from the dual angles of accuracy requirements and data acquisition cost, Table 2 offers a reference for designing crop height-measurement schemes in scientific research and in actual agricultural production, so that auxiliary spatial data can be chosen sensibly while meeting accuracy needs.
5.2 Fine-grained height measurement from UAV platforms
Low-altitude passive UAV remote sensing reconstructs 3D models from imagery to obtain field crop heights. For crops with erect, narrow leaves, such as maize, rice, and wheat, height information at the ears and leaf tips is particularly hard to recover, leading to underestimation of plant height. Liu et al. [111] built canopy point clouds from Mavic Pro 2 images at a flight altitude of 5 m for direct height measurement, yet still could not recover the complete ear structure.
Deriving crop elevation from visible-light images is an indirect method, whereas LiDAR measures directly from point clouds, and its accuracy is usually better [20,21]. Some lightweight LiDAR units mounted on UAVs have made initial explorations of crop height measurement (Table 3); all weigh under 4 kg, with ranging accuracy of 0.5–5 cm. As Table 3 shows, however, good crop height accuracy was achieved only at reduced measurement altitudes (below 20 m); the remaining entries in Table 3 have R2 below 0.8. Moreover, these LiDAR units remain expensive. These factors are the main constraints on the development of the UAV-borne LiDAR mode in agriculture. In October 2020, DJI released the Zenmuse L1, which integrates the low-cost, lightweight Livox AVIA LiDAR. Hu et al. [112] evaluated the same brand's MID-40 for forest inventory, obtaining point clouds denser than 464 pts/m2 at a 100 m flight altitude, sufficient to compute tree height, canopy cover, gap fraction, and other forest phenotypes accurately. The AVIA has a larger field of view (FOV) and a higher point rate than the MID-40, improving acquisition efficiency and point density, but no study has yet applied it to field crop phenotyping. At present, the high cost of LiDAR systems, and point density and ranging accuracy still inadequate for precise field phenotyping, remain urgent problems in remote-sensing height measurement.
5.3 Discrepancy between remote-sensing and agronomic height measurement
Crop morphology varies with cultivation practices, environment, and variety. Moreover, agronomic height measurement usually excludes parts such as the awns of cereals and the tendrils of legumes [115], whereas remote sensing generally measures the vertical distance from the top of the entire plant, in its natural field state, to the ground; the results therefore differ from agronomic measurements. For convenience, below we use "natural plant height" and "plant length" for the heights obtained by remote sensing and by agronomy, respectively (Fig. 5). For example, wheat plant architecture changes with cultivation and can be classified by flag-leaf form into "erect" and "droopy" types [116]; remote sensing tends to underestimate the true plant length of droopy types. The natural height obtained by remote sensing supports lodging-area identification and severity assessment, but cannot give the true plant length after lodging; likewise, remote sensing includes the awns in the measurement. In all these cases the true plant length cannot be obtained, which in turn affects applications such as yield estimation. Height-measurement schemes must therefore be tailored to the agricultural application. Where natural plant height and true plant length diverge, multi-sensor collaboration can be attempted for height extraction: for instance, a visible-light camera captures texture images to identify the main plant body, LiDAR then extracts the plant skeleton, and bent sections are measured piecewise. Multi-camera oblique photogrammetry captures rich surface texture [117,118], and building 3D crop models in this way also offers a route to measuring the plant length of leaning plants.
5.4 Prospects for future research
Over the past decade, near-field remote sensing has been widely applied to field crop height measurement, enabling synchronous monitoring of large areas and yielding accurate, repeatable height data. Given that it still faces multiple problems, future scientific research in this field may focus on the following four directions.
(1) As the main platform for crop height acquisition, UAVs need greater payload and endurance, while height-measurement sensors must become lighter, smaller, and cheaper, enabling efficient, large-area observation of crop height.
(2) Passive sensors cannot penetrate the crop canopy; a DTM must come either from a separate flight over bare ground or from interpolating soil pixels extracted from the DSM. The former raises acquisition cost, and the latter loses accuracy when little soil is exposed. Bare-soil detection and interpolation algorithms therefore need improvement, to support interpolation from small bare-soil samples and precise bare-soil detection in complex field environments, thereby raising acquisition efficiency and measurement accuracy.
(3) Plant height has broad agricultural uses. On one hand, it serves to estimate many crop growth parameters, but retrieval currently relies on empirical statistics and conventional machine learning; general-purpose retrieval models across crops, growth stages, and environments should be explored. On the other hand, tighter integration of remote sensing with genetics and breeding research can supply high-throughput height data for studying the genetic mechanisms of plant height, break the efficiency bottleneck in acquiring morphological trait data, advance field-crop breeding research, and ultimately raise grain yield and quality.
(4) Remote-sensing and agronomic height measurements differ to some extent; height-extraction methods should be designed around crop architecture and the scientific question at hand, to meet the needs of research and practical application.
References:
[1] PENG J, RICHARDS D E, HARTLEY N M, et al. 'Green revolution' genes encode mutant gibberellin response modulators[J]. Nature, 1999, 400: 256-261.
[2] SASAKI A, ASHIKARI M, UEGUCHI-TANAKA M, et al. Green revolution: A mutant gibberellin-synthesis gene in rice[J]. Nature, 2002, 416: 701-702.
[3] KUMAR K, NEELAM K, BHATIA D, et al. High resolution genetic mapping and identification of a candidate gene(s) for the purple sheath color and plant height in an interspecific F-2 population derived from Oryza nivara Sharma & Shastry × Oryza sativa L. cross[J]. Genetic Resources and Crop Evolution, 2020, 67(1): 97-105.
[4] LIU K, DONG X, QIU B, et al. Analysis of cotton height spatial variability based on UAV-LiDAR[J]. International Journal of Precision Agricultural Aviation, 2020, 3(3): 72-76.
[5] LIU Z, NIU Y, WANG Y, et al. Estimation of plant height of winter wheat based on UAV visible image[J]. Journal of Triticeae Crops, 2019, 39(7): 859-866. (in Chinese)
[6] HUANG R, LI G. Plant height consistencies in maize population and a comparison of their measuring techniques[J]. Maize Science, 1995, 3(2): 61-63. (in Chinese)
[7] ZHAO G. Discussion on investigating high standard of wheat plant[J]. Beijing Agricultural Sciences, 1996, 14(1): 18. (in Chinese)
[8] LIU J, ZHAO C, YANG G, et al. Review of field-based phenotyping by unmanned aerial vehicle remote sensing platform[J]. Transactions of the CSAE, 2016, 32(24): 98-106. (in Chinese)
[9] BEN HMIDA S, KALLEL A, GASTELLU-ETCHEGORRY J P, et al. Crop biophysical properties estimation based on LiDAR full-waveform inversion using the DART RTM[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2017, 10(11): 4853-4868.
[10] CHEN S, CHENG X. The principle and application of airborne LiDAR[J]. Engineering of Surveying and Mapping, 2007, 16(1): 27-31. (in Chinese)
[11] LI W, NIU Z, WANG C, et al. Combined use of airborne LiDAR and satellite GF-1 data to estimate leaf area index, height, and aboveground biomass of maize during peak growing season[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2015, 8(9): 4489-4501.
[12] EITEL J U, H?FLE B, VIERLING L A, et al. Beyond 3-D: The new spectrum of LiDAR applications for earth and ecological sciences[J]. Remote Sensing of Environment, 2016, 186: 372-392.
[13] FRIEDLI M, KIRCHGESSNER N, GRIEDER C, et al. Terrestrial 3D laser scanning to track the increase in canopy height of both monocot and dicot crop species under field conditions[J]. Plant Methods, 2016, 12: ID 9.
[14] CROMMELINCK S, HOEFLE B. Simulating an autonomously operating low-cost static terrestrial LiDAR for multitemporal maize crop height measurements[J]. Remote Sensing, 2016, 8(3): ID 205.
[15] EITEL J U H, MAGNEY T S, VIERLING L A, et al. An automated method to quantify crop height and calibrate satellite-derived biomass using hypertemporal LiDAR[J]. Remote Sensing of Environment, 2016, 187: 414-422.
[16] QIU Q, SUN N, BAI H, et al. Field-based high-throughput phenotyping for maize plant using 3D LiDAR point cloud generated with a "Phenomobile"[J]. Frontiers in Plant Science, 2019, 10: ID 554.
[17] ANDUJAR D, ESCOLA A, ROSELL-POLO J R, et al. Potential of a terrestrial LiDAR-based system to characterise weed vegetation in maize crops[J]. Computers and Electronics in Agriculture, 2013, 92: 11-15.
[18] MALAMBO L, POPESCU S C, MURRAY S C, et al. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery[J]. International Journal of Applied Earth Observation and Geoinformation, 2018, 64: 31-42.
[19] SU W, JIANG K, GUO H, et al. Extraction of phenotypic information of maize plants in field by terrestrial laser scanning[J]. Transactions of the CSAE, 2019, 35(10): 125-130. (in Chinese)
[20] YUAN W, LI J, BHATTA M, et al. Wheat height estimation using LiDAR in comparison to ultrasonic sensor and UAS[J]. Sensors, 2018, 18(11): ID 3731.
[21] MADEC S, BARET F, DE SOLAN B, et al. High-throughput phenotyping of plant height: Comparing unmanned aerial vehicles and ground LiDAR estimates[J]. Frontiers in Plant Science, 2017, 8: ID 2002.
[22] PHAN A T T, TAKAHASHI K, RIKIMARU A, et al. Method for estimating rice plant height without ground surface detection using laser scanner measurement[J]. Journal of Applied Remote Sensing, 2016, 10(4): ID 046018.
[23] JIMENEZ-BERNI J A, DEERY D M, PABLO R L, et al. High throughput determination of plant height, ground cover, and above-ground biomass in wheat with LiDAR[J]. Frontiers in Plant Science, 2018, 9: ID 237.
[24] WANG X, SINGH D, MARLA S, et al. Field-based high-throughput phenotyping of plant height in sorghum using different sensing technologies[J]. Plant Methods, 2018, 14: ID 53.
[25] BARKER J, ZHANG N, SHARON J, et al. Development of a field-based high-throughput mobile phenotyping platform[J]. Computers and Electronics in Agriculture, 2016, 122: 74-85.
[26] HARKEL T J, BARTHOLOMEUS H, KOOISTRA L, et al. Biomass and crop height estimation of different crops using UAV-based LiDAR[J]. Remote Sensing, 2020, 12(1): ID 17.
[27] THOMPSON A L, THORP K R, CONLEY M M, et al. Comparing nadir and multi-angle view sensor technologies for measuring in-field plant height of upland cotton[J]. Remote Sensing, 2019, 11: ID 700.
[28] SCHIRRMANN M, HAMDORF A, GIEBEL A, et al. Regression kriging for improving crop height models fusing ultra-sonic sensing with UAV imagery[J]. Remote Sensing, 2017, 9(7): ID 665.
[29] YUAN H, BENNETT R S, WANG N, et al. Development of a peanut canopy measurement system using a ground-based LiDAR sensor[J]. Frontiers in Plant Science, 2019, 10: ID 203.
[30] BARMEIER G, MISTELE B, SCHMIDHALTER U, et al. Referencing laser and ultrasonic height measurements of barley cultivars by using a herbometre as standard[J]. Crop & Pasture Science, 2016, 67(12): 1215-1222.
[31] PITTMAN J J, ARNALL D B, INTERRANTE S M, et al. Estimation of biomass and canopy height in bermudagrass, alfalfa, and wheat using ultrasonic, laser, and spectral sensors[J]. Sensors, 2015, 15(2): 2920-2943.
[32] FENG J, MA X, GUAN H, et al. Calculation method of soybean plant height based on depth information[J]. Acta Optica Sinica, 2019, 39(5): 258-268. (in Chinese)
[33] MARTINEZ-GUANTER J, RIBEIRO A, PETEINATOS G G, et al. Low-cost three-dimensional modeling of crop plants[J]. Sensors, 2019, 19: ID 2883.
[34] VAZQUEZ-ARELLANO M, PARAFOROS D S, REISER D, et al. Determination of stem position and height of reconstructed maize plants using a time-of-flight camera[J]. Computers and Electronics in Agriculture, 2018, 154: 276-288.
[35] HAEMMERLE M, HOEFLE B. Mobile low-cost 3D camera maize crop height measurements under field conditions[J]. Precision Agriculture, 2018, 19(4): 630-647.
[36] MA X, ZHU K, GUAN H, et al. High-throughput phenotyping analysis of potted soybean plants using colorized depth images based on a proximal platform[J]. Remote Sensing, 2019, 11(9): ID 1085.
[37] XIONG X, YU L, YANG W, et al. A high-throughput stereo-imaging system for quantifying rape leaf traits during the seedling stage[J]. Plant Methods, 2017, 13: ID 7.
[38] MANO M. Precise and continuous measurement of plant heights in an agricultural field using a time-lapse camera[J]. Journal of Agricultural Meteorology, 2017, 73(3): 100-108.
[39] SRITARAPIPAT T, RAKWATIN P, KASETKASEM T, et al. Automatic rice crop height measurement using a field server and digital image processing[J]. Sensors, 2014, 14(1): 900-926.
[40] CAI J, KUMAR P, CHOPIN J, et al. Land-based crop phenotyping by image analysis: Accurate estimation of canopy height distributions using stereo images[J]. PloS One, 2018, 13(5): ID e0196671.
[41] BROCKS S, BARETH G. Evaluating dense 3D reconstruction software packages for oblique monitoring of crop canopy surface[C]// The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Prague, Czech Republic: XXIII ISPRS Congress, 2016: 785-789.
[42] ZHANG Y, TENG P, AONO M, et al. 3D monitoring for plant growth parameters in field with a single camera by multi-view approach[J]. Journal of Agricultural Meteorology, 2018, 74(4): 129-139.
[43] BENDIG J, BOLTEN A, BARETH G, et al. UAV-based imaging for multi-temporal, very high resolution crop surface models to monitor crop growth variability[J]. Photogramm Fernerkund Geoinf, 2013, 47(6): 551-562.
[44] HOLMAN F H, RICHE A B, MICHALSKI A, et al. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing[J]. Remote Sensing, 2016, 8(12): ID 1031.
[45] CHANG A, JUNG J, MAEDA M M, et al. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS)[J]. Computers and Electronics in Agriculture, 2017, 141: 232-237.
[46] TAO H, XU L, FENG H, et al. Estimation of plant height and biomass of winter wheat based on UAV digital image[J]. Transactions of the CSAE, 2019, 35(19): 107-116. (in Chinese)
[47] LOWE D G. Distinctive image features from scale-invariant keypoints[J]. International Journal of Computer Vision, 2004, 60(2): 91-110.
[48] BELTON D, HELMHOLZ P, LONG J, et al. Crop height monitoring using a consumer-grade camera and UAV technology[J]. Journal of Photogrammetry Remote Sensing and Geoinformation Science, 2019, 87: 249-262.
[49] HAN L, YANG G, DAI H, et al. Fuzzy clustering of maize plant-height patterns using time series of UAV remote-sensing images and variety traits[J]. Frontiers in Plant Science, 2019, 10: ID 926.
[50] TIRADO S B, HIRSCH C N, SPRINGER N M, et al. UAV-based imaging platform for monitoring maize growth throughout development[J]. Plant Direct, 2020, 4(6): ID e00230.
[51] NIU Q, FENG H, YANG G, et al. Monitoring plant height and leaf area index of maize breeding material based on UAV digital images[J]. Transactions of the CSAE, 2018, 34(5): 73-82. (in Chinese)
[52] LIU H, ZHANG J, PAN Y, et al. An efficient approach based on UAV orthographic imagery to map paddy with support of field-level canopy height from point cloud data[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2018, 11(6): 2034-2046.
[53] HU P, CHAPMAN S C, WANG X, et al. Estimation of plant height using a high throughput phenotyping platform based on unmanned aerial vehicle and self-calibration: Example for sorghum breeding[J]. European Journal of Agronomy, 2018, 95: 24-32.
[54] HAN X, THOMASSON J A, BAGNALL G C, et al. Measurement and calibration of plant-height from fixed-wing UAV images[J]. Sensors, 2018, 18(12): ID 4092.
[55] SONG Y, WANG J. Winter wheat canopy height extraction from UAV-based point cloud data with a moving cuboid filter[J]. Remote Sensing, 2019, 11(5): ID 1239.
[56] BORRA-SERRANO I, DE SWAEF T, QUATAERT P, et al. Closing the phenotyping gap: High resolution UAV time series for soybean growth analysis provides objective data from field trials[J]. Remote Sensing, 2020, 12(10): ID 1644.
[57] XU R, LI C, PATERSON A H, et al. Multispectral imaging and unmanned aerial systems for cotton plant phenotyping[J]. PloS One, 2019, 14(2): ID e0205083.
[58] WU M, YANG C, SONG X, et al. Evaluation of orthomosics and digital surface models derived from aerial imagery for crop type mapping[J]. Remote Sensing, 2017, 9(3): ID 239.
[59] GUO T, FANG Y, CHENG T, et al. Detection of wheat height using optimized multi-scan mode of LiDAR during the entire growth stages[J]. Computers and Electronics in Agriculture, 2019, 165: ID 104959.
[60] HOFFMEISTER D, WALDHOFF G, KORRES W, et al. Crop height variability detection in a single field by multi-temporal terrestrial laser scanning[J]. Precision Agriculture, 2016, 17(3): 296-312.
[61] GUO X, ZHOU H, ZHANG G, et al. Crop height measurement system based on laser vision[J]. Transactions of the CSAM, 2018, 49(2): 22-27. (in Chinese)
[62] CHENG M, CAI Z, YUAN H, et al. System design for peanut canopy height information acquisition based on LiDAR[J]. Transactions of the CSAE, 2019, 35(1): 180-187. (in Chinese)
[63] TAO H, FENG H, XU L, et al. Estimation of the yield and plant height of winter wheat using UAV-based hyperspectral images[J]. Sensors, 2020, 20(4): ID 1231.
[64] HAN L, YANG G, YANG H, et al. Clustering field-based maize phenotyping of plant-height growth and canopy spectral dynamics using a UAV remote-sensing approach[J]. Frontiers in Plant Science, 2018, 9: ID 1638.
[65] VARELA S, ASSEFA Y, PRASAD P V V, et al. Spatio-temporal evaluation of plant height in corn via unmanned aerial systems[J]. Journal of Applied Remote Sensing, 2017, 11(3): ID 036013.
[66] MHLANGA B, CHAUHAN B S, THIERFELDER C, et al. Weed management in maize using crop competition: A review[J]. Crop Protection, 2016, 88: 28-36.
[67] YOUNGERMAN C Z, DITOMMASO A, CURRAN W S, et al. Corn density effect on interseeded cover crops, weeds, and grain yield[J]. Agronomy Journal, 2018, 110(6): 2478-2487.
[68] ENCISO J, AVILA C A, JUNG J, et al. Validation of agronomic UAV and field measurements for tomato varieties[J]. Computers and Electronics in Agriculture, 2019, 158: 278-283.
[69] BENDIG J, BOLTEN A, BENNERTZ S, et al. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging[J]. Remote Sensing, 2014, 6(11): 10395-10412.
[70] POUND M P, FRENCH A P, MURCHIE E H, et al. Automated recovery of three-dimensional models of plant shoots from multiple color images[J]. Plant Physiology, 2014, 166(4): 1688-1698.
[71] HASHEMINASAB S M, ZHOU T, HABIB A, et al. GNSS/INS-Assisted structure from motion strategies for UAV-Based imagery over mechanized agricultural fields[J]. Remote Sensing, 2020, 12(3): ID 351.
[72] DANDRIFOSSE S, BOUVRY A, LEEMANS V, et al. Imaging wheat canopy through stereo vision: Overcoming the challenges of the laboratory to field transition for morphological features extraction[J]. Frontiers in Plant Science, 2020, 11: ID 96.
[73] QIU X, FANG Y, GUO T, et al. Monitoring of wheat biomass based on terrestrial-LiDAR height metric[J]. Transactions of the CSAM, 2019, 50(10): 159-166. (in Chinese)
[74] BUELVAS R M, ADAMCHUK V I, LEKSONO E, et al. Biomass estimation from canopy measurements for leafy vegetables based on ultrasonic and laser sensors[J]. Computers and Electronics in Agriculture, 2019, 164: ID 104896.
[75] YUE J, YANG G, LI C, et al. Estimation of winter wheat above-ground biomass using Unmanned Aerial Vehicle-based snapshot hyperspectral sensor and crop height improved models[J]. Remote Sensing, 2017, 9(7): ID 708.
[76] CEN H, WAN L, ZHU J, et al. Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras[J]. Plant Methods, 2019, 15: ID 32.
[77] BALLESTEROS R, FERNANDO ORTEGA J, HERNANDEZ D, et al. Onion biomass monitoring using UAV-based RGB imaging[J]. Precision Agriculture, 2018, 19(5): 840-857.
[78] ZHOU L, GU X, CHENG S, et al. Analysis of plant height changes of lodged maize using UAV-LiDAR data[J]. Agriculture, 2020, 10(5): ID 146.
[79] CHU T, STAREK M J, BREWER M J, et al. Assessing lodging severity over an experimental maize (Zea mays L.) field using UAS images[J]. Remote Sensing, 2017, 9(9): ID 923.
[80] WILKE N, SIEGMANN B, KLINGBEIL L, et al. Quantifying lodging percentage and lodging severity using a UAV-based canopy height model combined with an objective threshold approach[J]. Remote Sensing, 2019, 11(5): ID 515.
[81] GEIPEL J, LINK J, CLAUPEIN W, et al. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system[J]. Remote Sensing, 2014, 6(11): 10335-10355.
[82] YU D, ZHA Y, SHI L, et al. Improvement of sugarcane yield estimation by assimilating UAV-derived plant height observations[J]. European Journal of Agronomy, 2020, 121: ID 126159.
[83] FENG A, ZHOU J, VORIES E D, et al. Yield estimation in cotton using UAV-based multi-sensor imagery[J]. Biosystems Engineering, 2020, 193: 101-114.
[84] LI B, XU X, ZHANG L, et al. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 2020, 162: 161-172.
[85] HASSAN M A, YANG M, FU L, et al. Accuracy assessment of plant height using an unmanned aerial vehicle for quantitative genomic analysis in bread wheat[J]. Plant Methods, 2019, 15: ID 37.
[86] WANG X, ZHANG R, SONG W, et al. Dynamic plant height QTL revealed in maize through remote sensing phenotyping using a high-throughput unmanned aerial vehicle (UAV)[J]. Scientific Reports, 2019, 9: ID 3458.
[87] ROTH L, STREIT B. Predicting cover crop biomass by lightweight UAS-based RGB and NIR photography: An applied photogrammetric approach[J]. Precision Agriculture, 2018, 19(1): 93-114.
[88] BENDIG J, YU K, AASEN H, et al. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley[J]. International Journal of Applied Earth Observation and Geoinformation, 2015, 39: 79-87.
[89] LI J, SHI Y, VEERANAMPALAYAM-SIVAKUMAR AN, et al. Elucidating sorghum biomass, nitrogen and chlorophyll contents with spectral and morphological traits derived from unmanned aircraft system[J]. Frontiers in Plant Science, 2018, 9: ID 1406.
[90] MICHEZ A, BAUWENS S, BROSTAUX Y, et al. How far can consumer-grade uav rgb imagery describe crop production? A 3D and multitemporal modeling approach applied to Zea mays[J]. Remote Sensing, 2018, 10(11): ID 1798.
[91] HAN L, YANG G, DAI H, et al. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data[J]. Plant Methods, 2019, 15: ID 10.
[92] ZHU W, SUN Z, PENG J, et al. Estimating maize above-ground biomass using 3D point clouds of multi-source unmanned aerial vehicle data at multi-spatial scales[J]. Remote Sensing, 2019, 11(22): ID 2678.
[93] GREAVES H E, VIERLING L A, EITEL J U H, et al. Estimating aboveground biomass and leaf area of low-stature Arctic shrubs with terrestrial LiDAR[J]. Remote Sensing of Environment, 2015, 164: 26-35.
[94] GIL-DOCAMPO M L, ARZA-GARCIA M, ORTIZ-SANZ J, et al. Above-ground biomass estimation of arable crops using UAV-based SfM photogrammetry[J]. Geocarto International, 2020, 35(7): 687-699.
[95] YANG Q, YE H, HUANG K, et al. Estimation of leaf area index of sugarcane using crop surface model based on UAV image[J]. Transactions of the CSAE, 2017, 33(8): 104-111. (in Chinese)
[96] SINGH D, WANG X, KUMAR U, et al. High-throughput phenotyping enabled genetic dissection of crop lodging in wheat[J]. Frontiers in Plant Science, 2019, 10: ID 394.
[97] SU W, ZHANG M, BIAN D, et al. Phenotyping of corn plants using Unmanned Aerial Vehicle (UAV) images[J]. Remote Sensing, 2019, 11(17): ID 2021.
[98] HAN L, YANG G, FENG H, et al. Quantitative identification of maize lodging-causing feature factors using unmanned aerial vehicle images and a nomogram computation[J]. Remote Sensing, 2018, 10(10): ID 1528.
[99] ACORSI M G, MARTELLO M, ANGNES G, et al. Identification of maize lodging: A case study using a remotely piloted aircraft system[J]. Engenharia Agricola, 2019, 39: 66-73.
[100] LI J, VEERANAMPALAYAM-SIVAKUMAR AN, BHATTA M, et al. Principal variable selection to explain grain yield variation in winter wheat from features extracted from UAV imagery[J]. Plant Methods, 2019, 15: ID 123.
[101] ZHOU G, YIN X. Relationship of cotton nitrogen and yield with normalized difference vegetation index and plant height[J]. Nutrient Cycling in Agroecosystems, 2014, 100(2): 147-160.
[102] XUE H, TIAN X, ZHANG K, et al. Mapping developmental QTL for plant height in soybean [Glycine max (L.) Merr.] using a four-way recombinant inbred line population[J]. PloS One, 2019, 14(11): ID e0224897.
[103] HERTER C P, EBMEYER E, KOLLERS S, et al. Rht24 reduces height in the winter wheat population 'Solitar x Bussard' without adverse effects on Fusarium head blight infection[J]. Theoretical and Applied Genetics, 2018, 131(6): 1263-1272.
[104] MA X, FENG F, WEI H, et al. Genome-wide association study for plant height and grain yield in rice under contrasting moisture regimes[J]. Frontiers in Plant Science, 2016, 7: ID 1801.
[105] WATANABE K, GUO W, ARAI K, et al. High-throughput phenotyping of sorghum plant height using an Unmanned Aerial Vehicle and its application to genomic prediction modeling[J]. Frontiers in Plant Science, 2017, 8: ID 421.
[106] WATANABE K, GUO W, ARAI K, et al. High-throughput phenotyping of sorghum plant height using an unmanned aerial vehicle and its application to genomic prediction modeling[J]. Frontiers in Plant Science, 2017, 8: ID 421.
[107] 劉忠, 萬煒, 黃晉宇, 等. 基于無人機遙感的農(nóng)作物長勢關(guān)鍵參數(shù)反演研究進展[J]. 農(nóng)業(yè)工程學報, 2018, 34(24): 60-71.
LIU Z, WAN W, HUANG J, et al. Progress on key parameters inversion of crop growth based on unmanned aerial vehicle remote sensing[J]. Transactions of the CSAE, 2018, 34(24): 60-71.
[108] XIE T, LI J, YANG C, et al. Crop height estimation based on UAV images: methods, errors, and strategies[J]. Computers and Electronics in Agriculture, 2021, 185: ID 106155.
[109] WALTER J D C, EDWARDS J, MCDONALD G, et al. Estimating biomass and canopy height with LiDAR for field crop breeding[J]. Frontiers in Plant Science, 2019, 10: ID 1145.
[110] WANG H, WANG R, LIU B, et al. QTL analysis of salt tolerance in Sorghum bicolor during whole-plant growth stages[J]. Plant Breeding, 2020, 139(3): 455-465.
[111] LIU F, HU P, ZHENG B, et al. A field-based high-throughput method for acquiring canopy architecture using unmanned aerial vehicle images[J]. Agricultural and Forest Meteorology, 2021, 296: ID 108231.
[112] HU T, SUN X, SU Y, et al. Development and performance evaluation of a very low-cost UAV-LiDAR system for forestry applications[J]. Remote Sensing, 2021, 13(1): ID 77.
[113] LUO S, LIU W, ZHANG Y, et al. Maize and soybean heights estimation from unmanned aerial vehicle (UAV) LiDAR data[J]. Computers and Electronics in Agriculture, 2021, 182: ID 106005.
[114] 管賢平, 劉寬, 邱白晶, 等. 基于機載三維激光掃描的大豆冠層幾何參數(shù)提取[J]. 農(nóng)業(yè)工程學報, 2019, 35(23): 96-103.
GUAN X, LIU K, QIU B, et al. Extraction of geometric parameters of soybean canopy by airborne 3D laser scanning[J]. Transactions of the CSAE, 2019, 35(23): 96-103.
[115] VIKHE P, VENKATESAN S, CHAVAN A, et al. Mapping of dwarfing gene Rht14 in durum wheat and its effect on seedling vigor, internode length and plant height[J]. The Crop Journal, 2019, 7(2): 187-197.
[116] 劉永康, 李明軍, 李景原, 等. 小麥旗葉直立轉(zhuǎn)披動態(tài)過程對其高光效的影響[J]. 科學通報, 2009, 54(15): 2205-2211.
LIU Y, LI M, LI J, et al. Dynamic changes in flag leaf angle contribute to high photosynthetic capacity[J]. Chinese Science Bulletin, 2009, 54(15): 2205-2211.
[117] CHENG T, LU N, WANG W, et al. Estimation of nitrogen nutrition status in winter wheat from unmanned aerial vehicle based multi-angular multispectral imagery[J]. Frontiers in Plant Science, 2019, 10: ID 1601.
[118] CHE Y, WANG Q, XIE Z, et al. Estimation of maize plant height and leaf area index dynamic using unmanned aerial vehicle with oblique and nadir photography[J]. Annals of Botany, 2020, 126(4): 765-773.
Research Status and Prospect on Height Estimation of Field Crop Using Near-Field Remote Sensing Technology
ZHANG Jian1*, XIE Tianjin1, YANG Wanneng2, ZHOU Guangsheng3
(1.College of Resources and Environmental Sciences/Macro Agriculture Research Institute, Huazhong Agricultural University, Wuhan 430070, China; 2.National Key Laboratory of Crop Genetic Improvement, Huazhong Agricultural University, Wuhan 430070, China; 3.College of Plant Science and Technology, Huazhong Agricultural University, Wuhan 430070, China)
Abstract: Plant height is a key indicator for dynamically measuring crop health and overall growth status, and is widely used to estimate the biological yield and final grain yield of crops. Traditional manual measurement is subjective, inefficient, and time-consuming, and plant height obtained by sampling cannot represent the condition of the whole field. In the last decade, remote sensing technology has developed rapidly in agriculture, making it possible to collect crop height information with high accuracy, high frequency, and high efficiency. This paper first reviewed the literature on plant height estimation using remote sensing technology to clarify the research progress in this field. Unmanned aerial vehicle (UAV) platforms carrying visible-light cameras and light detection and ranging (LiDAR) sensors were the most frequently used methods, and the main crops studied included wheat, corn, rice, and other staple food crops. Moreover, crop height measurement was mainly based on near-field remote sensing platforms such as ground, UAV, and airborne platforms. Secondly, the basic principles, advantages, and limitations of different platforms and sensors for obtaining plant height were analyzed, with emphasis on the altimetry processes and key techniques of LiDAR and visible-light cameras, including the extraction of crop canopy and soil elevation information and the feature matching of the imaging method. Then, applications of plant height data were summarized, including biomass inversion, lodging identification, yield prediction, and crop breeding. However, the commonly used empirical models suffer from problems such as requiring large amounts of measured data, unclear physical significance, and poor universality. Finally, the problems and challenges of near-field remote sensing technology in plant height acquisition were discussed: selecting appropriate data to balance cost and accuracy, improving measurement accuracy, and matching remote sensing height estimates with agricultural applications all need to be considered. In addition, future development was prospected from four aspects: 1) platforms and sensors, 2) bare soil detection and interpolation algorithms, 3) plant height application research, and 4) the difference in plant height measurement between agronomy and remote sensing, which can provide references for future research and method application of near-field remote sensing height measurement.
Key words: plant height; near-field remote sensing; crop; unmanned aerial vehicle; visible-light camera; LiDAR