Chen, Y.-T.; Chen, Shih-Fang (2020). Localizing plucking points of tea leaves using deep convolutional neural networks. Journal article. DOI: 10.1016/j.compag.2020.105298. Scopus EID: 2-s2.0-85079884992.

Record added: 2021-09-02
Scopus: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85079884992&doi=10.1016%2fj.compag.2020.105298&partnerID=40&md5=b6135a4dad292e5d7a87fe0f9fa358c0
NTU Scholars: https://scholars.lib.ntu.edu.tw/handle/123456789/581601

Abstract: Tea (Camellia sinensis) is one of the most commonly consumed beverages worldwide. Tea leaves are plucked by hand or by machine. Hand plucking prevents broken leaves and ensures that only the most favorable parts are plucked (e.g., one tip with two leaves [OTTL]); thus, hand-plucked tea usually has high economic value. Localizing the plucking points of tea is the first step toward developing a high-quality tea-plucking machine. In this study, we propose a method to identify the plucking points of tea shoots in the field using machine vision and deep learning. Images of various tea bush varieties were acquired from tea gardens using digital cameras and cell phones. A faster region-based convolutional neural network (Faster R-CNN) was first trained to detect OTTL regions in the images. A fully convolutional network (FCN) was subsequently trained to identify the plucking points within the OTTL regions. The Faster R-CNN model achieved a precision of 79% and a recall of 90%. The FCN achieved a mean accuracy of 84.91% and a mean intersection-over-union of 70.72%. The experiment demonstrated that the developed algorithms can also be used on varieties other than those used to train the model. © 2020 Elsevier B.V.

Keywords: Computer vision; Convolution; Deep learning; Deep neural networks; Learning systems; Tea; Camellia sinensis; Cell phone; CNN models; Convolutional networks; Economic values; High quality; Region-based; Tea gardens; Convolutional neural networks; artificial neural network; computer vision; identification method; leaf; machine learning; tea; Camellia sinensis

[SDGs] SDG3
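The abstract describes a two-stage pipeline: a Faster R-CNN detects OTTL bounding boxes, then an FCN produces a per-pixel plucking-point map inside each box, whose peak is mapped back to full-image coordinates. A minimal sketch of that coordinate-handling logic is shown below; both neural stages are stubbed with hypothetical placeholders (the paper's trained models and all function names here are assumptions, not the authors' code):

```python
# Hypothetical sketch of the two-stage localization pipeline described in
# the abstract. Stage 1 (Faster R-CNN) and stage 2 (FCN) are replaced by
# stubs; only the crop-to-image coordinate mapping is real logic.

def detect_ottl_regions(image):
    """Stub for the Faster R-CNN detector: returns (x, y, w, h) boxes."""
    return [(40, 60, 32, 48)]  # one made-up OTTL region for illustration

def segment_plucking_points(crop_w, crop_h):
    """Stub for the FCN: returns a crop-sized plucking-point score map."""
    return [[1.0 if (x, y) == (16, 40) else 0.0 for x in range(crop_w)]
            for y in range(crop_h)]

def localize_plucking_points(image):
    """Detect OTTL regions, segment each crop, and map the argmax pixel
    of each score map back to full-image coordinates."""
    points = []
    for (bx, by, bw, bh) in detect_ottl_regions(image):
        scores = segment_plucking_points(bw, bh)
        # argmax over the score map -> (x, y) inside the crop
        best = max(((x, y) for y in range(bh) for x in range(bw)),
                   key=lambda p: scores[p[1]][p[0]])
        # shift crop-local coordinates by the box origin
        points.append((bx + best[0], by + best[1]))
    return points

print(localize_plucking_points(None))  # → [(56, 100)]
```

In a real implementation the stubs would be replaced by the trained detector and segmentation network; the coordinate shift at the end is the only step the abstract fixes unambiguously.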