Automatic monitoring of lactation frequency of sows and movement quantification of newborn piglets in farrowing houses using convolutional neural networks

Journal
Computers and Electronics in Agriculture
Journal Volume
189
Date Issued
2021
Author(s)
Ho, K.-Y.
Tsai, Y.-J.
Kuo, Y.-F.
DOI
10.1016/j.compag.2021.106376
URI
https://www.scopus.com/inward/record.uri?eid=2-s2.0-85113277240&doi=10.1016%2fj.compag.2021.106376&partnerID=40&md5=780e57938451ef481f83ab1a81299bbc
https://scholars.lib.ntu.edu.tw/handle/123456789/605951
Abstract
Pork is an essential source of protein in Taiwan and many other countries globally, and to meet increasing demand, maintaining the weaning rate of piglets is essential. Newborn piglets are relatively fragile and require more attention; however, manual observation is time-consuming and labor-intensive. This study aimed to develop an automated approach to recognize the lactating frequencies of sows, localize piglets, track individual piglets, and quantify their movements in videos. Embedded systems integrated with cameras were developed to capture bird's-eye-view videos of sows and piglets in a farrowing house, which were then transmitted to a cloud server and converted to images. A combination of EfficientNet and long short-term memory (LSTM) was trained to recognize the lactation behavior from the videos, and a refined rotation RetinaNet (R3Det) model was trained to localize the piglets. Subsequently, the simple online and real-time tracking (SORT) algorithm was applied to track individual piglets and quantify their movements. The combination of EfficientNet and LSTM achieved an overall accuracy of 97.67% in lactation behavior recognition within a test time of 7.7 ms per 1-min video by using a GPU. The trained R3Det model achieved an overall mean average precision of 87.90%, precision of 93.52%, recall of 88.52%, and processing speed of 10.2 fps using a GPU. The piglet tracking using SORT achieved an overall multiple object tracking accuracy of 97.35%, multiple object tracking precision of 96.97%, IDF1 of 98.30%, and processing speed of 171.6 fps using a CPU. These results demonstrate the feasibility of deploying the proposed approaches on typical pig farms. © 2021 Elsevier B.V.
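The tracking-and-quantification step described in the abstract — associating per-frame piglet detections into identity-preserving tracks, then accumulating each piglet's centroid displacement as a movement measure — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the real SORT algorithm uses a Kalman filter for motion prediction and the Hungarian algorithm for assignment, whereas this sketch uses greedy IoU matching on axis-aligned boxes (the paper localizes piglets with rotated boxes from R3Det). The class name `GreedyIoUTracker` and the `iou_threshold` parameter are assumptions for illustration.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def centroid(box):
    return ((box[0] + box[2]) / 2.0, (box[1] + box[3]) / 2.0)


class GreedyIoUTracker:
    """SORT-style tracking-by-detection, simplified to greedy IoU association."""

    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.tracks = {}    # track id -> most recent box
        self.movement = {}  # track id -> accumulated centroid displacement (px)
        self._next_id = 0

    def update(self, detections):
        """Associate one frame's detections to tracks; return {id: box}."""
        # Score every (track, detection) pair and take matches greedily,
        # best IoU first, without reusing a track or a detection.
        pairs = sorted(
            ((iou(self.tracks[tid], det), tid, j)
             for tid in self.tracks for j, det in enumerate(detections)),
            reverse=True,
        )
        used_tracks, used_dets = set(), set()
        for score, tid, j in pairs:
            if score < self.iou_threshold:
                break
            if tid in used_tracks or j in used_dets:
                continue
            # Quantify movement as the Euclidean centroid displacement.
            (px, py), (cx, cy) = centroid(self.tracks[tid]), centroid(detections[j])
            self.movement[tid] += ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
            self.tracks[tid] = detections[j]
            used_tracks.add(tid)
            used_dets.add(j)
        # Unmatched detections start new tracks.
        for j, det in enumerate(detections):
            if j not in used_dets:
                self.tracks[self._next_id] = det
                self.movement[self._next_id] = 0.0
                self._next_id += 1
        return dict(self.tracks)


# Usage: two frames with two piglets; the first piglet shifts 2 px to the right.
tracker = GreedyIoUTracker()
tracker.update([(0, 0, 10, 10), (50, 50, 60, 60)])
tracker.update([(2, 0, 12, 10), (50, 50, 60, 60)])
```

In the paper's pipeline, the association step runs on a CPU (SORT reached 171.6 fps there); the per-track displacement totals are what support the movement quantification of individual piglets.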
Subjects
Lactation behavior detection
Long short-term memory (LSTM)
Pig tracking
Piglet movement
Agriculture
Behavioral research
Brain
Embedded systems
Graphics processing unit
Mammals
Object tracking
Long short-term memory
Newborn piglet
Online time
Processing speed
Real time tracking
Short term memory
Simple++
algorithm
artificial neural network
detection method
movement
precision
protein
tracking
videography
Taiwan
SDGs
SDG 8
SDG 13

Type
journal article