https://scholars.lib.ntu.edu.tw/handle/123456789/640225
Title: Solving Linguistic Olympiad Problems with Tree-of-Thought Prompting
Authors: Lin, Zheng Lin; Yen, Chiao Han; Xu, Jia Cheng; Watty, Deborah; Hsieh, Shu-Kai
Keywords: Generative Pre-trained Transformer; Large Language Models; Linguistic Olympiad; Machine Reasoning; Tree-of-Thought Prompting
Date of Issue: 1-Jan-2023
Pages: 262-269
Source Publication: ROCLING 2023 - Proceedings of the 35th Conference on Computational Linguistics and Speech Processing
Abstract: In this study, we examine the efficacy of the Tree-of-Thought Prompting technique as a mechanism for addressing linguistic challenges and augmenting the reasoning capabilities of large language models. Specifically, we scrutinize the reasoning ability of the Generative Pre-trained Transformer (GPT) model, which has garnered significant attention among researchers and practitioners. Using the Tree-of-Thought Prompting methodology, we assess its utility in improving both the accuracy and the response latency of the GPT model, especially on Linguistic Olympiad tasks that demand strong reasoning skills. We also identify inherent limitations of this approach and suggest avenues for future research to refine and optimize it.
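The abstract describes applying Tree-of-Thought Prompting to a GPT model. As a minimal sketch of the general technique (not the authors' code), the search below keeps a beam of the best partial "thought" chains at each depth; the `propose` and `score` functions are hypothetical stand-ins for LLM calls that would generate and evaluate candidate thoughts in a real system.

```python
# Minimal sketch of Tree-of-Thought search (illustrative, not the paper's code).
# In a real system, `propose` and `score` would be LLM calls; here they are
# hypothetical stand-ins so the search loop is runnable on its own.

from typing import Callable, List, Tuple

def tree_of_thought_search(
    root: str,
    propose: Callable[[str], List[str]],  # expand a partial solution into candidate next thoughts
    score: Callable[[str], float],        # heuristic value of a partial solution
    beam_width: int = 2,
    depth: int = 3,
) -> str:
    """Breadth-first ToT: keep the `beam_width` best partial chains at each depth."""
    frontier: List[Tuple[float, str]] = [(score(root), root)]
    for _ in range(depth):
        candidates: List[Tuple[float, str]] = []
        for _, state in frontier:
            for thought in propose(state):
                new_state = state + " -> " + thought
                candidates.append((score(new_state), new_state))
        if not candidates:
            break
        # Prune to the highest-scoring partial chains.
        candidates.sort(key=lambda t: t[0], reverse=True)
        frontier = candidates[:beam_width]
    return max(frontier, key=lambda t: t[0])[1]

if __name__ == "__main__":
    # Toy usage: "thoughts" are digits; the score prefers chains with more '1's.
    result = tree_of_thought_search(
        root="start",
        propose=lambda s: ["1", "0"],
        score=lambda s: s.count("1"),
        beam_width=2,
        depth=3,
    )
    print(result)  # → start -> 1 -> 1 -> 1
```

The beam width and depth trade search cost (more model calls, higher latency) against reasoning coverage, which mirrors the precision-versus-latency trade-off the abstract discusses.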
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/640225
ISBN: 9789869576963
Appears in Collections: Graduate Institute of Linguistics
Items in the IR system are protected by copyright, with all rights reserved, unless otherwise indicated.