Title: Solving Linguistic Olympiad Problems with Tree-of-Thought Prompting
Authors: Lin, Zheng Lin; Yen, Chiao Han; Xu, Jia Cheng; Watty, Deborah; Hsieh, Shu-Kai
Keywords: Generative Pre-trained Transformer; Large Language Models; Linguistic Olympiad; Machine Reasoning; Tree-of-Thought Prompting
Issue Date: 1-Jan-2023
Pages: 262-269
Source: ROCLING 2023 - Proceedings of the 35th Conference on Computational Linguistics and Speech Processing
Abstract: In this study, we examine the efficacy of the Tree-of-Thought Prompting technique as a mechanism to address linguistic challenges and augment the reasoning capabilities of large language models. Specifically, we evaluate the reasoning ability of the Generative Pre-trained Transformer (GPT) model, which has garnered significant attention within the research and practitioner community. Using the Tree-of-Thought Prompting methodology, we assess its utility in enhancing both the precision and response latency of the GPT model, especially for Linguistic Olympiad tasks demanding elevated reasoning competencies. We also delineate inherent limitations of this approach and suggest avenues for future research to refine and optimize it.
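The Tree-of-Thought technique named in the abstract can be illustrated with a minimal search skeleton. This is a hypothetical sketch, not the paper's implementation: in a real system the `propose` and `score` functions would each call a GPT model to generate and rate intermediate "thoughts"; here toy stand-ins search toward a small arithmetic target so the control flow is runnable.

```python
# Minimal Tree-of-Thought search sketch (hypothetical; `propose` and `score`
# are toy stand-ins for LLM calls that generate and rate candidate thoughts).

def propose(state):
    # Stand-in for an LLM proposing candidate next "thoughts".
    # The state is a running total; each thought adds 1, 2, or 3.
    return [state + step for step in (1, 2, 3)]

def score(state, target):
    # Stand-in for an LLM rating a partial solution (higher is better).
    return -abs(target - state)

def tree_of_thought(start, target, depth=4, beam=2):
    """Breadth-first search keeping the `beam` best states at each level."""
    frontier = [start]
    for _ in range(depth):
        candidates = [s for state in frontier for s in propose(state)]
        candidates.sort(key=lambda s: score(s, target), reverse=True)
        frontier = candidates[:beam]   # prune to the most promising thoughts
        if target in frontier:
            return target
    return frontier[0]

print(tree_of_thought(0, 10))  # reaches the target 10 within 4 levels
```

The beam-pruned breadth-first loop is the core idea: instead of committing to a single chain of thought, the model keeps several candidate partial solutions alive and lets a scoring step decide which branches to expand.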
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/640225
ISBN: 9789869576963
Appears in Collections: Graduate Institute of Linguistics (語言學研究所)