Title: Controlling Sequence-to-Sequence Models - A Demonstration on Neural-based Acrostic Generator
Authors: Shen, L.-H.; Tai, P.-L.; Wu, C.-C.; Lin, Shou-De
Date Issued: 2019
Date Available: 2021-05-05
Type: conference paper
Scopus ID: 2-s2.0-85087441060
URL: https://www.scopus.com/inward/record.url?eid=2-s2.0-85087441060&partnerID=40&md5=c36558e630ac420952b58456dff5fca2
URL: https://scholars.lib.ntu.edu.tw/handle/123456789/559204
Keywords: Flexible patterns; Generation systems; Natural language generation systems; Output sequences; Rule based; Sequence modeling; Sequence models; Natural language processing systems
Abstract: An acrostic is a form of writing in which the first token of each line (or another recurring feature of the text) forms a meaningful sequence. In this paper we present a generalized acrostic generation system that can hide a given message in a flexible pattern specified by the user. Unlike previous works, which focus on rule-based solutions, here we adopt a neural sequence-to-sequence model to achieve this goal. Besides the acrostic, users can also specify the rhyme and length of the output sequences. To the best of our knowledge, this is the first neural natural language generation system to demonstrate micro-level control over output sentences. © 2019 Association for Computational Linguistics.
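The record above does not include the authors' implementation. As a rough, hypothetical illustration of the kind of micro-level control the abstract describes, the sketch below forces the first token of each generated line to begin with the next letter of a hidden message during greedy decoding. A toy scoring function stands in for a trained sequence-to-sequence decoder; all names and the vocabulary are invented for this example and are not from the paper.

```python
# Toy vocabulary standing in for a trained model's output space.
VOCAB = ["apple", "bright", "calm", "dawn", "echo", "light",
         "moon", "night", "ocean", "river", "stone", "wind"]

def toy_score(word, context):
    """Stand-in for a seq2seq decoder's next-token score.
    A real system would condition on the encoder input and the
    decoded prefix; here we just hash the word and its context."""
    return (hash((word, tuple(context))) % 1000) / 1000.0

def generate_line(initial, length, context):
    """Greedy decoding with a hard constraint on the first token:
    it must begin with `initial` (the acrostic letter)."""
    line = []
    for pos in range(length):
        if pos == 0:
            candidates = [w for w in VOCAB if w.startswith(initial)]
        else:
            candidates = VOCAB
        # Pick the highest-scoring candidate under the constraint.
        best = max(candidates, key=lambda w: toy_score(w, context + line))
        line.append(best)
    return line

def acrostic(message, line_length=4):
    """Hide `message` as the first letters of successive lines."""
    context, lines = [], []
    for ch in message.lower():
        line = generate_line(ch, line_length, context)
        lines.append(" ".join(line))
        context.extend(line)
    return lines

if __name__ == "__main__":
    for line in acrostic("mad"):
        print(line)
```

In a real neural system the same idea appears at the softmax layer: at constrained positions the model's distribution is masked to tokens satisfying the acrostic, rhyme, or length requirement before sampling or argmax.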