Title: TRACE: A Fast Transformer-based General-Purpose Lossless Compressor
Authors: Mao, Y.; Cui, Y.; Tei-Wei Kuo; Xue, C.J.
Keywords: byte stream; computationally efficient model; general-purpose compressor; lossless data compression; neural networks; transformer
Publication Date: 2022
Pages: 1829-1838
Source Publication: WWW 2022 - Proceedings of the ACM Web Conference 2022
Abstract: Deep-learning-based compressors have recently attracted interest due to their much-improved compression ratios. However, modern approaches suffer from long execution times. To ease this problem, this paper targets cutting down the execution time of deep-learning-based compressors. Building history dependencies sequentially (e.g., with recurrent neural networks) is responsible for long inference latency. Instead, we introduce the transformer into deep-learning compressors to build history dependencies in parallel. However, existing transformers are too computationally heavy and ill-suited to compression tasks. This paper proposes a fast general-purpose lossless compressor, TRACE, by designing a compression-friendly structure based on a single-layer transformer. We first design a new metric to guide the selection of compression model structures. Byte-grouping and shared-FFN schemes are further proposed to fully utilize the capacity of the single-layer transformer. These features allow TRACE to achieve a competitive compression ratio at much faster speed. In addition, we further accelerate the compression procedure by designing a controller that reduces the parameter-updating overhead. Experiments show that TRACE achieves an overall 3x speedup while keeping a compression ratio comparable to state-of-the-art compressors. The source code for TRACE and links to the datasets are available at https://github.com/mynotwo/A-Fast-Transformer-based-General-Purpose-LosslessCompressor. © 2022 ACM.
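The abstract's key ideas (a single-layer transformer that models byte histories in parallel, byte grouping to shorten the attention sequence, and a feed-forward network shared across groups) can be illustrated with a short sketch. The following is a minimal PyTorch interpretation, not the authors' implementation (that lives in the linked repository); the class name, the group size, and all hyperparameters are illustrative assumptions.

```python
# A minimal sketch of the ideas described in the abstract, in PyTorch.
# NOT the authors' code (see the linked GitHub repo); the group size G
# and all dimensions below are illustrative assumptions.
import torch
import torch.nn as nn

class SingleLayerByteModel(nn.Module):
    """Predicts a distribution over the next byte from a fixed context window.

    Bytes are "grouped": G consecutive bytes form one attention token, so
    the sequence seen by self-attention is G times shorter. One FFN is
    reused for every token (one reading of the "shared-FFN" scheme).
    """
    def __init__(self, group=4, d_model=256, n_heads=8, context=512):
        super().__init__()
        assert d_model % group == 0 and context % group == 0
        self.group = group
        self.embed = nn.Embedding(256, d_model // group)   # per-byte embedding
        self.pos = nn.Parameter(torch.zeros(context // group, d_model))
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(                          # shared across tokens
            nn.Linear(d_model, 4 * d_model), nn.GELU(),
            nn.Linear(4 * d_model, d_model))
        self.ln1, self.ln2 = nn.LayerNorm(d_model), nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, 256)                # next-byte logits

    def forward(self, bytes_in):
        # bytes_in: (B, context) integer byte values in [0, 255]
        B, L = bytes_in.shape
        G = self.group
        # Byte grouping: concatenate G per-byte embeddings into one token.
        x = self.embed(bytes_in.long()).reshape(B, L // G, -1) + self.pos
        # Causal mask so each token attends only to earlier history.
        mask = torch.triu(torch.ones(L // G, L // G, dtype=torch.bool,
                                     device=x.device), diagonal=1)
        a, _ = self.attn(x, x, x, attn_mask=mask)
        x = self.ln1(x + a)
        x = self.ln2(x + self.ffn(x))
        return self.head(x[:, -1])                         # (B, 256) logits
```

In a full compressor, the softmax over these logits would drive an arithmetic coder byte by byte, and the model's parameters would be updated online as data is coded; the controller mentioned in the abstract is what decides how often to pay that parameter-updating cost.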
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85129894057&doi=10.1145%2f3485447.3511987&partnerID=40&md5=e98f906ece814cadb91805807dd3ff66
     https://scholars.lib.ntu.edu.tw/handle/123456789/632322
DOI: 10.1145/3485447.3511987
SDG/Keywords: Data compression; Model structures; Multilayer neural networks; Recurrent neural networks; Building history; Byte streams; Computationally efficient model; General-purpose compressor; History dependencies; Lossless; Lossless data compression; Neural networks; Single layer; Transformer; Compressors
Appears in Collections: Department of Computer Science and Information Engineering
Items in the IR system, unless otherwise indicated in their copyright terms, are protected by copyright, with all rights reserved.