Title: Adjusting BERT’s Pooling Layer for Large-Scale Multi-Label Text Classification
Authors: Lehečka, Jan; Švec, Jan; Ircing, Pavel; Šmídl, Luboš
Citation: LEHEČKA, J., ŠVEC, J., IRCING, P., ŠMÍDL, L. Adjusting BERT’s Pooling Layer for Large-Scale Multi-Label Text Classification. In: Text, Speech, and Dialogue: 23rd International Conference, TSD 2020, Brno, Czech Republic, September 8-11, 2020, Proceedings. Cham: Springer, 2020. pp. 214-221. ISBN 978-3-030-58322-4, ISSN 0302-9743.
Issue Date: 2020
Publisher: Springer
Document type: conference paper (conferenceObject)
Scopus EID: 2-s2.0-85091136861
URI: http://hdl.handle.net/11025/42716
ISBN: 978-3-030-58322-4
ISSN: 0302-9743
Keywords: Text classification, BERT model
Abstract: In this paper, we present our experiments with BERT models in the task of Large-scale Multi-label Text Classification (LMTC). In the LMTC task, each text document can have multiple class labels, while the total number of classes is in the order of thousands. We propose a pooling layer architecture on top of BERT models, which improves the quality of classification by using information from the standard [CLS] token in combination with pooled sequence output. We demonstrate the improvements on Wikipedia datasets in three different languages using public pre-trained BERT models.
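
The abstract describes combining the [CLS] vector with a pooled sequence output before the classification layer. The following is a minimal sketch of one plausible reading of that idea, assuming a HuggingFace-style BERT encoder, mean-pooling over token embeddings as the "pooled sequence output", and an illustrative class name and model checkpoint; it is not the authors' exact architecture.

```python
# Hypothetical sketch: multi-label classifier that concatenates the [CLS]
# embedding with a mean-pooled sequence output. Pooling choice, model name
# and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel


class BertMultiLabelClassifier(nn.Module):
    def __init__(self, model_name: str = "bert-base-multilingual-cased",
                 num_labels: int = 5000):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # The classifier sees [CLS] concatenated with the mean of all token
        # embeddings, hence 2 * hidden input features.
        self.classifier = nn.Linear(2 * hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        hidden_states = out.last_hidden_state            # (B, T, H)
        cls_vec = hidden_states[:, 0]                    # [CLS] token vector
        # Mask padding tokens before averaging the sequence output.
        mask = attention_mask.unsqueeze(-1).float()
        mean_vec = (hidden_states * mask).sum(1) / mask.sum(1).clamp(min=1e-9)
        pooled = torch.cat([cls_vec, mean_vec], dim=-1)  # (B, 2H)
        # Multi-label setup: one independent logit per class; train with
        # BCEWithLogitsLoss and apply a sigmoid at inference time.
        return self.classifier(pooled)
```

For training on an LMTC dataset, the targets would be multi-hot vectors of size `num_labels`, and `torch.nn.BCEWithLogitsLoss` would be applied to the returned logits.
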
Rights: The full text is not publicly accessible.
© Springer
Appears in Collections:Konferenční příspěvky / Conference papers (NTIS)
Konferenční příspěvky / Conference Papers (KKY)
OBD

Files in This Item:
File: Lehečka2020_Chapter_AdjustingBERTSPoolingLayerForL.pdf, 299.49 kB, Adobe PDF (available on request)



