Title: Empirical study on label smoothing in neural networks
Authors: Mezzini, Mauro
Citation: WSCG '2018: short communications proceedings: The 26th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2018, in co-operation with EUROGRAPHICS: University of West Bohemia, Plzen, Czech Republic, May 28 - June 1, 2018, p. 200-205.
Issue Date: 2018
Publisher: Václav Skala - UNION Agency
Document type: conference paper (conferenceObject)
URI: wscg.zcu.cz/WSCG2018/!!_CSRN-2802.pdf
http://hdl.handle.net/11025/34673
ISBN: 978-80-86943-41-1
ISSN: 2464-4617
Keywords: neural networks; label smoothing; regularization; softmax; visual domain
Abstract: Neural networks are nowadays routinely employed in the classification of sets of objects, which consists of predicting the class label of an object. The softmax function is a popular choice for the output function of a neural network. It defines a probability distribution over the class labels, and the label with maximum probability is the prediction of the network for the object being classified. The softmax function is also used to compute the loss function, which evaluates the error made by the network in the classification task. In this paper we consider a simple modification to the loss function, called label smoothing. We evaluated this modification by training a neural network on 12 data sets containing a total of about 1.5 × 10^6 images. We show that this modification allows a neural network to achieve better accuracy in the classification task.
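As an illustration of the technique summarized in the abstract (a minimal sketch, not the paper's exact formulation; the PyTorch framework and the smoothing factor epsilon = 0.1 are assumptions not taken from this record), label smoothing mixes the one-hot target with a uniform distribution over the classes before the cross-entropy is computed from the softmax output:

    import torch
    import torch.nn.functional as F

    def smoothed_cross_entropy(logits, targets, epsilon=0.1):
        # logits:  (batch, num_classes) raw network outputs
        # targets: (batch,) integer class labels
        # epsilon: smoothing factor; epsilon = 0 recovers standard cross-entropy
        num_classes = logits.size(-1)
        log_probs = F.log_softmax(logits, dim=-1)  # log of the softmax output
        # Smoothed target: epsilon/K on every class, plus (1 - epsilon) on the true class
        smooth = torch.full_like(log_probs, epsilon / num_classes)
        smooth.scatter_(1, targets.unsqueeze(1), 1.0 - epsilon + epsilon / num_classes)
        return -(smooth * log_probs).sum(dim=-1).mean()

Called with a batch of logits and integer labels, a function like this can stand in for the standard cross-entropy loss in an otherwise unchanged training loop.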
Rights: © Václav Skala - UNION Agency
Appears in Collections: WSCG '2018: Short Papers Proceedings

Files in This Item:
File: Mezzini.pdf
Description: Full text
Size: 742.38 kB
Format: Adobe PDF


