Analysis of the impact of parameters in TextGCN
Author(s) - Henrique Varella Ehrenfried, Eduardo Todt
Publication year - 2021
Publication title - Anais do XII Computer on the Beach - COTB '21
Language(s) - English
Resource type - Conference proceedings
DOI - 10.14210/cotb.v12.p014-019
Subject(s) - dropout (neural networks), computer science, machine learning, artificial intelligence
Deep learning models use many parameters to work properly. As these models become more complex, their authors cannot explore in their papers the variation of each parameter of the model. Therefore, this work describes an analysis of the impact of four different parameters (Early Stopping, Learning Rate, Dropout, and Hidden 1) on the TextGCN model. The evaluation used four datasets considered in the original TextGCN publication, obtaining as a side effect small improvements in the results on three of them. The most relevant conclusion is that these parameters influence convergence and accuracy, although individually they do not constitute strong support when aiming to improve on the model's state-of-the-art results.
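As an illustration of the kind of study the abstract describes, below is a minimal, hypothetical sketch of a grid sweep over the four analysed parameters (Early Stopping, Learning Rate, Dropout, and Hidden 1). The function train_textgcn, the candidate values, and the bookkeeping are assumptions made for illustration only; they are not taken from the paper or from the TextGCN codebase.

    from itertools import product
    import random

    def train_textgcn(early_stopping, learning_rate, dropout, hidden1):
        # Placeholder: stands in for one full TextGCN training run and
        # returns test accuracy. Swap in the real training loop to
        # reproduce a study of this kind.
        return random.random()

    # Candidate values are assumptions, not the paper's actual search space.
    early_stopping_vals = [10, 20, 50]        # patience, in epochs
    learning_rates      = [0.02, 0.01, 0.005]
    dropouts            = [0.3, 0.5, 0.7]
    hidden1_sizes       = [100, 200, 300]     # width of the first hidden layer

    results = {}
    for es, lr, dp, h1 in product(early_stopping_vals, learning_rates,
                                  dropouts, hidden1_sizes):
        acc = train_textgcn(early_stopping=es, learning_rate=lr,
                            dropout=dp, hidden1=h1)
        results[(es, lr, dp, h1)] = acc

    best = max(results, key=results.get)
    print("best configuration:", best, "accuracy:", results[best])

Such a sweep varies one combination at a time and records accuracy per configuration, which matches the abstract's conclusion that each parameter affects convergence and accuracy but no single one drives a large improvement.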
