Build a Text Classification Model using LONGFORMER in Python
- Status: Pending
- Prize: $25
- Entries received: 3
I am doing some research and want to see how the Longformer model works: how the attention_mask and global_attention_mask work. Let's say I have a dataset of 50,000 research papers that I want to classify into 4 categories/labels: General, Management, Technical, and Operational. The model has to process the complete article.
Note: I don't need any workaround. No LSTM, BERT, RoBERTa, or any other transformer.
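Not part of the original brief, but since the post asks how attention_mask and global_attention_mask work, here is a minimal, model-free sketch of the two masks under the Hugging Face convention (1 = real token, 0 = padding; global_attention_mask marks the positions, typically the leading CLS/`<s>` token for classification, that attend to and are attended by every other position). The token counts are made up.

```python
# Minimal sketch of Longformer's two masks (Hugging Face convention).
# No model is loaded; the lengths below are placeholders.

def build_masks(num_tokens, max_len, global_positions):
    """attention_mask: 1 for real tokens, 0 for padding.
    global_attention_mask: 1 only at positions given global attention
    (typically position 0, the <s>/CLS token, for classification)."""
    attention_mask = [1] * num_tokens + [0] * (max_len - num_tokens)
    global_attention_mask = [0] * max_len
    for pos in global_positions:
        if pos < num_tokens:          # never give padding global attention
            global_attention_mask[pos] = 1
    return attention_mask, global_attention_mask

attn, glob = build_masks(num_tokens=6, max_len=10, global_positions=[0])
print(attn)  # [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
print(glob)  # [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
```

In a real run these lists become the `attention_mask` and `global_attention_mask` tensors passed to `LongformerForSequenceClassification`; every other token only attends within its local sliding window.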
# Please write some explanatory documentation, a few lines for each function used, covering:
why you chose a sliding window; what can be done if the document is longer than 5,000 words; why this performance metric; why this optimizer; and the role of global attention, since that is what distinguishes Longformer from BERT.
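One common answer to the "longer than 5,000 words" question is to split the token sequence into overlapping chunks that each fit Longformer's 4,096-token window and aggregate the per-chunk logits. This is only a sketch of that idea: `classify_chunk` is a stand-in for a real model forward pass, and the window/stride numbers are illustrative.

```python
# Sketch: handling a document longer than Longformer's 4,096-token window
# by splitting it into overlapping chunks and averaging per-chunk logits.
# `classify_chunk` is a stand-in for a real model forward pass.

def chunk_tokens(tokens, window=4096, stride=3584):
    """Overlapping chunks (512-token overlap) so context is not cut cold."""
    chunks, start = [], 0
    while start < len(tokens):
        chunks.append(tokens[start:start + window])
        if start + window >= len(tokens):
            break
        start += stride
    return chunks

def classify_long_document(tokens, classify_chunk, num_labels=4):
    """Average the per-chunk logits into one document-level prediction."""
    chunks = chunk_tokens(tokens)
    totals = [0.0] * num_labels
    for chunk in chunks:
        logits = classify_chunk(chunk)
        totals = [t + l for t, l in zip(totals, logits)]
    avg = [t / len(chunks) for t in totals]
    return avg.index(max(avg))          # argmax over the 4 labels

fake_doc = list(range(9000))                       # pretend token ids
fake_model = lambda chunk: [0.1, 0.7, 0.1, 0.1]    # always favours label 1
print(classify_long_document(fake_doc, fake_model))  # 1
```

Majority voting over chunk predictions is an equally reasonable aggregation; averaging logits just keeps the sketch short.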
# The input parameters shouldn't just be one.
Let's say we have 4 features: title of the article, date of the article, author, and URL of the article.
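A simple way to feed all four features to a text model is to concatenate them into one string, separated by the model's separator token, before tokenizing. This is a sketch only; `"</s>"` is the RoBERTa/Longformer separator, and the field values are invented.

```python
# Sketch: folding the four metadata features into the text input by
# joining them with Longformer's separator token before tokenizing.
# Field values below are illustrative placeholders.

SEP = "</s>"  # RoBERTa/Longformer separator token

def build_input_text(title, date, author, url, body):
    """One string per example, so the tokenizer sees every feature."""
    return f" {SEP} ".join([title, date, author, url, body])

text = build_input_text(
    title="A Study of Attention",
    date="2020-04-10",
    author="J. Doe",
    url="https://example.org/paper",
    body="Full article text goes here ...",
)
print(text)
```

An alternative is to embed the metadata separately and concatenate with the pooled text representation, but string concatenation needs no architecture changes.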
# It is multi-class text classification, NOT multi-label.
#It has to be LONGFORMER.
# The end result should be a working model. You can take any test dataset and show the output. TRAIN/EVAL/TEST.
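The TRAIN/EVAL/TEST requirement usually means an 80/10/10 split. A minimal, reproducible sketch (the 80/10/10 ratio and seed are assumptions, and the rows are placeholders, not real data):

```python
# Sketch: an 80/10/10 TRAIN/EVAL/TEST split, shuffled with a fixed seed
# so runs are reproducible. Rows are placeholder dicts, not real papers.
import random

def train_eval_test_split(rows, seed=42, train=0.8, eval_frac=0.1):
    rows = rows[:]                       # don't mutate the caller's list
    random.Random(seed).shuffle(rows)
    n_train = int(len(rows) * train)
    n_eval = int(len(rows) * eval_frac)
    return (rows[:n_train],
            rows[n_train:n_train + n_eval],
            rows[n_train + n_eval:])

rows = [{"id": i, "label": i % 4} for i in range(50000)]
tr, ev, te = train_eval_test_split(rows)
print(len(tr), len(ev), len(te))  # 40000 5000 5000
```

In the real pipeline each split's rows would then be tokenized and wrapped in a `Dataset` before training and final evaluation on the held-out test split.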