Using the encoder part only from T5 model
I want to build a classification model that needs only the encoder part of language models. I have tried BERT, RoBERTa, and XLNet, and so far I have been successful.
I now want to test the encoder part on...
ls_grep
Votes: 0
Answers: 1
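The question's code is truncated, but the pattern it asks about is available directly in Transformers via `T5EncoderModel`, which drops the decoder entirely. A minimal offline sketch, using a tiny randomly initialized config as a stand-in for a pretrained checkpoint (in practice you would call `T5EncoderModel.from_pretrained("t5-base")`):

```python
import torch
from transformers import T5Config, T5EncoderModel

# Tiny assumed config so the sketch runs without downloading weights.
config = T5Config(d_model=32, d_ff=64, num_layers=2, num_heads=2, vocab_size=100)
encoder = T5EncoderModel(config)  # encoder-only: no decoder parameters

input_ids = torch.randint(0, 100, (1, 8))  # stand-in for tokenizer output
with torch.no_grad():
    out = encoder(input_ids=input_ids)

# last_hidden_state has shape (batch, seq_len, d_model); mean-pooling it
# gives a fixed-size sentence vector to feed a classification head.
sentence_vec = out.last_hidden_state.mean(dim=1)
```

The encoder's pooled output can then be passed to any classifier head, the same way the asker used BERT or RoBERTa encoders.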
How does fine-tuning a transformer (T5) work?
I am using PyTorch Lightning to fine-tune a T5 transformer on a specific task. However, I have not been able to understand how the fine-tuning works. I always see this code:
tokenizer = AutoTokenizer.from_pret...
Chan Wing
Votes: 0
Answers: 1
Hugging Face not able to reload all weights after training
I have recently been using a RoBERTa-large model, on which I performed downstream training using the "Trainer" API.
All goes well: I see the loss going down, and I manually compare some results with...
Joao de Oliveira
Votes: 0
Answers: 1
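The excerpt doesn't show the save/reload code, but the usual way to verify that all weights survive a round trip is `save_pretrained` / `from_pretrained` (which `trainer.save_model(out_dir)` also uses under the hood). A minimal sketch with an assumed tiny random config standing in for roberta-large:

```python
import tempfile
import torch
from transformers import RobertaConfig, RobertaForSequenceClassification

# Tiny assumed config so the sketch runs offline.
config = RobertaConfig(hidden_size=32, num_hidden_layers=2,
                       num_attention_heads=2, intermediate_size=64,
                       vocab_size=100, max_position_embeddings=40)
model = RobertaForSequenceClassification(config)

with tempfile.TemporaryDirectory() as out_dir:
    model.save_pretrained(out_dir)  # writes config + weight file
    reloaded = RobertaForSequenceClassification.from_pretrained(out_dir)

# Every parameter should round-trip exactly.
for (n1, p1), (n2, p2) in zip(model.named_parameters(),
                              reloaded.named_parameters()):
    assert n1 == n2 and torch.equal(p1, p2)
```

If some weights differ after reloading, a common cause is loading into a different head class than the one that was saved, which reinitializes the mismatched layers.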
How to load Hugging Face's BERT after fine-tuning with PyTorch Lightning?
I fine-tuned a pre-trained BERT model from Hugging Face on a custom dataset for 10 epochs using PyTorch Lightning. I did logging with the Weights and Biases logger.
When I load from checkpoint like so:
mod...
Ilia
Votes: 0
Answers: 1
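The checkpoint-loading code is truncated above, but the usual stumbling block is that a Lightning checkpoint stores the LightningModule's `state_dict`, so the BERT weights are keyed under whatever attribute name the module used (assumed here to be `self.model`, giving keys like `model.bert.embeddings...`). To load them back into a plain Hugging Face model, strip that prefix. A sketch with an assumed tiny random config and a simulated checkpoint, so it runs offline without Lightning installed:

```python
import os
import tempfile
import torch
from transformers import BertConfig, BertForSequenceClassification

# Tiny assumed config standing in for the fine-tuned BERT.
config = BertConfig(hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64,
                    vocab_size=100)
model = BertForSequenceClassification(config)

# Simulate what a Lightning checkpoint contains: the LightningModule's
# state_dict, with the wrapped model under an assumed "model." prefix.
ckpt_path = os.path.join(tempfile.gettempdir(), "finetuned.ckpt")
ckpt = {"state_dict": {f"model.{k}": v for k, v in model.state_dict().items()}}
torch.save(ckpt, ckpt_path)

# Reload into a plain Hugging Face model by stripping the prefix.
state = torch.load(ckpt_path)["state_dict"]
state = {k.removeprefix("model."): v for k, v in state.items()}
fresh = BertForSequenceClassification(config)
fresh.load_state_dict(state)
```

Alternatively, if the original LightningModule class is available, `MyModule.load_from_checkpoint(ckpt_path)` restores the wrapped model directly without any key manipulation.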