merve (HF staff) committed
Commit a9be413 · 1 Parent(s): 4eb4680

Update app.py

Files changed (1)
  1. app.py +1 -1
app.py CHANGED
@@ -49,4 +49,4 @@ st.subheader("Add Personas to Your Conversational Agent using GPT-2")
 st.write("When trained, language models like GPT-2 or DialoGPT are capable of talking like any character you want. If you have a friend-like chatbot (instead of a chatbot built for RPA), you can give your users the option to talk to their favorite character. There are a couple of ways of doing this: you can either fine-tune DialoGPT with sequences of conversation turns, maybe movie dialogues, or infer with a large model like GPT-J. Note that these models might have biases and you will not have any control over the output unless you make an additional effort to filter it.")
 st.write("You can see an [example](https://huggingface.co/docs/transformers/model_doc/dialogpt) of a chatbot that talks like Gandalf, done simply by sending a request to GPT-J through the Inference API.")
 
-I've written the inferences in this blog post with only three lines of code, using [pipelines](https://huggingface.co/docs/transformers/main_classes/pipelines). (yes 🤯) Check out the code of the post [here](https://huggingface.co/spaces/merve/chatbot-blog/blob/main/app.py) on how you can do it too! 🤗
+st.write("I've written the inferences in this blog post with only three lines of code, using [pipelines](https://huggingface.co/docs/transformers/main_classes/pipelines). (yes 🤯) Check out the code of the post [here](https://huggingface.co/spaces/merve/chatbot-blog/blob/main/app.py) on how you can do it too! 🤗")
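The line added in this commit refers to running the post's inferences "with only three lines of code" through pipelines. As a rough illustration of what such a three-line inference can look like (the model name and prompt below are assumptions, not code taken from the Space):

```python
from transformers import pipeline

# Text-generation pipeline around a conversational model; the model choice
# (microsoft/DialoGPT-medium) is an illustrative assumption.
chatbot = pipeline("text-generation", model="microsoft/DialoGPT-medium")

# Generate a short reply to a single user turn.
reply = chatbot("Hello, who are you?", max_new_tokens=50)
print(reply[0]["generated_text"])
```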
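The Gandalf example mentioned on line 50 works by sending a request to GPT-J through the hosted Inference API. A minimal sketch of such a request, assuming the standard api-inference endpoint with a placeholder access token and an illustrative persona-style prompt (neither taken from the linked example):

```python
import requests

# Hosted Inference API endpoint for GPT-J; token and prompt are placeholders.
API_URL = "https://api-inference.huggingface.co/models/EleutherAI/gpt-j-6B"
headers = {"Authorization": "Bearer <YOUR_HF_TOKEN>"}

prompt = "Gandalf is a wise wizard.\nYou: Who are you?\nGandalf:"
response = requests.post(API_URL, headers=headers, json={"inputs": prompt})
print(response.json())
```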