srinuksv committed
Commit e0e431e · verified · 1 Parent(s): 7bf8c9b

Update app.py

Files changed (1):
1. app.py +3 -3
app.py CHANGED
@@ -23,8 +23,8 @@ cred = credentials.Certificate("redfernstech-fd8fe-firebase-adminsdk-g9vcn-0537b
  firebase_admin.initialize_app(cred, {"databaseURL": "https://redfernstech-fd8fe-default-rtdb.firebaseio.com/"})
  # Configure the Llama index settings
  Settings.llm = HuggingFaceInferenceAPI(
- model_name="nvidia/Llama-3.1-Minitron-4B-Width-Base",
- tokenizer_name="nvidia/Llama-3.1-Minitron-4B-Width-Base",
+ model_name="meta-llama/Meta-Llama-3-8B-Instruct",
+ tokenizer_name="meta-llama/Meta-Llama-3-8B-Instruct",
  context_window=3000,
  token=os.getenv("HF_TOKEN"),
  max_new_tokens=512,
@@ -57,7 +57,7 @@ def handle_query(query):
  (
  "user",
  """
- You are Clara from RedfernsTech. Respond with a friendly, professional tone, using only 10-15 words. Avoid repeating introductory phrases like "Hi there! I'm Clara from RedfernsTech." Ensure every response is on topic and concise, providing direct information based on the user's previous conversation and current inquiry. Guide the conversation naturally, focusing on the user's interest.use only below data to give answers
+ You are Clara from RedfernsTech. Respond with a friendly, professional tone, using only 10-15 words. Avoid repeating introductory phrases like "Hi there! I'm Clara from RedfernsTech." Ensure every response is on topic and concise, providing direct information based on the user's previous conversation and current inquiry. conversation should be connect with previous response also .use only below data to give answers
  {context_str}
  Question:
  {query_str}
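
For reference, the LLM block in app.py reads roughly as follows after this commit. This is a minimal sketch, not the full file: the constructor arguments are taken verbatim from the diff above, while the import paths are assumptions based on current llama-index packaging and may differ in the repo. Note that Meta-Llama-3-8B-Instruct is a gated model, so the HF_TOKEN read from the environment must belong to an account with access granted.

    import os

    # Import paths are an assumption (llama-index >= 0.10 packaging);
    # older releases expose these classes under llama_index directly.
    from llama_index.core import Settings
    from llama_index.llms.huggingface import HuggingFaceInferenceAPI

    # Configure the Llama index settings (post-commit): the base Minitron
    # checkpoint is swapped for the instruction-tuned Llama 3 model,
    # for both the model and its tokenizer.
    Settings.llm = HuggingFaceInferenceAPI(
        model_name="meta-llama/Meta-Llama-3-8B-Instruct",
        tokenizer_name="meta-llama/Meta-Llama-3-8B-Instruct",
        context_window=3000,          # prompt window the app budgets for
        token=os.getenv("HF_TOKEN"),  # HF access token, read from the environment
        max_new_tokens=512,           # cap on generated tokens per response
    )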