---
license: apache-2.0
datasets:
  - WizardLM/WizardLM_evol_instruct_V2_196k
  - icybee/share_gpt_90k_v1
language:
  - en
widget:
  - example_title: AmberChat
    text: How do I mount a tv to drywall safely?
    output:
      text: >-
        Mounting a TV to drywall safely is crucial to ensure the longevity of
        your TV and the wall. Here's a step-by-step guide on how to do it
        safely:
         1. Choose the right mount: Choose a mount that is compatible with your TV and the wall. Make sure the mount is rated for drywall and not just wood or concrete.
         2. Prepare the wall: Before mounting the TV, make sure the wall is clean and free from any debris or paint. Use a stud finder to locate the studs in the wall.
         3. Install the mount: Follow the manufacturer's instructions for mounting the TV to the wall. Use appropriate hardware and tools to secure the mount to the wall.
         4. Level the TV: Level the TV on the mount using a spirit level or a leveling kit provided by the mount manufacturer.
         5. Attach the TV to the mount: Attach the TV to the mount using the appropriate hardware and tools. Tighten the bolts and screws to ensure the TV is securely attached.
         6. Connect the cables: Connect the TV cables to the appropriate ports on the back of the TV and the mount.
         7. Test the mount: Test the mount to ensure it's secure and stable. Adjust the mount as needed to ensure the TV is level and secure.
         Mounting a TV to drywall safely is crucial to avoid damaging the wall or the TV. Follow these steps carefully and use appropriate tools and hardware to ensure a secure and stable installation.
library_name: transformers
pipeline_tag: text-generation
tags:
  - nlp
  - llm
---

# AmberChat

We present AmberChat, an instruction-following model fine-tuned from LLM360/Amber.

## Model Description

## Loading AmberChat

```python
from transformers import LlamaTokenizer, LlamaForCausalLM

tokenizer = LlamaTokenizer.from_pretrained("LLM360/AmberChat")
model = LlamaForCausalLM.from_pretrained("LLM360/AmberChat")

input_text = "How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
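The snippet above feeds the raw question directly to the model. Instruction-tuned chat models usually respond better when the input is wrapped in the chat template they were trained with; this card does not state AmberChat's template, so the Vicuna/FastChat-style format sketched below is an assumption for illustration only:

```python
def build_prompt(user_message: str) -> str:
    # NOTE: this Vicuna/FastChat-style template is an assumption, not
    # confirmed by this model card; check the model's documentation for
    # the exact format used during fine-tuning.
    system = (
        "A chat between a curious human and an artificial intelligence "
        "assistant. The assistant gives helpful, detailed, and polite "
        "answers to the human's questions."
    )
    return f"{system}\n### Human: {user_message}\n### Assistant:"

prompt = build_prompt("How do I mount a tv to drywall safely?")
```

The resulting `prompt` string would then replace `input_text` in the loading example above.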

## AmberChat Finetuning Details

### DataMix

| Subset | Number of rows | License |
| ------ | -------------- | ------- |
| WizardLM/WizardLM_evol_instruct_V2_196k | 143k | |
| icybee/share_gpt_90k_v1 | 90k | cc0-1.0 |
| Total | 233k | |
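The total in the mix can be sanity-checked with quick arithmetic over the row counts above:

```python
# Row counts (in thousands) from the DataMix table above.
datamix = {
    "WizardLM/WizardLM_evol_instruct_V2_196k": 143,
    "icybee/share_gpt_90k_v1": 90,
}
total_k = sum(datamix.values())
print(f"Total: {total_k}k rows")  # -> Total: 233k rows
```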

### Hyperparameters

| Hyperparameter | Value |
| -------------- | ----- |
| Total Parameters | 6.7B |
| Hidden Size | 4096 |
| Intermediate Size (MLPs) | 11008 |
| Number of Attention Heads | 32 |
| Number of Hidden Layers | 32 |
| RMSNorm ε | 1e-6 |
| Max Seq Length | 2048 |
| Vocab Size | 32000 |
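These hyperparameters match a standard LLaMA-style architecture, and the 6.7B figure can be roughly reproduced from them. The sketch below assumes LLaMA conventions (untied input/output embeddings, four attention projections, a three-matrix gated MLP), which this card does not state explicitly:

```python
# Rough LLaMA-style parameter count from the hyperparameters above.
# Assumes untied embeddings, 4 attention projections (q, k, v, o) and
# a gated MLP with 3 matrices (gate, up, down) -- LLaMA conventions,
# not stated in this card.
vocab, hidden, inter, layers = 32000, 4096, 11008, 32

embeddings = vocab * hidden          # input embedding table
lm_head = vocab * hidden             # output projection (untied)
attention = 4 * hidden * hidden      # q, k, v, o projections
mlp = 3 * hidden * inter             # gate, up, down projections
norms = 2 * hidden                   # two RMSNorms per layer
per_layer = attention + mlp + norms

total = embeddings + lm_head + layers * per_layer + hidden  # + final norm
print(f"{total / 1e9:.2f}B parameters")  # -> 6.74B, i.e. the "6.7B" above
```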

## Evaluation

| Model | MT-Bench |
| ----- | -------- |
| LLM360/Amber 359 | 2.48750 |
| LLM360/AmberChat | 5.428125 |

## Citation

BibTeX:

```bibtex
@article{xxx,
  title={XXX},
  author={XXX},
  journal={XXX},
  year={2023}
}
```