---
base_model:
- gradientai/Llama-3-70B-Instruct-Gradient-262k
- meta-llama/Meta-Llama-3-70B-Instruct
library_name: transformers
tags:
- mergekit
- peft

---
# Untitled LoRA Model (1)

This is a LoRA adapter extracted from a language model using [mergekit](https://github.com/arcee-ai/mergekit).

## LoRA Details

This LoRA adapter was extracted from [meta-llama/Meta-Llama-3-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) and uses [gradientai/Llama-3-70B-Instruct-Gradient-262k](https://huggingface.co/gradientai/Llama-3-70B-Instruct-Gradient-262k) as a base.

### Parameters

The following command was used to extract this LoRA adapter:

```sh
mergekit-extract-lora gradientai/Llama-3-70B-Instruct-Gradient-262k meta-llama/Meta-Llama-3-70B-Instruct OUTPUT_PATH --rank=32
```
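The `--rank=32` flag sets the rank of the extracted adapter. Conceptually, LoRA extraction approximates the weight difference between the two models with a rank-32 factorization, e.g. via truncated SVD, so that `W_finetuned ≈ W_base + B @ A`. A minimal NumPy sketch of this idea on toy matrices (the sizes and variable names here are purely illustrative, not mergekit's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in weight matrices; real extraction operates layer by layer on
# the fine-tuned model vs. the base model (assumption: small random
# matrices are used here purely for illustration).
d_out, d_in, rank = 64, 48, 32
w_base = rng.standard_normal((d_out, d_in))
# Construct a fine-tuned weight whose delta happens to be low-rank.
w_finetuned = w_base + 0.01 * (
    rng.standard_normal((d_out, rank)) @ rng.standard_normal((rank, d_in))
)

# Approximate the weight delta with a rank-`rank` factorization via
# truncated SVD: delta ≈ B @ A.
delta = w_finetuned - w_base
u, s, vt = np.linalg.svd(delta, full_matrices=False)
b = u[:, :rank] * s[:rank]  # shape (d_out, rank)
a = vt[:rank, :]            # shape (rank, d_in)

# When the chosen rank is at least the true rank of the delta,
# the reconstruction is exact up to floating-point error.
assert np.allclose(w_base + b @ a, w_finetuned, atol=1e-8)
```

For a real delta the extraction is lossy: singular values beyond the chosen rank are discarded, so a higher `--rank` preserves more of the difference between the two models at the cost of a larger adapter.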