PEFT · Safetensors · English
jinjieyuan committed (verified)
Commit ab0d2fe · 1 Parent(s): de759d6

Update README.md

Files changed (1):
  1. README.md (+20 -7)
README.md CHANGED
@@ -29,17 +29,30 @@ Refer to our [repo](https://github.com/IntelLabs/Hardware-Aware-Automated-Machin
  
  ## Model Sources
  
- - **Repository:** [https://github.com/IntelLabs/Hardware-Aware-Automated-Machine-Learning/tree/main/SQFT](https://github.com/IntelLabs/Hardware-Aware-Automated-Machine-Learning/tree/main/SQFT)
- - **Paper:** [SQFT: Low-cost Model Adaptation in Low-precision Sparse Foundation Models](https://arxiv.org/abs/2410.03750)
+ **Repository:** [https://github.com/IntelLabs/Hardware-Aware-Automated-Machine-Learning/tree/main/SQFT](https://github.com/IntelLabs/Hardware-Aware-Automated-Machine-Learning/tree/main/SQFT)
+
+ **Paper:**
+ - [SQFT: Low-cost Model Adaptation in Low-precision Sparse Foundation Models](https://arxiv.org/abs/2410.03750)
+ - [Low-Rank Adapters Meet Neural Architecture Search for LLM Compression](https://arxiv.org/abs/2501.16372)
  
  ## Citation
  
  ```bash
- @article{munoz2024sqft,
- title = {SQFT: Low-cost Model Adaptation in Low-precision Sparse Foundation Models},
- author={J. Pablo Munoz and Jinjie Yuan and Nilesh Jain},
- journal={The 2024 Conference on Empirical Methods in Natural Language Processing (Findings)},
- year={2024}
+ @inproceedings{munoz-etal-2024-sqft,
+ title = "{SQFT}: Low-cost Model Adaptation in Low-precision Sparse Foundation Models",
+ author = "Munoz, Juan Pablo and
+   Yuan, Jinjie and
+   Jain, Nilesh",
+ editor = "Al-Onaizan, Yaser and
+   Bansal, Mohit and
+   Chen, Yun-Nung",
+ booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2024",
+ month = nov,
+ year = "2024",
+ address = "Miami, Florida, USA",
+ publisher = "Association for Computational Linguistics",
+ url = "https://aclanthology.org/2024.findings-emnlp.749",
+ pages = "12817--12832",
  }
  ```
  