# Scaling Up Natural Language Understanding for Multi-Robots Through the Lens of Hierarchy

[[Homepage](https://nl2hltl.github.io/)] [[Paper](https://arxiv.org/abs/2408.08188)] [[Video](https://youtu.be/o8CRrVK9g9Q)] [[Poster](https://nl2hltl.github.io/media/figures/overview.png)]

> The companion repository for the paper "Scaling Up Natural Language Understanding for Multi-Robots Through the Lens of Hierarchy".

## Introduction

The dataset is adapted from the projects listed below: we replace their task-related descriptions with task-independent ones.

A task-related NL2TL example (from [Efficient-Eng-2-LTL](https://github.com/UM-ARM-Lab/Efficient-Eng-2-LTL)):

```json
"globally ( and ( until ( scan , not ( any cubes ) ) , finally ( any cubes ) ) )": {
    "formula": "globally ( and ( until ( scan , not ( any cubes ) ) , finally ( any cubes ) ) )",
    "raw": "G & U S ! A F A"
},
```

A task-independent NL2TL example:
```json
{"natural": "go through the P01 until you get to the P04", "raw_ltl": "F ( P01 A ( F P04 ) )"}
```

**NOTE:** We mechanically obtain the task-independent descriptions from the task-related ones by noun/phrase substitution. Because this removes semantic information, some NL2TL mappings produced this way are not unique. A minimal sketch of the substitution follows.
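
For illustration, here is a minimal sketch of such a substitution pass, assuming a hand-written table mapping task-specific phrases to generic propositions. The table entries and example sentence below are hypothetical, not data taken from this repo.

```python
import re

# Hypothetical mapping from task-specific phrases to generic propositions.
PHRASE_MAP = {
    "the red room": "P01",
    "the kitchen": "P04",
}

def make_task_independent(sentence: str) -> str:
    """Replace task-related nouns/phrases with task-independent propositions."""
    for phrase, prop in PHRASE_MAP.items():
        # Whole-phrase, case-insensitive replacement.
        sentence = re.sub(re.escape(phrase), prop, sentence, flags=re.IGNORECASE)
    return sentence

print(make_task_independent("go through the red room until you get to the kitchen"))
# -> "go through P01 until you get to P04"
```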

The task-independent data is derived from these task-related NL2TL datasets:
  - [Efficient-Eng-2-LTL](https://github.com/UM-ARM-Lab/Efficient-Eng-2-LTL)
  - [Lang2LTL](https://github.com/h2r/Lang2LTL)
  - [nl2spec](https://github.com/realChrisHahn2/nl2spec)
  - [NL2TL](https://github.com/yongchao98/NL2TL)

## File Structure
  - NL2HLTLTranslator
    - fastapi_server.py: a FastAPI server for translation testing; it runs on localhost:8001
    - mistral7b
      - finetune.py: fine-tuning code (a minimal sketch of the setup follows this list)
      - prediction.py: prediction code (this version does not use sockets)
  - mistral7b_quat8: a model fine-tuned from Mistral-7B with 8-bit quantization
  - NL2TL-dataset: the datasets used
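
The repository's own finetune.py is the source of truth for training details. As a rough orientation only, below is a minimal sketch of how an 8-bit, LoRA-style fine-tune of Mistral-7B is commonly set up with the `transformers` + `peft` stack; the checkpoint name and hyperparameters are assumptions, not values taken from this repo.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "mistralai/Mistral-7B-v0.1"  # assumed base checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # 8-bit weights
    device_map="auto",
)

# Attach LoRA adapters so only a small set of parameters is trained.
lora_cfg = LoraConfig(
    r=16,                                  # assumed rank
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],   # assumed target layers
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()
```
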
## Run
```bash
cd to/this/folder
pip install -e .
python finetune/fastapi_server.py
```
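
Once the server is up, it can be queried over HTTP. The sketch below is a hypothetical client: the `/translate` route, the `natural` payload key, and the response shape are assumptions, so check fastapi_server.py (or the auto-generated docs at http://localhost:8001/docs) for the actual schema.

```python
import requests

# Hypothetical request; the route and payload keys are assumptions.
resp = requests.post(
    "http://localhost:8001/translate",
    json={"natural": "go through the P01 until you get to the P04"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # expected to contain the predicted LTL formula
```
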
## Cite
```bibtex
@misc{xu2024scalingnaturallanguageunderstanding,
      title={Scaling Up Natural Language Understanding for Multi-Robots Through the Lens of Hierarchy},
      author={Shaojun Xu and Xusheng Luo and Yutong Huang and Letian Leng and Ruixuan Liu and Changliu Liu},
      year={2024},
      eprint={2408.08188},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2408.08188},
}
```