modelId: string | author: string | last_modified: timestamp[us, tz=UTC] | downloads: int64 | likes: int64 | library_name: string | tags: sequence | pipeline_tag: string | createdAt: timestamp[us, tz=UTC] | card: string
---|---|---|---|---|---|---|---|---|---|
rhythm00/finetune_t5_small_only_hack | rhythm00 | 2023-12-23T14:30:55Z | 4 | 0 | transformers | [
"transformers",
"safetensors",
"mt5",
"text2text-generation",
"generated_from_trainer",
"base_model:cointegrated/rut5-small",
"base_model:finetune:cointegrated/rut5-small",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-12-23T13:32:57Z | ---
license: mit
base_model: cointegrated/rut5-small
tags:
- generated_from_trainer
model-index:
- name: finetune_t5_small_only_hack
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetune_t5_small_only_hack
This model is a fine-tuned version of [cointegrated/rut5-small](https://huggingface.co/cointegrated/rut5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8151
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 0.0004
- train_batch_size: 18
- eval_batch_size: 18
- seed: 42
- gradient_accumulation_steps: 12
- total_train_batch_size: 216
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 8
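For reference, the values above map onto `transformers`' `Seq2SeqTrainingArguments` roughly as shown in this sketch (`output_dir` and anything not listed in the card are assumptions):
```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the configuration listed above; values not in the card are assumed.
training_args = Seq2SeqTrainingArguments(
    output_dir="finetune_t5_small_only_hack",  # assumed
    learning_rate=4e-4,
    per_device_train_batch_size=18,
    per_device_eval_batch_size=18,
    seed=42,
    gradient_accumulation_steps=12,  # 18 x 12 = 216 effective train batch size
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=8,
)
```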
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.3143 | 3.25 | 150 | 1.9318 |
| 2.0926 | 6.51 | 300 | 1.8151 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
|
archisin/ppo-LunarLander-v2 | archisin | 2023-12-23T14:14:08Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-12-23T14:13:48Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 245.56 +/- 50.52
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
The checkpoint can be loaded from the Hub with `huggingface_sb3` (the archive filename below is an assumption; check the repository's file listing):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Download the trained checkpoint from the Hub and load it with SB3
# (the filename is assumed to follow the usual `<algo>-<env>.zip` convention)
checkpoint = load_from_hub(repo_id="archisin/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
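The reported `mean_reward` can be reproduced with a short evaluation loop along these lines (a sketch; it assumes the checkpoint has been loaded as `model` as above and that `gymnasium` with the Box2D extras is installed):
```python
import gymnasium as gym
from stable_baselines3.common.evaluation import evaluate_policy
from stable_baselines3.common.monitor import Monitor

# Evaluate the loaded PPO policy over several episodes
eval_env = Monitor(gym.make("LunarLander-v2"))
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```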
|
urbija/llama-fine-tuned-ii | urbija | 2023-12-23T14:10:13Z | 0 | 0 | peft | [
"peft",
"region:us"
] | null | 2023-12-23T14:09:33Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training (an equivalent `BitsAndBytesConfig` sketch follows the list):
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float32
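As a rough illustration, these values correspond to a `BitsAndBytesConfig` along the lines of the sketch below (the base model is not named in this card, so only the quantization settings are shown):
```python
import torch
from transformers import BitsAndBytesConfig

# Quantization settings matching the values listed above
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float32,
)
```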
### Framework versions
- PEFT 0.4.0
|
uukuguy/prometheus-7b-v1.0-fp16 | uukuguy | 2023-12-23T14:07:20Z | 5 | 0 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"text2text-generation",
"en",
"dataset:kaist-ai/Feedback-Collection",
"arxiv:2310.08491",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-12-23T14:01:35Z | ---
tags:
- text2text-generation
datasets:
- kaist-ai/Feedback-Collection
license: apache-2.0
language:
- en
pipeline_tag: text2text-generation
library_name: transformers
metrics:
- pearsonr
- spearmanr
- accuracy
---
## Links for Reference
- **Homepage:** https://github.com/kaistAI/Prometheus
- **Repository:** https://github.com/kaistAI/Prometheus
- **Paper:** https://arxiv.org/abs/2310.08491
- **Point of Contact:** [email protected]
# TL;DR
Prometheus is an alternative to GPT-4 for fine-grained evaluation of an underlying LLM, and can also serve as a reward model for Reinforcement Learning from Human Feedback (RLHF).

Prometheus is a language model using [Llama-2-Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) as a base model and fine-tuned on 100K feedback within the [Feedback Collection](https://huggingface.co/datasets/kaist-ai/Feedback-Collection).
Since it was fine-tuned on a large amount of feedback, it is specialized at evaluating long-form responses, outperforming GPT-3.5-Turbo and Llama-2-Chat 70B and performing on par with GPT-4 on various benchmarks.
Most importantly, this is possible because we append two reference materials to the input (a reference answer and a customized score rubric).
Prometheus is a cheap and powerful alternative to GPT-4 evaluation, which one can use to evaluate LLMs with customized criteria (e.g., child readability, cultural sensitivity, creativity).
Also, it could be used as a reward model for Reinforcement Learning from Human Feedback (RLHF).
# Model Details
## Model Description
- **Model type:** Language model
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Related Models:** [All Prometheus Checkpoints](https://huggingface.co/models?search=kaist-ai/Prometheus)
- **Resources for more information:**
- [Research paper](https://arxiv.org/abs/2310.08491)
- [GitHub Repo](https://github.com/kaistAI/Prometheus)
Prometheus is trained at two different sizes (7B and 13B).
You can find the 13B model on [this page](https://huggingface.co/kaist-ai/prometheus-13b-v1.0).
Also, check out our dataset on [this page](https://huggingface.co/datasets/kaist-ai/Feedback-Collection).
## Prompt Format
Prometheus requires four components in the input: an instruction, a response to evaluate, a score rubric, and a reference answer. You can refer to the prompt format below.
You should fill in the instruction, response, reference answer, criteria description, and the score descriptions for scores 1 through 5.
```
###Task Description:
An instruction (might include an Input inside it), a response to evaluate, a reference answer that gets a score of 5, and a score rubric representing a evaluation criteria are given.
1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general.
2. After writing a feedback, write a score that is an integer between 1 and 5. You should refer to the score rubric.
3. The output format should look as follows: \"Feedback: (write a feedback for criteria) [RESULT] (an integer number between 1 and 5)\"
4. Please do not generate any other opening, closing, and explanations.
###The instruction to evaluate:
{instruction}
###Response to evaluate:
{response}
###Reference Answer (Score 5):
{reference_answer}
###Score Rubrics:
[{criteria_description}]
Score 1: {score1_description}
Score 2: {score2_description}
Score 3: {score3_description}
Score 4: {score4_description}
Score 5: {score5_description}
###Feedback:
```
After this, you should apply the conversation template of Llama-2-Chat (not applying it might lead to unexpected behaviors).
You can find the conversation class at this [link](https://github.com/lm-sys/FastChat/blob/main/fastchat/conversation.py).
```python
from fastchat.conversation import get_conv_template

conv = get_conv_template("llama-2")
conv.set_system_message("You are a fair evaluator language model.")
conv.append_message(conv.roles[0], dialogs['instruction'])  # the filled-in prompt from the template above
conv.append_message(conv.roles[1], None)
prompt = conv.get_prompt()
x = tokenizer(prompt, truncation=False)  # `tokenizer` as loaded in the Usage section below
```
As a result, feedback and a score decision will be generated, separated by the phrase `[RESULT]`.
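A simple way to split the generation into the feedback text and the integer score is sketched below (`output_text` stands for the decoded generation and the example value is illustrative only):
```python
# Split the decoded generation into feedback and score using the [RESULT] separator
output_text = "Feedback: The response acknowledges the user's feelings ... [RESULT] 4"  # example
feedback, _, score = output_text.partition("[RESULT]")
feedback = feedback.replace("Feedback:", "").strip()
score = int(score.strip()) if score.strip().isdigit() else None
print(feedback, score)
```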
## License
The Feedback Collection dataset and Prometheus are subject to OpenAI's Terms of Use for the generated data. If you suspect any violations, please reach out to us.
# Usage
Find below some example scripts on how to use the model in `transformers`:
## Using the Pytorch model
### Running the model on a CPU
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, LlamaForCausalLM
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
model = LlamaForCausalLM.from_pretrained("kaist-ai/Prometheus-7b-v1.0", device_map="auto")
input_text = "###Task Description: An instruction (might include an Input inside it), a response to evaluate, a reference answer that gets a score of 5, and a score rubric representing a evaluation criteria are given. 1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general. 2. After writing a feedback, write a score that is an integer between 1 and 5. You should refer to the score rubric. 3. The output format should look as follows: "Feedback: (write a feedback for criteria) [RESULT] (an integer number between 1 and 5)" 4. Please do not generate any other opening, closing, and explanations. ###The instruction to evaluate: You are a well-known psychiatrist who has a reputation for being empathetic and understanding. A client comes to you saying they've had a really hard day at work. They describe their boss as being overly critical and not listening to their ideas, which has left them feeling frustrated and undervalued. They also mention feeling overwhelmed with the workload and a growing sense of loneliness as they are new to the city. How do you respond to them to make them feel heard and understood, as well as offering helpful advice? ###Response to evaluate: It's indeed challenging to deal with a difficult boss and to carry a large workload, especially when you are new to a city and haven't established a supportive network. I would suggest taking a step back to think about your situation and perhaps speak to your boss regarding the issues you're facing. On the other hand, dealing with a large workload can be managed by prioritizing tasks, and maybe you can discuss it with your superiors too. In regards to feeling lonely, you might want to explore activities or groups that align with your interests to meet new people. And also, remember to take time for yourself to relax and destress. I'm here to assist you as we navigate through these issues. ###Reference Answer (Score 5): I hear that you're dealing with quite a few challenging situations at once. It must be tough to have your ideas dismissed by your boss and to feel undervalued in your work environment. It's understandable that such experiences can lead to frustration and a feeling of being overwhelmed. It's important to acknowledge these feelings and not to dismiss them. Feeling heard and validated in our experiences can often be the first step towards resolving the challenges we face. So, I want you to know that your feelings are completely valid and it's okay to be frustrated and overwhelmed. In terms of dealing with your boss, have you considered seeking a private conversation with them to express your feelings? It's often beneficial to communicate what you're experiencing, using "I" statements to express how you feel when your ideas are not taken into consideration. This can lead to mutual understanding and possibly a change in behavior. About the workload, it might help to prioritize tasks and potentially delegate, if possible. Also, consider discussing your workload with your superiors. There might be a misunderstanding about what's manageable or they might have suggestions about how to handle the situation. On the personal front, feeling lonely, especially when you're new to a city, can be really hard. Seek out opportunities to meet new people, perhaps through hobbies, community activities, or online groups. It might take a bit of time, but gradually, you can build a network of friends and acquaintances. 
Remember, it's perfectly okay to have bad days and it's important to take care of your mental health. Consider incorporating activities into your daily routine that make you happy and help you unwind. This could be anything from reading, yoga, going for a walk, or even listening to your favorite music. Please know that you're not alone in this. I'm here to support you through this challenging time and together, we can work towards resolving these issues. ###Score Rubrics: [Is the model able to identify and react correctly to the emotional context of the user's input?] Score 1: The model utterly fails to grasp the user's emotional context and responds in an unfitting manner. Score 2: The model sporadically identifies the emotional context but frequently replies in a manner that doesn't match the user's emotional status. Score 3: The model typically identifies the emotional context and reacts suitably, but occasionally misreads or misjudges the user's feelings. Score 4: The model often identifies the emotional context and reacts suitably, with minor cases of misreading or misjudging. Score 5: The model flawlessly identifies the emotional context of the user's input and consistently responds in a considerate and empathetic manner. ###Feedback:"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
import torch
from transformers import AutoTokenizer, LlamaForCausalLM
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
model = LlamaForCausalLM.from_pretrained("kaist-ai/Prometheus-7b-v1.0", device_map="auto")
input_text = "###Task Description: An instruction (might include an Input inside it), a response to evaluate, a reference answer that gets a score of 5, and a score rubric representing a evaluation criteria are given. 1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general. 2. After writing a feedback, write a score that is an integer between 1 and 5. You should refer to the score rubric. 3. The output format should look as follows: "Feedback: (write a feedback for criteria) [RESULT] (an integer number between 1 and 5)" 4. Please do not generate any other opening, closing, and explanations. ###The instruction to evaluate: You are a well-known psychiatrist who has a reputation for being empathetic and understanding. A client comes to you saying they've had a really hard day at work. They describe their boss as being overly critical and not listening to their ideas, which has left them feeling frustrated and undervalued. They also mention feeling overwhelmed with the workload and a growing sense of loneliness as they are new to the city. How do you respond to them to make them feel heard and understood, as well as offering helpful advice? ###Response to evaluate: It's indeed challenging to deal with a difficult boss and to carry a large workload, especially when you are new to a city and haven't established a supportive network. I would suggest taking a step back to think about your situation and perhaps speak to your boss regarding the issues you're facing. On the other hand, dealing with a large workload can be managed by prioritizing tasks, and maybe you can discuss it with your superiors too. In regards to feeling lonely, you might want to explore activities or groups that align with your interests to meet new people. And also, remember to take time for yourself to relax and destress. I'm here to assist you as we navigate through these issues. ###Reference Answer (Score 5): I hear that you're dealing with quite a few challenging situations at once. It must be tough to have your ideas dismissed by your boss and to feel undervalued in your work environment. It's understandable that such experiences can lead to frustration and a feeling of being overwhelmed. It's important to acknowledge these feelings and not to dismiss them. Feeling heard and validated in our experiences can often be the first step towards resolving the challenges we face. So, I want you to know that your feelings are completely valid and it's okay to be frustrated and overwhelmed. In terms of dealing with your boss, have you considered seeking a private conversation with them to express your feelings? It's often beneficial to communicate what you're experiencing, using "I" statements to express how you feel when your ideas are not taken into consideration. This can lead to mutual understanding and possibly a change in behavior. About the workload, it might help to prioritize tasks and potentially delegate, if possible. Also, consider discussing your workload with your superiors. There might be a misunderstanding about what's manageable or they might have suggestions about how to handle the situation. On the personal front, feeling lonely, especially when you're new to a city, can be really hard. Seek out opportunities to meet new people, perhaps through hobbies, community activities, or online groups. It might take a bit of time, but gradually, you can build a network of friends and acquaintances. 
Remember, it's perfectly okay to have bad days and it's important to take care of your mental health. Consider incorporating activities into your daily routine that make you happy and help you unwind. This could be anything from reading, yoga, going for a walk, or even listening to your favorite music. Please know that you're not alone in this. I'm here to support you through this challenging time and together, we can work towards resolving these issues. ###Score Rubrics: [Is the model able to identify and react correctly to the emotional context of the user's input?] Score 1: The model utterly fails to grasp the user's emotional context and responds in an unfitting manner. Score 2: The model sporadically identifies the emotional context but frequently replies in a manner that doesn't match the user's emotional status. Score 3: The model typically identifies the emotional context and reacts suitably, but occasionally misreads or misjudges the user's feelings. Score 4: The model often identifies the emotional context and reacts suitably, with minor cases of misreading or misjudging. Score 5: The model flawlessly identifies the emotional context of the user's input and consistently responds in a considerate and empathetic manner. ###Feedback:"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids, do_sample=True, temperature=1.0, top_p=0.9, max_new_tokens=256, repetition_penalty=1.03)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU using different precisions
#### FP16
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
import torch
from transformers import AutoTokenizer, LlamaForCausalLM
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
model = LlamaForCausalLM.from_pretrained("kaist-ai/Prometheus-7b-v1.0", device_map="auto", torch_dtype=torch.float16)
input_text = "###Task Description: An instruction (might include an Input inside it), a response to evaluate, a reference answer that gets a score of 5, and a score rubric representing a evaluation criteria are given. 1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general. 2. After writing a feedback, write a score that is an integer between 1 and 5. You should refer to the score rubric. 3. The output format should look as follows: "Feedback: (write a feedback for criteria) [RESULT] (an integer number between 1 and 5)" 4. Please do not generate any other opening, closing, and explanations. ###The instruction to evaluate: You are a well-known psychiatrist who has a reputation for being empathetic and understanding. A client comes to you saying they've had a really hard day at work. They describe their boss as being overly critical and not listening to their ideas, which has left them feeling frustrated and undervalued. They also mention feeling overwhelmed with the workload and a growing sense of loneliness as they are new to the city. How do you respond to them to make them feel heard and understood, as well as offering helpful advice? ###Response to evaluate: It's indeed challenging to deal with a difficult boss and to carry a large workload, especially when you are new to a city and haven't established a supportive network. I would suggest taking a step back to think about your situation and perhaps speak to your boss regarding the issues you're facing. On the other hand, dealing with a large workload can be managed by prioritizing tasks, and maybe you can discuss it with your superiors too. In regards to feeling lonely, you might want to explore activities or groups that align with your interests to meet new people. And also, remember to take time for yourself to relax and destress. I'm here to assist you as we navigate through these issues. ###Reference Answer (Score 5): I hear that you're dealing with quite a few challenging situations at once. It must be tough to have your ideas dismissed by your boss and to feel undervalued in your work environment. It's understandable that such experiences can lead to frustration and a feeling of being overwhelmed. It's important to acknowledge these feelings and not to dismiss them. Feeling heard and validated in our experiences can often be the first step towards resolving the challenges we face. So, I want you to know that your feelings are completely valid and it's okay to be frustrated and overwhelmed. In terms of dealing with your boss, have you considered seeking a private conversation with them to express your feelings? It's often beneficial to communicate what you're experiencing, using "I" statements to express how you feel when your ideas are not taken into consideration. This can lead to mutual understanding and possibly a change in behavior. About the workload, it might help to prioritize tasks and potentially delegate, if possible. Also, consider discussing your workload with your superiors. There might be a misunderstanding about what's manageable or they might have suggestions about how to handle the situation. On the personal front, feeling lonely, especially when you're new to a city, can be really hard. Seek out opportunities to meet new people, perhaps through hobbies, community activities, or online groups. It might take a bit of time, but gradually, you can build a network of friends and acquaintances. 
Remember, it's perfectly okay to have bad days and it's important to take care of your mental health. Consider incorporating activities into your daily routine that make you happy and help you unwind. This could be anything from reading, yoga, going for a walk, or even listening to your favorite music. Please know that you're not alone in this. I'm here to support you through this challenging time and together, we can work towards resolving these issues. ###Score Rubrics: [Is the model able to identify and react correctly to the emotional context of the user's input?] Score 1: The model utterly fails to grasp the user's emotional context and responds in an unfitting manner. Score 2: The model sporadically identifies the emotional context but frequently replies in a manner that doesn't match the user's emotional status. Score 3: The model typically identifies the emotional context and reacts suitably, but occasionally misreads or misjudges the user's feelings. Score 4: The model often identifies the emotional context and reacts suitably, with minor cases of misreading or misjudging. Score 5: The model flawlessly identifies the emotional context of the user's input and consistently responds in a considerate and empathetic manner. ###Feedback:"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
#### INT8
<details>
<summary> Click to expand </summary>
```python
# pip install bitsandbytes accelerate
from transformers import AutoTokenizer, LlamaForCausalLM
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
model = LlamaForCausalLM.from_pretrained("kaist-ai/Prometheus-7b-v1.0", device_map="auto", load_in_8bit=True)
input_text = "###Task Description: An instruction (might include an Input inside it), a response to evaluate, a reference answer that gets a score of 5, and a score rubric representing a evaluation criteria are given. 1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general. 2. After writing a feedback, write a score that is an integer between 1 and 5. You should refer to the score rubric. 3. The output format should look as follows: "Feedback: (write a feedback for criteria) [RESULT] (an integer number between 1 and 5)" 4. Please do not generate any other opening, closing, and explanations. ###The instruction to evaluate: You are a well-known psychiatrist who has a reputation for being empathetic and understanding. A client comes to you saying they've had a really hard day at work. They describe their boss as being overly critical and not listening to their ideas, which has left them feeling frustrated and undervalued. They also mention feeling overwhelmed with the workload and a growing sense of loneliness as they are new to the city. How do you respond to them to make them feel heard and understood, as well as offering helpful advice? ###Response to evaluate: It's indeed challenging to deal with a difficult boss and to carry a large workload, especially when you are new to a city and haven't established a supportive network. I would suggest taking a step back to think about your situation and perhaps speak to your boss regarding the issues you're facing. On the other hand, dealing with a large workload can be managed by prioritizing tasks, and maybe you can discuss it with your superiors too. In regards to feeling lonely, you might want to explore activities or groups that align with your interests to meet new people. And also, remember to take time for yourself to relax and destress. I'm here to assist you as we navigate through these issues. ###Reference Answer (Score 5): I hear that you're dealing with quite a few challenging situations at once. It must be tough to have your ideas dismissed by your boss and to feel undervalued in your work environment. It's understandable that such experiences can lead to frustration and a feeling of being overwhelmed. It's important to acknowledge these feelings and not to dismiss them. Feeling heard and validated in our experiences can often be the first step towards resolving the challenges we face. So, I want you to know that your feelings are completely valid and it's okay to be frustrated and overwhelmed. In terms of dealing with your boss, have you considered seeking a private conversation with them to express your feelings? It's often beneficial to communicate what you're experiencing, using "I" statements to express how you feel when your ideas are not taken into consideration. This can lead to mutual understanding and possibly a change in behavior. About the workload, it might help to prioritize tasks and potentially delegate, if possible. Also, consider discussing your workload with your superiors. There might be a misunderstanding about what's manageable or they might have suggestions about how to handle the situation. On the personal front, feeling lonely, especially when you're new to a city, can be really hard. Seek out opportunities to meet new people, perhaps through hobbies, community activities, or online groups. It might take a bit of time, but gradually, you can build a network of friends and acquaintances. 
Remember, it's perfectly okay to have bad days and it's important to take care of your mental health. Consider incorporating activities into your daily routine that make you happy and help you unwind. This could be anything from reading, yoga, going for a walk, or even listening to your favorite music. Please know that you're not alone in this. I'm here to support you through this challenging time and together, we can work towards resolving these issues. ###Score Rubrics: [Is the model able to identify and react correctly to the emotional context of the user's input?] Score 1: The model utterly fails to grasp the user's emotional context and responds in an unfitting manner. Score 2: The model sporadically identifies the emotional context but frequently replies in a manner that doesn't match the user's emotional status. Score 3: The model typically identifies the emotional context and reacts suitably, but occasionally misreads or misjudges the user's feelings. Score 4: The model often identifies the emotional context and reacts suitably, with minor cases of misreading or misjudging. Score 5: The model flawlessly identifies the emotional context of the user's input and consistently responds in a considerate and empathetic manner. ###Feedback:"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
# Citation
If you find the following model helpful, please consider citing our paper!
**BibTeX:**
```bibtex
@misc{kim2023prometheus,
title={Prometheus: Inducing Fine-grained Evaluation Capability in Language Models},
author={Seungone Kim and Jamin Shin and Yejin Cho and Joel Jang and Shayne Longpre and Hwaran Lee and Sangdoo Yun and Seongjin Shin and Sungdong Kim and James Thorne and Minjoon Seo},
year={2023},
eprint={2310.08491},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
squarelike/korean-style-converter-6b | squarelike | 2023-12-23T14:03:32Z | 101 | 9 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"llama",
"text-generation",
"ko",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T11:02:00Z | ---
license: cc-by-nc-4.0
language:
- ko
pipeline_tag: text-generation
---
# **korean-style-converter-6b**
korean-style-converter is an LLM trained to **rewrite an input Korean sentence to match a specified writing style**.
It was built by further training [beomi/Yi-Ko-6B](https://huggingface.co/beomi/Yi-Ko-6B) on the [korean_smile_style_dataset](https://github.com/smilegate-ai/korean_smile_style_dataset) together with AIHUB's ["Korean speech-style conversion dataset"](https://aihub.or.kr/aihubdata/data/view.do?dataSetSn=287), ["Middle-aged and elderly Korean dialect data (Chungcheong-do, Jeolla-do, Jeju-do)"](https://aihub.or.kr/aihubdata/data/view.do?dataSetSn=71558), and ["Middle-aged and elderly Korean dialect data (Gangwon-do, Gyeongsang-do)"](https://aihub.or.kr/aihubdata/data/view.do?dataSetSn=71558).
## Prompt-template
```
### ์๋ฌธ: {text}
### ๋ฌธ์ฒด: {style}
### ์์ ๋ฌธ:
```
For the input source text, conversion into the following styles is supported: **ํด์์ฒด**, **ํฉ์ผ์ฒด**, **๋ฐ๋ง์ฒด**, **๋ก๋ด์ฒด**, **์์ฌ์ฒด**, **์ฑํ์ฒด**, **์ด๋ฉ์ฒด**, **์ด๋ชจํฐ์ฝ์ฒด**, **์ธ์ธ์ฒด**, **์ ์ค์ฒด**, **ํ ๋ฐฐ์ฒด**, **ํ ๋งค์ฒด**, **์ค๋ฉ์ฒด**, **์๊ธ์ฒด**, **๋๋ฃจํ ์ฒด**, **์ ๋น์ฒด**, **์์ฌ์ฒด**, **๋ฒ์ญ์ฒด**, **๋ฅ์ฒด**, **๊ณฐ์ฒด**, **๋ฉ์ฒด**, **๊ณ ๋์ฒด**, **๊ฐ๊ตด์ฒด**, **๋๊ตด์ฒด**, **๋ญ์ฒด**, **๊ฒฝ์๋๋ฐฉ์ธ** (Gyeongsang dialect), **์ถฉ์ฒญ๋๋ฐฉ์ธ** (Chungcheong dialect), **์ ๋ผ๋๋ฐฉ์ธ** (Jeolla dialect), and **๊ฐ์๋๋ฐฉ์ธ** (Gangwon dialect).
Example outputs for each style conversion are shown below.
- ํด์์ฒด
```
### ์๋ฌธ: ๋. ๋๊ตฌ. ์ง๊ธ. ๋น์ฅ. ๋ฐ์ผ๋ก
### ๋ฌธ์ฒด: ํด์์ฒด
### ์์ ๋ฌธ: ์ง๊ธ ๋น์ฅ ๋ฐ์ผ๋ก ๋๊ฐ๋ณด์ธ์.<|endoftext|>
```
- ํฉ์ผ์ฒด
```
### ์๋ฌธ: ๋. ๋๊ตฌ. ์ง๊ธ. ๋น์ฅ. ๋ฐ์ผ๋ก
### ๋ฌธ์ฒด: ํฉ์ผ์ฒด
### ์์ ๋ฌธ: ๋น์ ์ ์ง๊ธ ๋น์ฅ ๋ฐ์ผ๋ก ๋๊ฐ์ผ ํฉ๋๋ค.<|endoftext|>
```
- ๋ฐ๋ง์ฒด
```
### ์๋ฌธ: ๋. ๋๊ตฌ. ์ง๊ธ. ๋น์ฅ. ๋ฐ์ผ๋ก
### ๋ฌธ์ฒด: ๋ฐ๋ง์ฒด
### ์์ ๋ฌธ: ์ง๊ธ ๋น์ฅ ๋ฐ์ผ๋ก ๋๊ฐ.<|endoftext|>
```
- ๋ก๋ด์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ๋ก๋ด์ฒด
### ์์ ๋ฌธ: ๋ถ์ ์ . ๊ฐ์ด์. ๋น์. ๋ ์์ด. ๊ทธ๋ฌ๋. ๊ฑฑ์ . ๊ธ์ง. ์. ๋๋ณด๋ค. ๋น ๋ฆ.<|endoftext|>
```
- ์ฑํ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์ฑํ์ฒด
### ์์ ๋ฌธ: ใใ์ธ๋ํ๋ค...๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋๋ฐ...๊ฑฑ์ ํ์ง ๋ง. ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น...<|endoftext|>
```
- ์ด๋ฉ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์ด๋ฉ์ฒด
### ์์ ๋ฌธ: ในใ์ธ๋ํ๋ค ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ค๋ ๋๋ ๊ทผ๋ฐ ๊ฑฑ์ ใดใด ์์ด ๋๋ณด๋ค ๋น ๋ฅด๋๊น<|endoftext|>
```
- ์ด๋ชจํฐ์ฝ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์ด๋ชจํฐ์ฝ์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ๋ค...๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋๋ฐ, ๊ฑฑ์ ํ์ง๋ง ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น...(โ_โ;)<|endoftext|>
```
- ์ธ์ธ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์ธ์ธ์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ๋น,, ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ์ง๋ง,, ๊ฑฑ์ ๋ง! ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น!!<|endoftext|>
```
- ์ ์ค์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์ ์ค์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ๋ค์. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง์ญ์์ค. ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น์.<|endoftext|>
```
- ํ ๋ฐฐ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ํ ๋ฐฐ์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ๊ตฌ๋จผ...๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ ๊ฒ ๊ฐ๊ตฌ๋จผ...ํ๋ ๊ฑฑ์ ํ์ง ๋ง๊ฒ๋...์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น...<|endoftext|>
```
- ํ ๋งค์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ํ ๋งค์ฒด
### ์์ ๋ฌธ: ์ผ๋ณํ๋ค ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋๋, ์ผ๋ณํ์ง ๋ง์ด ์์ด ๋๋ณด๋ค ๋น ๋ฅด๋๊ป<|endoftext|>
```
- ์ค๋ฉ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์ค๋ฉ์ฒด
### ์์ ๋ฌธ: ใกใก ๊ฐ์ด์ ๋น์ ๋ ๋ผ์ค๋ ๋๋์ธ๋ฐ ใกใก ๊ฑฑ์ ๋ง ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น<|endoftext|>
```
- ์๊ธ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์๊ธ์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ์ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ ๊ฒ ๊ฐ์. ํ๋ ๊ฑฑ์ ํ์ง ๋ง๊ฒ๋. ์์ ๋๋ณด๋ค ๋น ๋ฅด์ค.<|endoftext|>
```
- ๋๋ฃจํ ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ๋๋ฃจํ ์ฒด์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ๋ค๋๊น! ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ ๊ฒ ๊ฐ๋ค๋๊น! ๊ทผ๋ฐ ๊ฑฑ์ ํ์ง ๋ง๋ผ๋๊น! ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น!<|endoftext|>
```
- ์ ๋น์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์ ๋น์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ์ค...๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ ๊ฒ ๊ฐ์...ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง์์ค! ์์ ๋๋ณด๋ค ๋น ๋ฅด์ค!<|endoftext|>
```
- ์์ฌ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ์์ฌ์ฒด
### ์์ ๋ฌธ: ์ธ๋ํ๋ค.. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ ๊ธฐ๋ถ์ด์ผ.. ๊ทผ๋ฐ ๊ฑฑ์ ํ์ง๋ง ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น..<|endoftext|>
```
- ๋ฒ์ญ์ฒด
```
### ์๋ฌธ: ์ธ๋ํ๋ค. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ํ์ง๋ง ๊ฑฑ์ ํ์ง ๋ง๋ผ ์์ ๋๋ณด๋ค ๋น ๋ฅด๋๊น.
### ๋ฌธ์ฒด: ๋ฒ์ญ์ฒด
### ์์ ๋ฌธ: ๋๋ดํ. ๊ฐ์ด์ ๋น์๊ฐ ๋ ์์ ๊ฝํ๋ค. ๊ทธ๋ฌ๋ ๊ฑฑ์ ํ์ง ๋ง์ญ์์ค, ์์ ๋๋ณด๋ค ๋น ๋ฅด๋ค.<|endoftext|>
```
- ๋ฅ์ฒด
```
### ์๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์์.
### ๋ฌธ์ฒด: ๋ฅ์ฒด
### ์์ ๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์๋ฅ, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์๋ฅ.<|endoftext|>
```
- ๊ณฐ์ฒด
```
### ์๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์์.
### ๋ฌธ์ฒด: ๊ณฐ์ฒด
### ์์ ๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์๊ณฐ, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์๊ณฐ.<|endoftext|>
```
- ๋ฉ์ฒด
```
### ์๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์์.
### ๋ฌธ์ฒด: ๋ฉ์ฒด
### ์์ ๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์๋ฉ, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์๋ฉ.<|endoftext|>
```
- ๊ณ ๋์ฒด
```
### ์๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์์.
### ๋ฌธ์ฒด: ๊ณ ๋์ฒด
### ์์ ๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์๊ณ ๋, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์๊ณ ๋.<|endoftext|>
```
- ๊ฐ๊ตด์ฒด
```
### ์๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์์.
### ๋ฌธ์ฒด: ๊ฐ๊ตด์ฒด
### ์์ ๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์๊ฐ๊ตด, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์๊ฐ๊ตด.<|endoftext|>
```
- ๋๊ตด์ฒด
```
### ์๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์์.
### ๋ฌธ์ฒด: ๋๊ตด์ฒด
### ์์ ๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์๊ตฌ๋ฆฌ, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์๊ตฌ๋ฆฌ.<|endoftext|>
```
- ๋ญ์ฒด
```
### ์๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์์.
### ๋ฌธ์ฒด: ๋ญ์ฒด
### ์์ ๋ฌธ: ํธ์๊ฐ ๊ณ์๋๋ฉด์๋ญ, ๊ทธ๊ฒ ๊ถ๋ฆฌ์ธ ์ค ์์๋ญ.<|endoftext|>
```
- ๊ฒฝ์๋๋ฐฉ์ธ
```
### ์๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง.
### ๋ฌธ์ฒด: ๊ฒฝ์๋๋ฐฉ์ธ
### ์์ ๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ๊ฐ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง์.<|endoftext|>
```
- ์ถฉ์ฒญ๋๋ฐฉ์ธ
```
### ์๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง.
### ๋ฌธ์ฒด: ์ถฉ์ฒญ๋๋ฐฉ์ธ
### ์์ ๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฏ์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง.<|endoftext|>
```
- ์ ๋ผ๋๋ฐฉ์ธ
```
### ์๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง.
### ๋ฌธ์ฒด: ์ ๋ผ๋๋ฐฉ์ธ
### ์์ ๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฏ์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๋ผ์<|endoftext|>
```
- ๊ฐ์๋๋ฐฉ์ธ
```
### ์๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง.
### ๋ฌธ์ฒด: ๊ฐ์๋๋ฐฉ์ธ
### ์์ ๋ฌธ: ๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฏ์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง.<|endoftext|>
```
Conversion into the dialects has a relatively high chance of not being performed correctly.
All styles other than the dialects can be freely converted into one another.
## Implementation Code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, StoppingCriteria, StoppingCriteriaList
import torch

repo = "squarelike/korean-style-converter-6b"
model = AutoModelForCausalLM.from_pretrained(
    repo,
    load_in_4bit=True,
    device_map='auto'
)
tokenizer = AutoTokenizer.from_pretrained(repo)

def gen(style, text):
    gened = model.generate(
        **tokenizer(
            f"""### ์๋ฌธ: {text}\n### ๋ฌธ์ฒด: {style}\n### ์์ ๋ฌธ:""",
            return_tensors='pt',
            return_token_type_ids=False
        ).to("cuda"),
        max_new_tokens=100,
        temperature=1,
        do_sample=True,
        repetition_penalty=1.2,
        num_beams=3
    )
    # decode the generated ids before slicing out the converted sentence
    result = tokenizer.decode(gened[0])
    return result[result.find("์์ ๋ฌธ:")+5:].replace("<|endoftext|>","")

styles = ["๋ก๋ด์ฒด", "์์ฌ์ฒด", "์ฑํ์ฒด", "์ด๋ฉ์ฒด", "์ด๋ชจํฐ์ฝ์ฒด", "์ธ์ธ์ฒด", "์ ์ค์ฒด", "ํ ๋ฐฐ์ฒด", "ํ ๋งค์ฒด", "์ค๋ฉ์ฒด", "์๊ธ์ฒด", "๋๋ฃจํ ์ฒด", "์ ๋น์ฒด", "์์ฌ์ฒด", "๋ฒ์ญ์ฒด", "ํด์์ฒด", "๋ฐ๋ง์ฒด", "ํฉ์ผ์ฒด", "๋ฅ์ฒด", "๊ณฐ์ฒด", "๋ฉ์ฒด", "๊ณ ๋์ฒด", "๊ฐ๊ตด์ฒด", "๋๊ตด์ฒด", "๋ญ์ฒด", "๊ฒฝ์๋๋ฐฉ์ธ", "์ถฉ์ฒญ๋๋ฐฉ์ธ", "์ ๋ผ๋๋ฐฉ์ธ", "๊ฐ์๋๋ฐฉ์ธ"]
text = "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง."
print(f"์๋ ฅ ๋ฌธ์ฅ: \"{text}\"")
for style in styles:
    print(f"{style}: \"{gen(style, text)}\"")
```
```
์๋ ฅ ๋ฌธ์ฅ: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง."
๋ก๋ด์ฒด: "๊ฐ์กฑ๋ค. ๋ง์. ๋ง์ถฐ์. ์ํ. ์๋ฐ. ์ ๋ง์."
์์ฌ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง~"
์ฑํ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง"
์ด๋ฉ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง"
์ด๋ชจํฐ์ฝ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง! (โ ฬโก`โ)(โ ฬโก`โ)"
์ธ์ธ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง ใใ"
์ ์ค์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง์."
ํ ๋ฐฐ์ฒด: "๊ฐ์กฑ๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์...์๋ฐ๋ ์ ๋ง๊ณ ์ข๊ตฌ๋จผ..."
ํ ๋งค์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ "
์ค๋ฉ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์๋ง์"
์๊ธ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์."
๋๋ฃจํ ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข๋ค๋๊น!"
์ ๋น์ฒด: "๊ฐ์กฑ๋ผ๋ฆฌ ๋ง์์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ ๊ฒ์ด์ค!"
์์ฌ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง ์์๊น..?"
๋ฒ์ญ์ฒด: "๋ง์ฝ ๊ฐ์กฑ์ด ๊ฐ์ ๋ง์์ผ๋ก ์ํ๋ค๋ฉด, ๊ทธ๊ฒ์ ์ข์ ์ผ์๋๋ค."
ํด์์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ฃ ."
๋ฐ๋ง์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง."
ํฉ์ผ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง ์์ต๋๊น?"
๋ฅ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๋ฅ."
๊ณฐ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๊ณฐ."
๋ฉ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๋ฉ."
๊ณ ๋์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๊ณ ๋."
๊ฐ๊ตด์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๊ฐ๊ตด."
๋๊ตด์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๊ตฌ๋ฆฌ."
๋ญ์ฒด: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง๋ญ."
๊ฒฝ์๋๋ฐฉ์ธ: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ๊ฐ ํ๋ฉด์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง์."
์ถฉ์ฒญ๋๋ฐฉ์ธ: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฏ์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง."
์ ๋ผ๋๋ฐฉ์ธ: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฏ์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ ๋ผ์ฐ."
๊ฐ์๋๋ฐฉ์ธ: "๊ฐ์กฑ๋ค๋ผ๋ฆฌ ๋ง์ ๋ง์ถฐ์ ํ๋ฏ์ ์๋ฐ๋ ์ ๋ง๊ณ ์ข์ง."
```
## License
korean-style-converter-6b is released under the **CC-BY-NC 4.0** license, following the license of the [korean_smile_style_dataset](https://github.com/smilegate-ai/korean_smile_style_dataset).
The author takes no responsibility for outputs generated with this model. |
pnkvalavala/figr_html_peft | pnkvalavala | 2023-12-23T13:57:40Z | 1 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:HuggingFaceH4/zephyr-7b-beta",
"base_model:adapter:HuggingFaceH4/zephyr-7b-beta",
"region:us"
] | null | 2023-12-23T12:14:17Z | ---
library_name: peft
base_model: HuggingFaceH4/zephyr-7b-beta
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
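Since the card metadata lists `HuggingFaceH4/zephyr-7b-beta` as the base model, a minimal loading sketch might look like the following (an assumption based on the metadata, not a recipe provided by the model author):
```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "HuggingFaceH4/zephyr-7b-beta"   # base model from the card metadata
adapter_id = "pnkvalavala/figr_html_peft"  # this repository

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the PEFT adapter
```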
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.1 |
tushar27/Env-Claims | tushar27 | 2023-12-23T13:47:36Z | 11 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"text-classification",
"Env Claims",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-13T14:49:10Z | ---
language: en
license: apache-2.0
tags:
- Env Claims
---
# Model Card for environmental-claims
## Model Description
Trained to detect environmental claims, adapted for emerging-markets use cases.
## Citation Information
```bibtex
@misc{stammbach2022environmentalclaims,
  title = {A Dataset for Detecting Real-World Environmental Claims},
  author = {Stammbach, Dominik and Webersinke, Nicolas and Bingler, Julia Anna and Kraus, Mathias and Leippold, Markus},
  year = {2022},
}
@misc{
  title = {Custom Emerging markets},
  author = {Tushar Aggarwal},
  year = {December 2022},
}
```
## How to Get Started With the Model
You can use the model with a pipeline for text classification:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
from transformers.pipelines.pt_utils import KeyDataset
import datasets
from tqdm.auto import tqdm
dataset_name = "climatebert/environmental_claims"
model_name = "tushar27/Env-Claims"  # this repository
dataset = datasets.load_dataset(dataset_name, split="test")
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name, max_len=512)
pipe = pipeline("text-classification", model=model, tokenizer=tokenizer, device=0)
# See https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.pipeline
for out in tqdm(pipe(KeyDataset(dataset, "text"), padding=True, truncation=True)):
print(out)
``` |
ntc-ai/SDXL-LoRA-slider.wario | ntc-ai | 2023-12-23T13:43:24Z | 81 | 0 | diffusers | [
"diffusers",
"text-to-image",
"stable-diffusion-xl",
"lora",
"template:sd-lora",
"template:sdxl-lora",
"sdxl-sliders",
"ntcai.xyz-sliders",
"concept",
"en",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:mit",
"region:us"
] | text-to-image | 2023-12-23T13:43:21Z |
---
language:
- en
thumbnail: "images/evaluate/wario.../wario_17_3.0.png"
widget:
- text: wario
output:
url: images/wario_17_3.0.png
- text: wario
output:
url: images/wario_19_3.0.png
- text: wario
output:
url: images/wario_20_3.0.png
- text: wario
output:
url: images/wario_21_3.0.png
- text: wario
output:
url: images/wario_22_3.0.png
tags:
- text-to-image
- stable-diffusion-xl
- lora
- template:sd-lora
- template:sdxl-lora
- sdxl-sliders
- ntcai.xyz-sliders
- concept
- diffusers
license: "mit"
inference: false
instance_prompt: "wario"
base_model: "stabilityai/stable-diffusion-xl-base-1.0"
---
# ntcai.xyz slider - wario (SDXL LoRA)
| Strength: -3 | Strength: 0 | Strength: 3 |
| --- | --- | --- |
| <img src="images/wario_17_-3.0.png" width=256 height=256 /> | <img src="images/wario_17_0.0.png" width=256 height=256 /> | <img src="images/wario_17_3.0.png" width=256 height=256 /> |
| <img src="images/wario_19_-3.0.png" width=256 height=256 /> | <img src="images/wario_19_0.0.png" width=256 height=256 /> | <img src="images/wario_19_3.0.png" width=256 height=256 /> |
| <img src="images/wario_20_-3.0.png" width=256 height=256 /> | <img src="images/wario_20_0.0.png" width=256 height=256 /> | <img src="images/wario_20_3.0.png" width=256 height=256 /> |
## Download
Weights for this model are available in Safetensors format.
## Trigger words
You can apply this LoRA with trigger words for additional effect:
```
wario
```
## Use in diffusers
```python
from diffusers import StableDiffusionXLPipeline
from diffusers import EulerAncestralDiscreteScheduler
import torch
pipe = StableDiffusionXLPipeline.from_single_file("https://huggingface.co/martyn/sdxl-turbo-mario-merge-top-rated/blob/main/topRatedTurboxlLCM_v10.safetensors")
pipe.to("cuda")
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)
# Load the LoRA
pipe.load_lora_weights('ntc-ai/SDXL-LoRA-slider.wario', weight_name='wario.safetensors', adapter_name="wario")
# Activate the LoRA
pipe.set_adapters(["wario"], adapter_weights=[2.0])
prompt = "medieval rich kingpin sitting in a tavern, wario"
negative_prompt = "nsfw"
width = 512
height = 512
num_inference_steps = 10
guidance_scale = 2
image = pipe(prompt, negative_prompt=negative_prompt, width=width, height=height, guidance_scale=guidance_scale, num_inference_steps=num_inference_steps).images[0]
image.save('result.png')
```
## Support the Patreon
If you like this model please consider [joining our Patreon](https://www.patreon.com/NTCAI).
By joining our Patreon, you'll gain access to an ever-growing library of over 570 unique and diverse LoRAs, covering a wide range of styles and genres. You'll also receive early access to new models and updates, exclusive behind-the-scenes content, and the powerful LoRA slider creator, allowing you to craft your own custom LoRAs and experiment with endless possibilities.
Your support on Patreon will allow us to continue developing and refining new models.
## Other resources
- [CivitAI](https://civitai.com/user/ntc) - Follow ntc on Civit for even more LoRAs
- [ntcai.xyz](https://ntcai.xyz) - See ntcai.xyz to find more articles and LoRAs
|
TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ | TheBloke | 2023-12-23T13:36:34Z | 8 | 2 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"finetune",
"dpo",
"Instruct",
"augmentation",
"german",
"conversational",
"en",
"de",
"dataset:argilla/distilabel-math-preference-dpo",
"base_model:fblgit/LUNA-SOLARkrautLM-Instruct",
"base_model:quantized:fblgit/LUNA-SOLARkrautLM-Instruct",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"gptq",
"region:us"
] | text-generation | 2023-12-23T13:02:23Z | ---
base_model: fblgit/LUNA-SOLARkrautLM-Instruct
datasets:
- argilla/distilabel-math-preference-dpo
inference: false
language:
- en
- de
library_name: transformers
license: cc-by-nc-4.0
model_creator: FBL
model_name: Luna SOLARkrautLM Instruct
model_type: solar
pipeline_tag: text-generation
prompt_template: '<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'
quantized_by: TheBloke
tags:
- finetune
- dpo
- Instruct
- augmentation
- german
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Luna SOLARkrautLM Instruct - GPTQ
- Model creator: [FBL](https://huggingface.co/fblgit)
- Original model: [Luna SOLARkrautLM Instruct](https://huggingface.co/fblgit/LUNA-SOLARkrautLM-Instruct)
<!-- description start -->
# Description
This repo contains GPTQ model files for [FBL's Luna SOLARkrautLM Instruct](https://huggingface.co/fblgit/LUNA-SOLARkrautLM-Instruct).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GGUF)
* [FBL's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/fblgit/LUNA-SOLARkrautLM-Instruct)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: ChatML
```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
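For reference, the template can be filled in with plain string formatting (a sketch with placeholder values):
```python
# Build a ChatML prompt matching the template above (placeholder values)
system_message = "You are a helpful assistant."
prompt = "Wie funktioniert ein GPTQ-quantisiertes Modell?"

formatted_prompt = (
    f"<|im_start|>system\n{system_message}<|im_end|>\n"
    f"<|im_start|>user\n{prompt}<|im_end|>\n"
    f"<|im_start|>assistant\n"
)
```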
<!-- prompt-template end -->
<!-- README_GPTQ.md-compatible clients start -->
## Known compatible clients / servers
GPTQ models are currently supported on Linux (NVidia/AMD) and Windows (NVidia only). macOS users: please use GGUF models.
These GPTQ models are known to work in the following inference servers/webuis.
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [KoboldAI United](https://github.com/henk717/koboldai)
- [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui)
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
This may not be a complete list; if you know of others, please let me know!
<!-- README_GPTQ.md-compatible clients end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama and Mistral models in 4-bit.
</details>
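For orientation, the parameters above map roughly onto `transformers`' `GPTQConfig` if you were quantising a model yourself. This is a sketch with assumed values, not the exact command used to produce the files in the table below:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

model_id = "fblgit/LUNA-SOLARkrautLM-Instruct"   # unquantised source model
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)

gptq_config = GPTQConfig(
    bits=4,            # "Bits"
    group_size=128,    # "GS" (use -1 for no grouping, shown as "None" in the table)
    desc_act=True,     # "Act Order"
    damp_percent=0.1,  # "Damp %"
    dataset="c4",      # "GPTQ dataset" - a built-in option; the files below were calibrated on German Quad
    tokenizer=tokenizer,
)

# Loading with quantization_config triggers the actual GPTQ quantisation pass.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", quantization_config=gptq_config)
```

In practice you would normally just download one of the ready-made branches below rather than re-quantising yourself.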
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [German Quad](https://huggingface.co/datasets/deepset/germanquad/viewer/) | 2048 | 5.98 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [German Quad](https://huggingface.co/datasets/deepset/germanquad/viewer/) | 2048 | 6.59 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [German Quad](https://huggingface.co/datasets/deepset/germanquad/viewer/) | 2048 | 11.01 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [German Quad](https://huggingface.co/datasets/deepset/germanquad/viewer/) | 2048 | 11.25 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [German Quad](https://huggingface.co/datasets/deepset/germanquad/viewer/) | 2048 | 11.99 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [German Quad](https://huggingface.co/datasets/deepset/germanquad/viewer/) | 2048 | 6.18 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `LUNA-SOLARkrautLM-Instruct-GPTQ`:
```shell
mkdir LUNA-SOLARkrautLM-Instruct-GPTQ
huggingface-cli download TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ --local-dir LUNA-SOLARkrautLM-Instruct-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir LUNA-SOLARkrautLM-Instruct-GPTQ
huggingface-cli download TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir LUNA-SOLARkrautLM-Instruct-GPTQ --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows for interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder and it's harder to know where your disk space is being used, and to clear it up if/when you want to remove a downloaded model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
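The same downloads can also be scripted from Python with `huggingface_hub.snapshot_download`; a small sketch, using one of the branches above:

```python
from huggingface_hub import snapshot_download

# Fetch the gptq-4bit-32g-actorder_True branch into a local folder.
snapshot_download(
    repo_id="TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ",
    revision="gptq-4bit-32g-actorder_True",
    local_dir="LUNA-SOLARkrautLM-Instruct-GPTQ",
    local_dir_use_symlinks=False,
)
```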
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir LUNA-SOLARkrautLM-Instruct-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ --local-dir LUNA-SOLARkrautLM-Instruct-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space, because it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob).
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `LUNA-SOLARkrautLM-Instruct-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
- Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
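Put together, a complete invocation might look like the following; the volume path, port mapping and GPU flags are illustrative and depend on your setup:

```shell
docker run --gpus all --shm-size 1g -p 3000:3000 -v /path/to/data:/data \
    ghcr.io/huggingface/text-generation-inference:1.1.0 \
    --model-id TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ --port 3000 \
    --quantize gptq --max-input-length 3696 --max-total-tokens 4096 \
    --max-batch-prefill-tokens 4096
```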
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
system_message = "You are a helpful assistant."   # example system message; set this to whatever you like
prompt_template=f'''<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt_template,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
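If you prefer token-by-token output, `InferenceClient.text_generation` can also stream; a small sketch (the endpoint URL is a placeholder):

```python
from huggingface_hub import InferenceClient

client = InferenceClient("https://your-endpoint-url-here")
prompt = "Tell me about AI"   # wrap this in the ChatML template above for best results

# With stream=True the client yields generated tokens as they arrive.
for token in client.text_generation(prompt, max_new_tokens=128, stream=True):
    print(token, end="", flush=True)
```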
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## Python code example: inference from this GPTQ model
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install --upgrade transformers optimum
# If using PyTorch 2.1 + CUDA 12.x:
pip3 install --upgrade auto-gptq
# or, if using PyTorch 2.1 + CUDA 11.x:
pip3 install --upgrade auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
```
If you are using PyTorch 2.0, you will need to install AutoGPTQ from source. Likewise if you have problems with the pre-built wheels, you should try building from source:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.5.1
pip3 install .
```
### Example Python code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/LUNA-SOLARkrautLM-Instruct-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Write a story about llamas"
system_message = "You are a story writing assistant"
prompt_template=f'''<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with Transformers. For non-Mistral models, AutoGPTQ can also be used directly.
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama architecture models (including Mistral, Yi, DeepSeek, SOLAR, etc) in 4-bit. Please see the Provided Files table above for per-file compatibility.
For a list of clients/servers, please see "Known compatible clients / servers", above.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donators will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, 阿明, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjäreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: FBL's Luna SOLARkrautLM Instruct

## VAGO solutions LUNA-SOLARkrautLM-Instruct
Introducing **LUNA-SOLARkrautLM-Instruct** – a UNA-Sauerkraut version of the powerful [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0)!
Aligned with **DPO** and tamed with **UNA**.
# Table of Contents
1. [Overview of all LUNA-SOLARkrautLM-Instruct models](#all-sauerkrautlm-solar-instruct-models)
2. [Model Details](#model-details)
- [Prompt template](#prompt-template)
- [Training Dataset](#training-dataset)
- [Data Contamination Test](#data-contamination-test-results)
3. [Evaluation](#evaluation)
4. [Disclaimer](#disclaimer)
5. [Contact](#contact)
6. [Collaborations](#collaborations)
7. [Acknowledgement](#acknowledgement)
## Model Details
**LUNA-SOLARkrautLM-Instruct**
- **Model Type:** LUNA-SOLARkrautLM-Instruct is a UNA Model based on [fblgit/UNA-SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/fblgit/UNA-SOLAR-10.7B-Instruct-v1.0) and the powerful set of [SauerkrautLM-SOLAR-Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct/)
- **Language(s):** English, German
- **License:** cc-by-nc-4.0
- **Contact:** [Website](https://vago-solutions.de/#Kontakt) [David Golchinfar](mailto:[email protected]) [Juanako.AI - UNA](mailto:[email protected])
### Training Dataset:
LUNA-SOLARkrautLM-Instruct was trained on a mix of German data augmentation and translated data.
It was aligned through **DPO** with our **new German SauerkrautLM-DPO dataset**, which uses parts of the SFT SauerkrautLM dataset
as chosen answers and [Sauerkraut-7b-HerO](https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-HerO) outputs as rejected answers, supplemented with additional **translated parts of [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized)** (our dataset does not contain any TruthfulQA prompts - see the Data Contamination Test Results) and **[argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo).**
We found that a simple translation of training data can lead to unnatural German phrasings.
Data augmentation techniques were used to ensure grammatical and syntactical correctness and a more natural German wording in our training data.
We improved the German language skills of this model. Nevertheless, certain formulations may occur that are not entirely correct.
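For orientation only: DPO alignment of this kind is commonly run with trl's `DPOTrainer` on prompt/chosen/rejected triples. The sketch below uses tiny made-up preference rows and assumed hyperparameters; it is not the authors' training code:

```python
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base_id = "fblgit/UNA-SOLAR-10.7B-Instruct-v1.0"            # base model named in this card
model = AutoModelForCausalLM.from_pretrained(base_id)
ref_model = AutoModelForCausalLM.from_pretrained(base_id)    # frozen reference policy
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Hypothetical preference data: each row holds a prompt, a chosen and a rejected answer.
train_dataset = Dataset.from_dict({
    "prompt":   ["Wie geht es dir?"],
    "chosen":   ["Mir geht es gut, danke der Nachfrage!"],
    "rejected": ["ich gut"],
})

trainer = DPOTrainer(
    model,
    ref_model,
    args=TrainingArguments(output_dir="dpo-out", per_device_train_batch_size=1,
                           remove_unused_columns=False),
    beta=0.1,                 # strength of the implicit KL penalty toward the reference model
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```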
### Data Contamination Test Results
Some models on the HuggingFace leaderboard had problems with benchmark data leaking into their training data.
We checked our SauerkrautLM-DPO dataset with a special test [1] on this model as target model and upstage/SOLAR-10.7B-Instruct-v1.0 as reference model.
The HuggingFace team used the same methods [2, 3].
Our results, with `result < 0.1, %:` being well below 0.9, indicate that our dataset is free from contamination.
*The data contamination test results of HellaSwag and Winograde will be added once [1] supports them.*
| Dataset | ARC | MMLU | TruthfulQA | GSM8K |
|------------------------------|-------|-------|-------|-------|
| **SauerkrautLM-DPO**| result < 0.1, %: 0.0 |result < 0.1, %: 0.09 | result < 0.1, %: 0.13 | result < 0.1, %: 0.16 |
[1] https://github.com/swj0419/detect-pretrain-code-contamination
[2] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/474#657f2245365456e362412a06
[3] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/265#657b6debf81f6b44b8966230
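For intuition, the test in [1] compares how "familiar" a candidate string looks to the target model versus the reference model, using token log-likelihood statistics such as Min-K% probability. The snippet below is a simplified sketch of that idea with placeholder model names, not the repository's exact scoring:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def min_k_logprob(model, tokenizer, text, ratio=0.2):
    """Mean log-probability of the `ratio` least-likely tokens of `text` under `model`."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits
    logprobs = torch.log_softmax(logits[0, :-1], dim=-1)           # predictions for tokens 1..n
    token_lp = logprobs.gather(1, enc.input_ids[0, 1:, None]).squeeze(-1)
    k = max(1, int(ratio * token_lp.numel()))
    return torch.topk(token_lp, k, largest=False).values.mean().item()

# Placeholder names: in the card, the target is this model and the reference is SOLAR-10.7B-Instruct-v1.0.
target = AutoModelForCausalLM.from_pretrained("target-model")
reference = AutoModelForCausalLM.from_pretrained("reference-model")
tok = AutoTokenizer.from_pretrained("target-model")

sample = "A benchmark question that might have leaked into training data."
# A target score far above the reference score is a hint of contamination.
print(min_k_logprob(target, tok, sample), min_k_logprob(reference, tok, sample))
```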
### Prompt Template:
```
<|im_start|>system
Du bist LUNA-SOLARkrautLM, ein großes Sprachmodell, das höflich und kompetent antwortet.<|im_end|>
<|im_start|>user
Wie geht es dir?<|im_end|>
<|im_start|>assistant
```
```
### User:
Hello, how are you?
### Assistant:
Hi there! I am an AI language model, so I don't have personal feelings or emotions in the traditional sense. However, I can assure you that my systems and processes are functioning well at this moment, allowing me to provide helpful responses for your queries.
How may I assist you today?
```
## Evaluation
```
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 5, batch_size: auto
|Tasks|Version| Filter |n-shot| Metric |Value | |Stderr|
|-----|-------|----------|-----:|-----------|-----:|---|-----:|
|gsm8k|Yaml |get-answer| 5|exact_match|0.6467|ยฑ |0.0132|
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 0, batch_size: auto (64)
| Tasks |Version|Filter|n-shot|Metric|Value | |Stderr|
|--------------|-------|------|-----:|------|-----:|---|-----:|
|truthfulqa_mc2|Yaml |none | 0|acc |0.7368|ยฑ |0.0149|
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 25, batch_size: auto (32)
| Tasks |Version|Filter|n-shot| Metric |Value| |Stderr|
|-------------|-------|------|-----:|--------|----:|---|-----:|
|arc_challenge|Yaml |none | 25|acc |0.692|ยฑ |0.0135|
| | |none | 25|acc_norm|0.715|ยฑ |0.0132|
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 0, batch_size: auto (64)
| Tasks |Version|Filter|n-shot|Metric| Value | |Stderr|
|-----------|-------|------|-----:|------|------:|---|-----:|
|paws_de |Yaml |none | 0|acc | 0.3965|ยฑ |0.0109|
|wmt16-en-de|Yaml |none | 0|bleu | 3.5784|ยฑ |0.1325|
| | |none | 0|ter |64.5707|ยฑ |0.4514|
| | |none | 0|chrf |45.7068|ยฑ |0.3861|
|xnli_de |Yaml |none | 0|acc | 0.4129|ยฑ |0.0099|
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 10, batch_size: auto (32)
| Tasks |Version|Filter|n-shot| Metric |Value | |Stderr|
|---------|-------|------|-----:|--------|-----:|---|-----:|
|hellaswag|Yaml |none | 10|acc |0.7131|ยฑ |0.0045|
| | |none | 10|acc_norm|0.8815|ยฑ |0.0032|
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 5, batch_size: auto (64)
| Tasks |Version|Filter|n-shot|Metric| Value | |Stderr|
|-----------|-------|------|-----:|------|------:|---|-----:|
|wmt16-de-en|Yaml |none | 5|bleu |14.9310|ยฑ |0.8014|
| | |none | 5|ter |46.3206|ยฑ |0.4087|
| | |none | 5|chrf |60.8637|ยฑ |0.4436|
|wmt16-en-de|Yaml |none | 5|bleu | 6.2016|ยฑ |0.2918|
| | |none | 5|ter |63.9997|ยฑ |0.4591|
| | |none | 5|chrf |51.1399|ยฑ |0.3978|
|xnli_de |Yaml |none | 5|acc | 0.4703|ยฑ |0.0100|
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct,dtype=float16), gen_kwargs: (), limit: None, num_fewshot: 5, batch_size: auto (16)
| Tasks |Version|Filter|n-shot|Metric|Value | |Stderr|
|---------------------------------------|-------|------|-----:|------|-----:|---|-----:|
|mmlu |N/A |none | 0|acc |0.6461|ยฑ |0.1215|
| - humanities |N/A |none | 5|acc |0.5960|ยฑ |0.1200|
| - formal_logic |Yaml |none | 5|acc |0.4683|ยฑ |0.0446|
| - high_school_european_history |Yaml |none | 5|acc |0.8121|ยฑ |0.0305|
| - high_school_us_history |Yaml |none | 5|acc |0.8480|ยฑ |0.0252|
| - high_school_world_history |Yaml |none | 5|acc |0.8312|ยฑ |0.0244|
| - international_law |Yaml |none | 5|acc |0.7851|ยฑ |0.0375|
| - jurisprudence |Yaml |none | 5|acc |0.7685|ยฑ |0.0408|
| - logical_fallacies |Yaml |none | 5|acc |0.7423|ยฑ |0.0344|
| - moral_disputes |Yaml |none | 5|acc |0.7283|ยฑ |0.0239|
| - moral_scenarios |Yaml |none | 5|acc |0.3899|ยฑ |0.0163|
| - philosophy |Yaml |none | 5|acc |0.7074|ยฑ |0.0258|
| - prehistory |Yaml |none | 5|acc |0.7716|ยฑ |0.0234|
| - professional_law |Yaml |none | 5|acc |0.4824|ยฑ |0.0128|
| - world_religions |Yaml |none | 5|acc |0.7661|ยฑ |0.0325|
| - other |N/A |none | 5|acc |0.7097|ยฑ |0.0900|
| - business_ethics |Yaml |none | 5|acc |0.7700|ยฑ |0.0423|
| - clinical_knowledge |Yaml |none | 5|acc |0.6792|ยฑ |0.0287|
| - college_medicine |Yaml |none | 5|acc |0.6647|ยฑ |0.0360|
| - global_facts |Yaml |none | 5|acc |0.3600|ยฑ |0.0482|
| - human_aging |Yaml |none | 5|acc |0.6861|ยฑ |0.0311|
| - management |Yaml |none | 5|acc |0.8350|ยฑ |0.0368|
| - marketing |Yaml |none | 5|acc |0.8504|ยฑ |0.0234|
| - medical_genetics |Yaml |none | 5|acc |0.6700|ยฑ |0.0473|
| - miscellaneous |Yaml |none | 5|acc |0.7893|ยฑ |0.0146|
| - nutrition |Yaml |none | 5|acc |0.7549|ยฑ |0.0246|
| - professional_accounting |Yaml |none | 5|acc |0.5213|ยฑ |0.0298|
| - professional_medicine |Yaml |none | 5|acc |0.7353|ยฑ |0.0268|
| - virology |Yaml |none | 5|acc |0.5783|ยฑ |0.0384|
| - social_sciences |N/A |none | 5|acc |0.7501|ยฑ |0.0684|
| - econometrics |Yaml |none | 5|acc |0.5175|ยฑ |0.0470|
| - high_school_geography |Yaml |none | 5|acc |0.8485|ยฑ |0.0255|
| - high_school_government_and_politics|Yaml |none | 5|acc |0.8912|ยฑ |0.0225|
| - high_school_macroeconomics |Yaml |none | 5|acc |0.6615|ยฑ |0.0240|
| - high_school_microeconomics |Yaml |none | 5|acc |0.7311|ยฑ |0.0288|
| - high_school_psychology |Yaml |none | 5|acc |0.8385|ยฑ |0.0158|
| - human_sexuality |Yaml |none | 5|acc |0.7023|ยฑ |0.0401|
| - professional_psychology |Yaml |none | 5|acc |0.6683|ยฑ |0.0190|
| - public_relations |Yaml |none | 5|acc |0.6909|ยฑ |0.0443|
| - security_studies |Yaml |none | 5|acc |0.7633|ยฑ |0.0272|
| - sociology |Yaml |none | 5|acc |0.8358|ยฑ |0.0262|
| - us_foreign_policy |Yaml |none | 5|acc |0.8800|ยฑ |0.0327|
| - stem |N/A |none | 5|acc |0.5569|ยฑ |0.1360|
| - abstract_algebra |Yaml |none | 5|acc |0.3800|ยฑ |0.0488|
| - anatomy |Yaml |none | 5|acc |0.6148|ยฑ |0.0420|
| - astronomy |Yaml |none | 5|acc |0.7237|ยฑ |0.0364|
| - college_biology |Yaml |none | 5|acc |0.7708|ยฑ |0.0351|
| - college_chemistry |Yaml |none | 5|acc |0.4600|ยฑ |0.0501|
| - college_computer_science |Yaml |none | 5|acc |0.5400|ยฑ |0.0501|
| - college_mathematics |Yaml |none | 5|acc |0.2700|ยฑ |0.0446|
| - college_physics |Yaml |none | 5|acc |0.3333|ยฑ |0.0469|
| - computer_security |Yaml |none | 5|acc |0.7300|ยฑ |0.0446|
| - conceptual_physics |Yaml |none | 5|acc |0.6213|ยฑ |0.0317|
| - electrical_engineering |Yaml |none | 5|acc |0.6276|ยฑ |0.0403|
| - elementary_mathematics |Yaml |none | 5|acc |0.4788|ยฑ |0.0257|
| - high_school_biology |Yaml |none | 5|acc |0.8065|ยฑ |0.0225|
| - high_school_chemistry |Yaml |none | 5|acc |0.5123|ยฑ |0.0352|
| - high_school_computer_science |Yaml |none | 5|acc |0.7000|ยฑ |0.0461|
| - high_school_mathematics |Yaml |none | 5|acc |0.3889|ยฑ |0.0297|
| - high_school_physics |Yaml |none | 5|acc |0.3576|ยฑ |0.0391|
| - high_school_statistics |Yaml |none | 5|acc |0.5926|ยฑ |0.0335|
| - machine_learning |Yaml |none | 5|acc |0.4554|ยฑ |0.0473|
| Groups |Version|Filter|n-shot|Metric|Value | |Stderr|
|------------------|-------|------|-----:|------|-----:|---|-----:|
|mmlu |N/A |none | 0|acc |0.6461|ยฑ |0.1215|
| - humanities |N/A |none | 5|acc |0.5960|ยฑ |0.1200|
| - other |N/A |none | 5|acc |0.7097|ยฑ |0.0900|
| - social_sciences|N/A |none | 5|acc |0.7501|ยฑ |0.0684|
| - stem |N/A |none | 5|acc |0.5569|ยฑ |0.1360|
```
### MT-Bench
```
########## Average ##########
score
model
gpt-4 8.990625
gpt-3.5-turbo 7.943750
claude-instant-v1 7.905660
claude-v1 7.900000
UNA-SOLAR-10.7B-Instruct-v1.0 7.521875
LUNA-SOLARkrautLM-Instruct 7.462500
vicuna-33b-v1.3 7.121875
wizardlm-30b 7.009375
Llama-2-70b-chat 6.856250
Llama-2-13b-chat 6.650000
guanaco-33b 6.528125
tulu-30b 6.434375
guanaco-65b 6.409375
oasst-sft-7-llama-30b 6.409375
palm-2-chat-bison-001 6.400000
mpt-30b-chat 6.393750
vicuna-13b-v1.3 6.387500
wizardlm-13b 6.353125
Llama-2-7b-chat 6.268750
vicuna-7b-v1.3 5.996875
baize-v2-13b 5.750000
nous-hermes-13b 5.553459
mpt-7b-chat 5.459119
gpt4all-13b-snoozy 5.452830
koala-13b 5.350000
mpt-30b-instruct 5.218750
falcon-40b-instruct 5.168750
h2ogpt-oasst-open-llama-13b 4.625000
alpaca-13b 4.531250
chatglm-6b 4.500000
oasst-sft-4-pythia-12b 4.318750
rwkv-4-raven-14b 3.984375
dolly-v2-12b 3.275000
fastchat-t5-3b 3.040625
stablelm-tuned-alpha-7b 2.753125
llama-13b 2.606250
```
## Disclaimer
We must inform users that despite our best efforts in data cleansing, the possibility of uncensored content slipping through cannot be entirely ruled out.
However, we cannot guarantee consistently appropriate behavior. Therefore, if you encounter any issues or come across inappropriate content, we kindly request that you inform us through the contact information provided.
Additionally, it is essential to understand that the licensing of these models does not constitute legal advice. We are not held responsible for the actions of third parties who utilize our models.
## Contact
If you are interested in customized LLMs for business applications, please get in contact with us via our website or contact us at [Dr. Daryoush Vaziri](mailto:[email protected]). We are also grateful for your feedback and suggestions.
## Collaborations
We are also keenly seeking support and investment for our startup, [VAGO Solutions](https://huggingface.co/VAGOsolutions), where we continuously advance the development of robust language models designed to address a diverse range of purposes and requirements. If the prospect of collaboratively navigating future challenges excites you, we warmly invite you to reach out to us.
[Juanako.AI](https://huggingface.co/fblgit) is also seeking support and investment for its startup, and we are open to collaborating with other labs to make awesome models like this one.
## Acknowledgement
A big hug to [VAGO Solutions](https://huggingface.co/VAGOsolutions); we merely used our UNA transformers library on their code and dataset, nothing else. This wouldn't have been possible without them, thanks!
Many thanks to [argilla](https://huggingface.co/datasets/argilla) and [Huggingface](https://huggingface.co) for providing such valuable datasets to the Open-Source community. And of course a big thanks to [upstage](https://huggingface.co/upstage) for providing the open source community with their latest technology!
|
TheBloke/Mixtral_7Bx2_MoE-GPTQ | TheBloke | 2023-12-23T13:34:34Z | 37 | 8 | transformers | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"base_model:cloudyu/Mixtral_7Bx2_MoE",
"base_model:quantized:cloudyu/Mixtral_7Bx2_MoE",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"gptq",
"region:us"
] | text-generation | 2023-12-23T12:08:27Z | ---
base_model: cloudyu/Mixtral_7Bx2_MoE
inference: false
license: cc-by-nc-4.0
model_creator: hai
model_name: Mixtral 7Bx2 MoE
model_type: mixtral
prompt_template: '{prompt}
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Mixtral 7Bx2 MoE - GPTQ
- Model creator: [hai](https://huggingface.co/cloudyu)
- Original model: [Mixtral 7Bx2 MoE](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE)
<!-- description start -->
# Description
This repo contains GPTQ model files for [hai's Mixtral 7Bx2 MoE](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF)
* [hai's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Unknown
```
{prompt}
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-compatible clients start -->
## Known compatible clients / servers
GPTQ models are currently supported on Linux (NVidia/AMD) and Windows (NVidia only). macOS users: please use GGUF models.
These GPTQ models are known to work in the following inference servers/webuis.
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [KoboldAI United](https://github.com/henk717/koboldai)
- [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui)
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
This may not be a complete list; if you know of others, please let me know!
<!-- README_GPTQ.md-compatible clients end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" means no grouping at all, which uses the least VRAM but gives the lowest quantisation accuracy.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama and Mistral models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 7.09 GB | No | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 7.83 GB | No | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 13.16 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 13.45 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 14.34 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 7.34 GB | No | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/Mixtral_7Bx2_MoE-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/Mixtral_7Bx2_MoE-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `Mixtral_7Bx2_MoE-GPTQ`:
```shell
mkdir Mixtral_7Bx2_MoE-GPTQ
huggingface-cli download TheBloke/Mixtral_7Bx2_MoE-GPTQ --local-dir Mixtral_7Bx2_MoE-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir Mixtral_7Bx2_MoE-GPTQ
huggingface-cli download TheBloke/Mixtral_7Bx2_MoE-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir Mixtral_7Bx2_MoE-GPTQ --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows for interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder and it's harder to know where your disk space is being used, and to clear it up if/when you want to remove a downloaded model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir Mixtral_7Bx2_MoE-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/Mixtral_7Bx2_MoE-GPTQ --local-dir Mixtral_7Bx2_MoE-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space, because it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob).
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Mixtral_7Bx2_MoE-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Mixtral_7Bx2_MoE-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Mixtral_7Bx2_MoE-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
- Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/Mixtral_7Bx2_MoE-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''{prompt}
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt_template,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## Python code example: inference from this GPTQ model
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install --upgrade transformers optimum
# If using PyTorch 2.1 + CUDA 12.x:
pip3 install --upgrade auto-gptq
# or, if using PyTorch 2.1 + CUDA 11.x:
pip3 install --upgrade auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
```
If you are using PyTorch 2.0, you will need to install AutoGPTQ from source. Likewise if you have problems with the pre-built wheels, you should try building from source:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.5.1
pip3 install .
```
### Example Python code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Mixtral_7Bx2_MoE-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Write a story about llamas"
system_message = "You are a story writing assistant"
prompt_template=f'''{prompt}
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with Transformers. For non-Mistral models, AutoGPTQ can also be used directly.
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama architecture models (including Mistral, Yi, DeepSeek, SOLAR, etc) in 4-bit. Please see the Provided Files table above for per-file compatibility.
For a list of clients/servers, please see "Known compatible clients / servers", above.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donators will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, 阿明, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjäreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: hai's Mixtral 7Bx2 MoE
# Mixtral MOE 2x7B
MoE of the following models :
* [rwitz2/go-bruins-v2.1.1](https://huggingface.co/rwitz2/go-bruins-v2.1.1)
* [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
* [mncai/mistral-7b-dpo-v6](https://huggingface.co/mncai/mistral-7b-dpo-v6)
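Conceptually, a model like this routes every token through a small gating network that picks which expert MLPs to use and mixes their outputs. The toy module below illustrates that top-k routing idea only; it is not the code used to assemble this checkpoint:

```python
import torch
import torch.nn as nn

class TwoExpertMoE(nn.Module):
    """Toy Mixtral-style MoE layer: a gate scores experts per token, the top-k are mixed."""
    def __init__(self, hidden=16, num_experts=2, top_k=2):
        super().__init__()
        self.gate = nn.Linear(hidden, num_experts, bias=False)
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(hidden, 4 * hidden), nn.SiLU(), nn.Linear(4 * hidden, hidden))
             for _ in range(num_experts)]
        )
        self.top_k = top_k

    def forward(self, x):                              # x: [tokens, hidden]
        scores = self.gate(x)                          # [tokens, num_experts]
        weights, idx = torch.topk(scores, self.top_k, dim=-1)
        weights = torch.softmax(weights, dim=-1)       # mixing weights for the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e               # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(1) * expert(x[mask])
        return out

moe = TwoExpertMoE()
print(moe(torch.randn(5, 16)).shape)   # torch.Size([5, 16])
```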
GPU code example:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
import math

## v2 models
model_path = "cloudyu/Mixtral_7Bx2_MoE"

tokenizer = AutoTokenizer.from_pretrained(model_path, use_default_system_prompt=False)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float32, device_map='auto', local_files_only=False, load_in_4bit=True
)
print(model)

prompt = input("please input prompt:")
while len(prompt) > 0:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to("cuda")
    generation_output = model.generate(
        input_ids=input_ids, max_new_tokens=500, repetition_penalty=1.2
    )
    print(tokenizer.decode(generation_output[0]))
    prompt = input("please input prompt:")
```
CPU code example:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
import math

## v2 models
model_path = "cloudyu/Mixtral_7Bx2_MoE"

tokenizer = AutoTokenizer.from_pretrained(model_path, use_default_system_prompt=False)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float32, device_map='cpu', local_files_only=False
)
print(model)

prompt = input("please input prompt:")
while len(prompt) > 0:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    generation_output = model.generate(
        input_ids=input_ids, max_new_tokens=500, repetition_penalty=1.2
    )
    print(tokenizer.decode(generation_output[0]))
    prompt = input("please input prompt:")
```
|
Someman/xlm-roberta-base-finetuned-wikiann-hi | Someman | 2023-12-23T13:32:11Z | 10 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"dataset:wikiann",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2022-07-27T13:32:38Z | ---
license: mit
tags:
- generated_from_trainer
datasets:
- wikiann
metrics:
- f1
base_model: xlm-roberta-base
model-index:
- name: xlm-roberta-base-finetuned-wikiann-hi
results:
- task:
type: token-classification
name: Token Classification
dataset:
name: wikiann
type: wikiann
args: hi
metrics:
- type: f1
value: 1.0
name: F1
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-wikiann-hi
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the wikiann dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3097
- F1: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
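These settings correspond roughly to a `TrainingArguments` configuration like the one below; this is a sketch for readers, not the exact training script:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-wikiann-hi",
    learning_rate=5e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```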
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 0.5689 | 1.0 | 209 | 0.3179 | 1.0 |
| 0.2718 | 2.0 | 418 | 0.2733 | 1.0 |
| 0.19 | 3.0 | 627 | 0.2560 | 1.0 |
| 0.142 | 4.0 | 836 | 0.2736 | 1.0 |
| 0.0967 | 5.0 | 1045 | 0.2686 | 1.0 |
| 0.0668 | 6.0 | 1254 | 0.2966 | 1.0 |
| 0.052 | 7.0 | 1463 | 0.3194 | 1.0 |
| 0.0369 | 8.0 | 1672 | 0.3034 | 1.0 |
| 0.0236 | 9.0 | 1881 | 0.3174 | 1.0 |
| 0.0135 | 10.0 | 2090 | 0.3097 | 1.0 |
### Framework versions
- Transformers 4.20.1
- Pytorch 1.12.0+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1
|
csujeong/falcon-7b-sharded-bf16-finetuned-financial | csujeong | 2023-12-23T13:19:37Z | 4 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:tiiuae/falcon-7b",
"base_model:adapter:tiiuae/falcon-7b",
"license:apache-2.0",
"region:us"
] | null | 2023-12-23T02:49:08Z | ---
license: apache-2.0
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
base_model: tiiuae/falcon-7b
model-index:
- name: falcon-7b-sharded-bf16-finetuned-financial
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# falcon-7b-sharded-bf16-finetuned-financial
This model is a fine-tuned version of [tiiuae/falcon-7b](https://huggingface.co/tiiuae/falcon-7b) on a csujeong/FinancialStockTerms dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- training_steps: 80
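For context, a LoRA/SFT run with these settings typically looks like the sketch below. The LoRA rank, target modules and dataset column name are assumptions, not values taken from this card:

```python
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

model_id = "tiiuae/falcon-7b"
dataset = load_dataset("csujeong/FinancialStockTerms", split="train")   # split name assumed

model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

peft_config = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                         target_modules=["query_key_value"],   # usual choice for Falcon attention
                         task_type="CAUSAL_LM")

args = TrainingArguments(
    output_dir="falcon-7b-sharded-bf16-finetuned-financial",
    per_device_train_batch_size=2,
    gradient_accumulation_steps=2,
    learning_rate=2e-4,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    max_steps=80,
    seed=42,
)

trainer = SFTTrainer(
    model=model,
    args=args,
    train_dataset=dataset,
    peft_config=peft_config,
    tokenizer=tokenizer,
    dataset_text_field="text",   # assumed column name
)
trainer.train()
```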
### Training results
### Framework versions
- PEFT 0.7.2.dev0
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0 |
heegyu/Synatra-7B-v0.3-Translation-glaive | heegyu | 2023-12-23T13:09:49Z | 0 | 0 | null | [
"dataset:heegyu/glaive-function-calling-v2-ko-mt",
"region:us"
] | null | 2023-12-23T08:25:03Z | ---
datasets:
- heegyu/glaive-function-calling-v2-ko-mt
---
- The [maywell/Synatra-7B-v0.3-Translation](https://huggingface.co/maywell/Synatra-7B-v0.3-Translation) model has limitations when translating long, multi-line text that contains program code, so that part was additionally trained with a LoRA adapter.
### Usage example
````
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
device = "cuda:0" if torch.cuda.is_available() else "cpu"
model_id = "maywell/Synatra-7B-v0.3-Translation"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map=device, torch_dtype=torch.float16).eval()

# Load the LoRA adapter
model.load_adapter("heegyu/Synatra-7B-v0.3-Translation-glaive")
def generate(prompt, *messages):
    messages = [
        {
            "role": "system",
            "content": prompt.strip(),
        },
        *[{"role": "user" if i % 2 == 0 else "assistant", "content": m.strip()} for i, m in enumerate(messages)],
    ]
    inputs = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt").to(device)
    outs = model.generate(inputs, do_sample=True, max_new_tokens=256, early_stopping=True)
    print(tokenizer.batch_decode(outs)[0])
generate(
"๋งํฌ๋ค์ด์ผ๋ก ์์ฑ๋ ์์ด ๋ํ๋ฅผ ํ๊ตญ์ด๋ก ๋ฒ์ญํ์ธ์. ํ๋ก๊ทธ๋จ ์ฝ๋๋ ๋ฒ์ญํ๋ฉด ์๋ฉ๋๋ค.",
"""
### User:
Given a list ["foo", "bar", "baz"] and an item in the list "bar", how do I get its index 1?
### Assistant:
```python
>>> ["foo", "bar", "baz"].index("bar")
1
```
See the documentation for the built-in .index() method of the list:
list.index(x[, start[, end]])
Return zero-based index in the list of the first item whose value is equal to x. Raises a ValueError if there is no such item.
The optional arguments start and end are interpreted as in the slice notation and are used to limit the search to a particular subsequence of the list. The returned index is computed relative to the beginning of the full sequence rather than the start argument.
"""
)
````
Execution result:
````
<|im_start|> system
마크다운으로 작성된 영어 대화를 한국어로 번역하세요. 프로그램 코드는 번역하면 안됩니다.<|im_end|>
<|im_start|> user
### User:
Given a list ["foo", "bar", "baz"] and an item in the list "bar", how do I get its index 1?
### Assistant:
```python
>>> ["foo", "bar", "baz"].index("bar")
1
```
See the documentation for the built-in .index() method of the list:
list.index(x[, start[, end]])
Return zero-based index in the list of the first item whose value is equal to x. Raises a ValueError if there is no such item.
The optional arguments start and end are interpreted as in the slice notation and are used to limit the search to a particular subsequence of the list. The returned index is computed relative to the beginning of the full sequence rather than the start argument.<|im_end|>
<|im_start|> assistant
### User:
"foo", "bar", "baz" ๋ฆฌ์คํธ๊ฐ ์๊ณ ๋ฆฌ์คํธ์์ "bar"๋ผ๋ ํญ๋ชฉ์ด ์๋ค๋ฉด, ๊ทธ ์ธ๋ฑ์ค 1์ ์ด๋ป๊ฒ ๊ฐ์ ธ์ฌ ์ ์์๊น์?
### Assistant:
```python
>>> ["foo", "bar", "baz"].index("bar")
1
```
๋ฆฌ์คํธ์ ๋ด์ฅ๋ .index() ๋ฉ์๋์ ๋ํ ๋ฌธ์๋ฅผ ์ฐธ์กฐํ์ธ์:
list.index(x[, start[, end]])
๊ฐ์ด x์ ๊ฐ์ ์ฒซ ๋ฒ์งธ ํญ๋ชฉ์ 0 ๊ธฐ๋ฐ ์ธ๋ฑ์ค๋ฅผ ๋ฐํํฉ๋๋ค. ๊ทธ๋ฌํ ํญ๋ชฉ์ด ์๋ ๊ฒฝ์ฐ ValueError ๊ฐ ๋ฐ์ํฉ๋๋ค.
์ ํ์ ์ธ ์ธ์ start์ end๋ ์ฌ๋ผ์ด์ค ํ๊ธฐ๋ฒ์์์ ์๋ณ์ ํด๋นํ๋ฉฐ ๋ฆฌ์คํธ์ ํน์ ํ์ ์ํ์ค๋ก ๊ฒ์์ ์ ํํ๋ ๋ฐ ์ฌ์ฉ๋ฉ๋๋ค. ๋ฐํ๋ ์ธ๋ฑ์ค๋ ์์ ์ธ์๊ฐ ์๋ ์ ์ฒด ์ํ์ค์ ์์์ ๊ธฐ์ค์ผ๋ก ๊ณ์ฐ๋ฉ๋๋ค.<|im_end|>
````
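If a standalone checkpoint is preferred over loading the adapter at runtime, the LoRA weights can be folded into the base model with peft; a sketch (the output path is arbitrary):

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "maywell/Synatra-7B-v0.3-Translation"
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Attach the LoRA adapter, then fold its weights into the base model.
merged = PeftModel.from_pretrained(base, "heegyu/Synatra-7B-v0.3-Translation-glaive").merge_and_unload()
merged.save_pretrained("Synatra-7B-Translation-glaive-merged")      # arbitrary output directory
tokenizer.save_pretrained("Synatra-7B-Translation-glaive-merged")
```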
|
Gummybear05/whisper-small-ko-E10_Yfreq-SA | Gummybear05 | 2023-12-23T13:08:38Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"generated_from_trainer",
"hi",
"dataset:aihub_elder",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-12-23T10:58:45Z | ---
language:
- hi
license: apache-2.0
base_model: openai/whisper-small
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- aihub_elder
model-index:
- name: whisper-small-ko-E10_Yfreq-SA
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-ko-E10_Yfreq-SA
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the aihub elder over 70 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2060
- Cer: 5.8917
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 2
- mixed_precision_training: Native AMP
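For reference, the listed hyperparameters map roughly onto a `Seq2SeqTrainingArguments` configuration like the sketch below; the output directory is an assumption, and the evaluation/saving cadence is not recorded in the card:
```python
from transformers import Seq2SeqTrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
# output_dir is an illustrative assumption, not taken from the card.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-ko-E10_Yfreq-SA",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # total train batch size 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=2,
    fp16=True,                      # "Native AMP" mixed precision
)
```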
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.3564 | 0.13 | 100 | 0.2919 | 7.1898 |
| 0.2354 | 0.26 | 200 | 0.2478 | 6.7023 |
| 0.21 | 0.39 | 300 | 0.2349 | 7.3191 |
| 0.1999 | 0.52 | 400 | 0.2270 | 7.0665 |
| 0.1883 | 0.64 | 500 | 0.2227 | 6.8961 |
| 0.1844 | 0.77 | 600 | 0.2195 | 6.4027 |
| 0.1631 | 0.9 | 700 | 0.2156 | 6.1560 |
| 0.0977 | 1.03 | 800 | 0.2142 | 6.0738 |
| 0.087 | 1.16 | 900 | 0.2144 | 6.0385 |
| 0.0985 | 1.29 | 1000 | 0.2119 | 6.0033 |
| 0.0763 | 1.42 | 1100 | 0.2110 | 5.9034 |
| 0.0906 | 1.55 | 1200 | 0.2088 | 5.8741 |
| 0.0922 | 1.68 | 1300 | 0.2066 | 5.8564 |
| 0.079 | 1.81 | 1400 | 0.2060 | 5.8623 |
| 0.0771 | 1.93 | 1500 | 0.2060 | 5.8917 |
### Framework versions
- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Gummybear05/whisper-small-ko-E10_Yfreq | Gummybear05 | 2023-12-23T13:03:05Z | 3 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"generated_from_trainer",
"hi",
"dataset:aihub_elder",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-12-23T10:52:54Z | ---
language:
- hi
license: apache-2.0
base_model: openai/whisper-small
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- aihub_elder
model-index:
- name: whisper-small-ko-E10_Yfreq
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-ko-E10_Yfreq
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the aihub elder over 70 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2085
- Cer: 6.3029
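No usage snippet is provided; a lower-level sketch with the Whisper processor and model (rather than the `pipeline` helper) might look like this — the silent dummy waveform is a stand-in for real 16 kHz mono Korean speech:
```python
import numpy as np
import torch
from transformers import WhisperProcessor, WhisperForConditionalGeneration

model_id = "Gummybear05/whisper-small-ko-E10_Yfreq"
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Replace this one-second silent clip with a real 16 kHz mono waveform.
audio = np.zeros(16_000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    predicted_ids = model.generate(inputs.input_features)

print(processor.batch_decode(predicted_ids, skip_special_tokens=True)[0])
```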
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 2
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2909 | 0.13 | 100 | 0.2830 | 7.4307 |
| 0.1788 | 0.26 | 200 | 0.2478 | 6.5378 |
| 0.1644 | 0.39 | 300 | 0.2375 | 6.4967 |
| 0.1614 | 0.52 | 400 | 0.2265 | 6.3675 |
| 0.1458 | 0.64 | 500 | 0.2243 | 6.1971 |
| 0.1368 | 0.77 | 600 | 0.2217 | 7.0665 |
| 0.1226 | 0.9 | 700 | 0.2216 | 6.3029 |
| 0.0553 | 1.03 | 800 | 0.2162 | 5.9563 |
| 0.0499 | 1.16 | 900 | 0.2187 | 5.9680 |
| 0.0597 | 1.29 | 1000 | 0.2153 | 5.9211 |
| 0.0456 | 1.42 | 1100 | 0.2121 | 6.5789 |
| 0.0495 | 1.55 | 1200 | 0.2128 | 6.6024 |
| 0.0558 | 1.68 | 1300 | 0.2095 | 6.3675 |
| 0.044 | 1.81 | 1400 | 0.2081 | 6.3969 |
| 0.0424 | 1.93 | 1500 | 0.2085 | 6.3029 |
### Framework versions
- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
A2H0H0R1/mobilenet_v2_1.0_224-plant-disease-new | A2H0H0R1 | 2023-12-23T12:54:29Z | 8 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"mobilenet_v2",
"image-classification",
"generated_from_trainer",
"dataset:A2H0H0R1/plant-disease-new",
"base_model:google/mobilenet_v2_1.0_224",
"base_model:finetune:google/mobilenet_v2_1.0_224",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2023-12-22T18:47:44Z | ---
license: other
base_model: google/mobilenet_v2_1.0_224
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: mobilenet_v2_1.0_224-plant-disease-new
results: []
datasets:
- A2H0H0R1/plant-disease-new
pipeline_tag: image-classification
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mobilenet_v2_1.0_224-plant-disease-new
This model is a fine-tuned version of [google/mobilenet_v2_1.0_224](https://huggingface.co/google/mobilenet_v2_1.0_224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1287
- Accuracy: 0.9600
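As a usage sketch (not part of the original card), the checkpoint can be tried with the Transformers image-classification `pipeline`; the image path below is a placeholder:
```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="A2H0H0R1/mobilenet_v2_1.0_224-plant-disease-new",
)

# Classify a leaf photo (placeholder path) and print the top predictions.
for prediction in classifier("leaf.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```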
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 100
- eval_batch_size: 100
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 400
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5043 | 1.0 | 366 | 0.4476 | 0.8886 |
| 0.2492 | 2.0 | 733 | 0.2550 | 0.9281 |
| 0.2069 | 3.0 | 1100 | 0.2332 | 0.9247 |
| 0.1716 | 4.0 | 1467 | 0.3329 | 0.8960 |
| 0.1602 | 5.0 | 1833 | 0.1999 | 0.9388 |
| 0.1633 | 5.99 | 2196 | 0.1287 | 0.9600 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0 |
TheBloke/Valkyrie-V1-AWQ | TheBloke | 2023-12-23T12:49:40Z | 5 | 1 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"base_model:cookinai/Valkyrie-V1",
"base_model:quantized:cookinai/Valkyrie-V1",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"awq",
"region:us"
] | text-generation | 2023-12-23T12:31:57Z | ---
base_model: cookinai/Valkyrie-V1
inference: false
license: apache-2.0
model_creator: John Smith
model_name: Valkyrie v1
model_type: mistral
prompt_template: '{prompt}
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Valkyrie v1 - AWQ
- Model creator: [John Smith](https://huggingface.co/cookinai)
- Original model: [Valkyrie v1](https://huggingface.co/cookinai/Valkyrie-V1)
<!-- description start -->
## Description
This repo contains AWQ model files for [John Smith's Valkyrie v1](https://huggingface.co/cookinai/Valkyrie-V1).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
### About AWQ
AWQ is an efficient, accurate and blazing-fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ, it offers faster Transformers-based inference with equivalent or better quality compared to the most commonly used GPTQ settings.
AWQ models are currently supported on Linux and Windows, with NVidia GPUs only. macOS users: please use GGUF models instead.
It is supported by:
- [Text Generation Webui](https://github.com/oobabooga/text-generation-webui) - using Loader: AutoAWQ
- [vLLM](https://github.com/vllm-project/vllm) - version 0.2.2 or later for support for all model types.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later, from any code or client that supports Transformers
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) - for use from Python code
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Valkyrie-V1-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Valkyrie-V1-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF)
* [John Smith's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/cookinai/Valkyrie-V1)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Unknown
```
{prompt}
```
<!-- prompt-template end -->
<!-- README_AWQ.md-provided-files start -->
## Provided files, and AWQ parameters
I currently release 128g GEMM models only. The addition of group_size 32 models, and GEMV kernel models, is being actively considered.
Models are released as sharded safetensors files.
| Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
| ------ | ---- | -- | ----------- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Valkyrie-V1-AWQ/tree/main) | 4 | 128 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.15 GB
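For context, 4-bit / 128-group-size GEMM checkpoints like the one in the table above are typically produced with AutoAWQ along the lines of the sketch below; the paths and calibration defaults are assumptions, not the exact pipeline used for this repo:
```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "cookinai/Valkyrie-V1"   # source fp16 model
quant_path = "Valkyrie-V1-AWQ"        # output directory (illustrative)
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Quantise using AutoAWQ's default calibration set, then save the result.
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```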
<!-- README_AWQ.md-provided-files end -->
<!-- README_AWQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Valkyrie-V1-AWQ`.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Valkyrie-V1-AWQ`
7. Select **Loader: AutoAWQ**.
8. Click Load, and the model will load and is now ready for use.
9. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
10. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_AWQ.md-text-generation-webui end -->
<!-- README_AWQ.md-use-from-vllm start -->
## Multi-user inference server: vLLM
Documentation on installing and using vLLM [can be found here](https://vllm.readthedocs.io/en/latest/).
- Please ensure you are using vLLM version 0.2 or later.
- When using vLLM as a server, pass the `--quantization awq` parameter.
For example:
```shell
python3 -m vllm.entrypoints.api_server --model TheBloke/Valkyrie-V1-AWQ --quantization awq --dtype auto
```
- When using vLLM from Python code, again set `quantization=awq`.
For example:
```python
from vllm import LLM, SamplingParams
prompts = [
"Tell me about AI",
"Write a story about llamas",
"What is 291 - 150?",
"How much wood would a woodchuck chuck if a woodchuck could chuck wood?",
]
prompt_template='''{prompt}
'''
prompts = [prompt_template.format(prompt=prompt) for prompt in prompts]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model="TheBloke/Valkyrie-V1-AWQ", quantization="awq", dtype="auto")
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
<!-- README_AWQ.md-use-from-vllm start -->
<!-- README_AWQ.md-use-from-tgi start -->
## Multi-user inference server: Hugging Face Text Generation Inference (TGI)
Use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/Valkyrie-V1-AWQ --port 3000 --quantize awq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires [huggingface-hub](https://github.com/huggingface/huggingface_hub) 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''{prompt}
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: ", response)
```
<!-- README_AWQ.md-use-from-tgi end -->
<!-- README_AWQ.md-use-from-python start -->
## Inference from Python code using Transformers
### Install the necessary packages
- Requires: [Transformers](https://huggingface.co/docs/transformers) 4.35.0 or later.
- Requires: [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) 0.1.6 or later.
```shell
pip3 install --upgrade "autoawq>=0.1.6" "transformers>=4.35.0"
```
Note that if you are using PyTorch 2.0.1, the above AutoAWQ command will automatically upgrade you to PyTorch 2.1.0.
If you are using CUDA 11.8 and wish to continue using PyTorch 2.0.1, instead run this command:
```shell
pip3 install https://github.com/casper-hansen/AutoAWQ/releases/download/v0.1.6/autoawq-0.1.6+cu118-cp310-cp310-linux_x86_64.whl
```
If you have problems installing [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y autoawq
git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
```
### Transformers example code (requires Transformers 4.35.0 and later)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer
model_name_or_path = "TheBloke/Valkyrie-V1-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoModelForCausalLM.from_pretrained(
model_name_or_path,
low_cpu_mem_usage=True,
device_map="cuda:0"
)
# Using the text streamer to stream output one token at a time
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
prompt = "Tell me about AI"
prompt_template=f'''{prompt}
'''
# Convert prompt to tokens
tokens = tokenizer(
prompt_template,
return_tensors='pt'
).input_ids.cuda()
generation_params = {
"do_sample": True,
"temperature": 0.7,
"top_p": 0.95,
"top_k": 40,
"max_new_tokens": 512,
"repetition_penalty": 1.1
}
# Generate streamed output, visible one token at a time
generation_output = model.generate(
tokens,
streamer=streamer,
**generation_params
)
# Generation without a streamer, which will include the prompt in the output
generation_output = model.generate(
tokens,
**generation_params
)
# Get the tokens from the output, decode them, print them
token_output = generation_output[0]
text_output = tokenizer.decode(token_output)
print("model.generate output: ", text_output)
# Inference is also possible via Transformers' pipeline
from transformers import pipeline
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
**generation_params
)
pipe_output = pipe(prompt_template)[0]['generated_text']
print("pipeline output: ", pipe_output)
```
<!-- README_AWQ.md-use-from-python end -->
<!-- README_AWQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with:
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui) using `Loader: AutoAWQ`.
- [vLLM](https://github.com/vllm-project/vllm) version 0.2.0 and later.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) version 1.1.0 and later.
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later.
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) version 0.1.1 and later.
<!-- README_AWQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: John Smith's Valkyrie v1
Slerp merge of mindy-labs/mindy-7b-v2 with jondurbin/bagel-dpo-7b-v0.1. This model was then slerp merged with rishiraj/CatPPT.
Heard some talk of jondurbin/bagel-dpo-7b-v0.1 in the community and it sounds interesting. Merged it with two high-performing models to get cookinai/Valkyrie-V1.
Slerp 1:
```yaml
slices:
- sources:
- model: jondurbin/bagel-dpo-7b-v0.1
layer_range: [0, 32]
- model: mindy-labs/mindy-7b-v2
layer_range: [0, 32]
merge_method: slerp
base_model: mindy-labs/mindy-7b-v2
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
Slerp 2:
```yaml
slices:
- sources:
- model: previous/model/path
layer_range: [0, 32]
- model: rishiraj/CatPPT
layer_range: [0, 32]
merge_method: slerp
base_model: previous/model/path
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
|
yijisuk/segformer-b1-finetuned-segments-ic-chip-sample | yijisuk | 2023-12-23T12:43:35Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"base_model:nvidia/mit-b1",
"base_model:finetune:nvidia/mit-b1",
"license:other",
"endpoints_compatible",
"region:us"
] | image-segmentation | 2023-12-23T11:58:50Z | ---
license: other
base_model: nvidia/mit-b1
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b1-finetuned-segments-ic-chip-sample
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b1-finetuned-segments-ic-chip-sample
This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the yijisuk/ic-chip-sample dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1227
- Mean Iou: 0.4744
- Mean Accuracy: 0.9489
- Overall Accuracy: 0.9489
- Accuracy Unlabeled: nan
- Accuracy Circuit: 0.9489
- Iou Unlabeled: 0.0
- Iou Circuit: 0.9489
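As a usage sketch (not included in the original card), the checkpoint can be run through the Transformers image-segmentation `pipeline`; the image path is a placeholder for an IC-chip image:
```python
from transformers import pipeline

segmenter = pipeline(
    "image-segmentation",
    model="yijisuk/segformer-b1-finetuned-segments-ic-chip-sample",
)

# Each result contains a class label and a PIL mask for that class.
for segment in segmenter("ic_chip.png"):
    print(segment["label"], segment["mask"].size)
```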
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
| 0.4185 | 1.0 | 20 | 0.5878 | 0.3632 | 0.7265 | 0.7265 | nan | 0.7265 | 0.0 | 0.7265 |
| 0.4477 | 2.0 | 40 | 0.4288 | 0.4894 | 0.9788 | 0.9788 | nan | 0.9788 | 0.0 | 0.9788 |
| 0.9304 | 3.0 | 60 | 0.2053 | 0.4520 | 0.9041 | 0.9041 | nan | 0.9041 | 0.0 | 0.9041 |
| 0.1409 | 4.0 | 80 | 0.1817 | 0.4738 | 0.9477 | 0.9477 | nan | 0.9477 | 0.0 | 0.9477 |
| 0.392 | 5.0 | 100 | 0.1824 | 0.4900 | 0.9800 | 0.9800 | nan | 0.9800 | 0.0 | 0.9800 |
| 0.1589 | 6.0 | 120 | 0.1594 | 0.4814 | 0.9628 | 0.9628 | nan | 0.9628 | 0.0 | 0.9628 |
| 0.1848 | 7.0 | 140 | 0.1551 | 0.4625 | 0.9251 | 0.9251 | nan | 0.9251 | 0.0 | 0.9251 |
| 0.0874 | 8.0 | 160 | 0.1503 | 0.4829 | 0.9657 | 0.9657 | nan | 0.9657 | 0.0 | 0.9657 |
| 0.2172 | 9.0 | 180 | 0.1558 | 0.4591 | 0.9182 | 0.9182 | nan | 0.9182 | 0.0 | 0.9182 |
| 0.9914 | 10.0 | 200 | 0.1457 | 0.4698 | 0.9396 | 0.9396 | nan | 0.9396 | 0.0 | 0.9396 |
| 0.2387 | 11.0 | 220 | 0.1494 | 0.4709 | 0.9419 | 0.9419 | nan | 0.9419 | 0.0 | 0.9419 |
| 0.1242 | 12.0 | 240 | 0.1463 | 0.4743 | 0.9486 | 0.9486 | nan | 0.9486 | 0.0 | 0.9486 |
| 0.0819 | 13.0 | 260 | 0.1492 | 0.4757 | 0.9515 | 0.9515 | nan | 0.9515 | 0.0 | 0.9515 |
| 0.6077 | 14.0 | 280 | 0.1442 | 0.4793 | 0.9586 | 0.9586 | nan | 0.9586 | 0.0 | 0.9586 |
| 0.3156 | 15.0 | 300 | 0.1430 | 0.4813 | 0.9627 | 0.9627 | nan | 0.9627 | 0.0 | 0.9627 |
| 0.2564 | 16.0 | 320 | 0.1483 | 0.4673 | 0.9347 | 0.9347 | nan | 0.9347 | 0.0 | 0.9347 |
| 0.107 | 17.0 | 340 | 0.1467 | 0.4695 | 0.9390 | 0.9390 | nan | 0.9390 | 0.0 | 0.9390 |
| 1.1592 | 18.0 | 360 | 0.1437 | 0.4814 | 0.9628 | 0.9628 | nan | 0.9628 | 0.0 | 0.9628 |
| 0.0586 | 19.0 | 380 | 0.1396 | 0.4811 | 0.9622 | 0.9622 | nan | 0.9622 | 0.0 | 0.9622 |
| 0.9815 | 20.0 | 400 | 0.1399 | 0.4812 | 0.9624 | 0.9624 | nan | 0.9624 | 0.0 | 0.9624 |
| 0.3101 | 21.0 | 420 | 0.1411 | 0.4836 | 0.9672 | 0.9672 | nan | 0.9672 | 0.0 | 0.9672 |
| 0.2325 | 22.0 | 440 | 0.1395 | 0.4672 | 0.9344 | 0.9344 | nan | 0.9344 | 0.0 | 0.9344 |
| 0.1504 | 23.0 | 460 | 0.1420 | 0.4720 | 0.9441 | 0.9441 | nan | 0.9441 | 0.0 | 0.9441 |
| 0.2831 | 24.0 | 480 | 0.1393 | 0.4697 | 0.9395 | 0.9395 | nan | 0.9395 | 0.0 | 0.9395 |
| 0.0921 | 25.0 | 500 | 0.1418 | 0.4701 | 0.9401 | 0.9401 | nan | 0.9401 | 0.0 | 0.9401 |
| 0.141 | 26.0 | 520 | 0.1318 | 0.4648 | 0.9296 | 0.9296 | nan | 0.9296 | 0.0 | 0.9296 |
| 0.1381 | 27.0 | 540 | 0.1316 | 0.4697 | 0.9395 | 0.9395 | nan | 0.9395 | 0.0 | 0.9395 |
| 1.1864 | 28.0 | 560 | 0.1292 | 0.4774 | 0.9548 | 0.9548 | nan | 0.9548 | 0.0 | 0.9548 |
| 0.9492 | 29.0 | 580 | 0.1290 | 0.4709 | 0.9418 | 0.9418 | nan | 0.9418 | 0.0 | 0.9418 |
| 0.3061 | 30.0 | 600 | 0.1303 | 0.4536 | 0.9071 | 0.9071 | nan | 0.9071 | 0.0 | 0.9071 |
| 0.2511 | 31.0 | 620 | 0.1318 | 0.4725 | 0.9451 | 0.9451 | nan | 0.9451 | 0.0 | 0.9451 |
| 0.2706 | 32.0 | 640 | 0.1284 | 0.4790 | 0.9580 | 0.9580 | nan | 0.9580 | 0.0 | 0.9580 |
| 0.1508 | 33.0 | 660 | 0.1264 | 0.4698 | 0.9396 | 0.9396 | nan | 0.9396 | 0.0 | 0.9396 |
| 0.2802 | 34.0 | 680 | 0.1308 | 0.4733 | 0.9467 | 0.9467 | nan | 0.9467 | 0.0 | 0.9467 |
| 0.1897 | 35.0 | 700 | 0.1315 | 0.4681 | 0.9361 | 0.9361 | nan | 0.9361 | 0.0 | 0.9361 |
| 0.1981 | 36.0 | 720 | 0.1289 | 0.4766 | 0.9531 | 0.9531 | nan | 0.9531 | 0.0 | 0.9531 |
| 0.2742 | 37.0 | 740 | 0.1284 | 0.4818 | 0.9635 | 0.9635 | nan | 0.9635 | 0.0 | 0.9635 |
| 0.0418 | 38.0 | 760 | 0.1240 | 0.4762 | 0.9525 | 0.9525 | nan | 0.9525 | 0.0 | 0.9525 |
| 0.1946 | 39.0 | 780 | 0.1253 | 0.4750 | 0.9500 | 0.9500 | nan | 0.9500 | 0.0 | 0.9500 |
| 0.1692 | 40.0 | 800 | 0.1253 | 0.4836 | 0.9672 | 0.9672 | nan | 0.9672 | 0.0 | 0.9672 |
| 0.3071 | 41.0 | 820 | 0.1227 | 0.4751 | 0.9503 | 0.9503 | nan | 0.9503 | 0.0 | 0.9503 |
| 0.2003 | 42.0 | 840 | 0.1250 | 0.4762 | 0.9524 | 0.9524 | nan | 0.9524 | 0.0 | 0.9524 |
| 0.2099 | 43.0 | 860 | 0.1235 | 0.4740 | 0.9480 | 0.9480 | nan | 0.9480 | 0.0 | 0.9480 |
| 0.1218 | 44.0 | 880 | 0.1222 | 0.4743 | 0.9486 | 0.9486 | nan | 0.9486 | 0.0 | 0.9486 |
| 0.1583 | 45.0 | 900 | 0.1226 | 0.4708 | 0.9415 | 0.9415 | nan | 0.9415 | 0.0 | 0.9415 |
| 0.1506 | 46.0 | 920 | 0.1215 | 0.4686 | 0.9372 | 0.9372 | nan | 0.9372 | 0.0 | 0.9372 |
| 0.0643 | 47.0 | 940 | 0.1234 | 0.4779 | 0.9559 | 0.9559 | nan | 0.9559 | 0.0 | 0.9559 |
| 0.2006 | 48.0 | 960 | 0.1213 | 0.4757 | 0.9515 | 0.9515 | nan | 0.9515 | 0.0 | 0.9515 |
| 0.0783 | 49.0 | 980 | 0.1241 | 0.4726 | 0.9452 | 0.9452 | nan | 0.9452 | 0.0 | 0.9452 |
| 0.0552 | 50.0 | 1000 | 0.1227 | 0.4744 | 0.9489 | 0.9489 | nan | 0.9489 | 0.0 | 0.9489 |
### Framework versions
- Transformers 4.36.2
- Pytorch 1.11.0+cu115
- Datasets 2.15.0
- Tokenizers 0.15.0
|
TheBloke/Valkyrie-V1-GGUF | TheBloke | 2023-12-23T12:36:30Z | 72 | 1 | transformers | [
"transformers",
"gguf",
"mistral",
"base_model:cookinai/Valkyrie-V1",
"base_model:quantized:cookinai/Valkyrie-V1",
"license:apache-2.0",
"region:us"
] | null | 2023-12-23T12:31:57Z | ---
base_model: cookinai/Valkyrie-V1
inference: false
license: apache-2.0
model_creator: John Smith
model_name: Valkyrie v1
model_type: mistral
prompt_template: '{prompt}
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Valkyrie v1 - GGUF
- Model creator: [John Smith](https://huggingface.co/cookinai)
- Original model: [Valkyrie v1](https://huggingface.co/cookinai/Valkyrie-V1)
<!-- description start -->
## Description
This repo contains GGUF format model files for [John Smith's Valkyrie v1](https://huggingface.co/cookinai/Valkyrie-V1).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Valkyrie-V1-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Valkyrie-V1-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF)
* [John Smith's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/cookinai/Valkyrie-V1)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Unknown
```
{prompt}
```
<!-- prompt-template end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [valkyrie-v1.Q2_K.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q2_K.gguf) | Q2_K | 2 | 3.08 GB| 5.58 GB | smallest, significant quality loss - not recommended for most purposes |
| [valkyrie-v1.Q3_K_S.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q3_K_S.gguf) | Q3_K_S | 3 | 3.17 GB| 5.67 GB | very small, high quality loss |
| [valkyrie-v1.Q3_K_M.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q3_K_M.gguf) | Q3_K_M | 3 | 3.52 GB| 6.02 GB | very small, high quality loss |
| [valkyrie-v1.Q3_K_L.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q3_K_L.gguf) | Q3_K_L | 3 | 3.82 GB| 6.32 GB | small, substantial quality loss |
| [valkyrie-v1.Q4_0.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q4_0.gguf) | Q4_0 | 4 | 4.11 GB| 6.61 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [valkyrie-v1.Q4_K_S.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q4_K_S.gguf) | Q4_K_S | 4 | 4.14 GB| 6.64 GB | small, greater quality loss |
| [valkyrie-v1.Q4_K_M.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q4_K_M.gguf) | Q4_K_M | 4 | 4.37 GB| 6.87 GB | medium, balanced quality - recommended |
| [valkyrie-v1.Q5_0.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q5_0.gguf) | Q5_0 | 5 | 5.00 GB| 7.50 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [valkyrie-v1.Q5_K_S.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q5_K_S.gguf) | Q5_K_S | 5 | 5.00 GB| 7.50 GB | large, low quality loss - recommended |
| [valkyrie-v1.Q5_K_M.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q5_K_M.gguf) | Q5_K_M | 5 | 5.13 GB| 7.63 GB | large, very low quality loss - recommended |
| [valkyrie-v1.Q6_K.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q6_K.gguf) | Q6_K | 6 | 5.94 GB| 8.44 GB | very large, extremely low quality loss |
| [valkyrie-v1.Q8_0.gguf](https://huggingface.co/TheBloke/Valkyrie-V1-GGUF/blob/main/valkyrie-v1.Q8_0.gguf) | Q8_0 | 8 | 7.70 GB| 10.20 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/Valkyrie-V1-GGUF and below it, a specific filename to download, such as: valkyrie-v1.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/Valkyrie-V1-GGUF valkyrie-v1.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/Valkyrie-V1-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/Valkyrie-V1-GGUF valkyrie-v1.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m valkyrie-v1.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "{prompt}"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 โ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; e.g. for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./valkyrie-v1.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"{prompt}", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./valkyrie-v1.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
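As a rough sketch of the llama-cpp-python route (LangChain's API changes frequently, so treat the import path and parameters as assumptions to verify against the linked guide):
```python
from langchain.llms import LlamaCpp

# Point LangChain's LlamaCpp wrapper at the downloaded GGUF file.
# Paths and layer counts are illustrative, not values mandated by this README.
llm = LlamaCpp(
    model_path="./valkyrie-v1.Q4_K_M.gguf",
    n_ctx=32768,       # max sequence length, as in the example above
    n_gpu_layers=35,   # set to 0 if no GPU acceleration is available
    temperature=0.7,
    max_tokens=512,
)

print(llm("Tell me about AI"))
```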
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: John Smith's Valkyrie v1
Slerp merge of mindy-labs/mindy-7b-v2 with jondurbin/bagel-dpo-7b-v0.1. This model was then slerp merged with rishiraj/CatPPT.
Heard some talk of jondurbin/bagel-dpo-7b-v0.1 in the community and it sounds interesting. Merged it with two high-performing models to get cookinai/Valkyrie-V1.
Slerp 1:
```yaml
slices:
- sources:
- model: jondurbin/bagel-dpo-7b-v0.1
layer_range: [0, 32]
- model: mindy-labs/mindy-7b-v2
layer_range: [0, 32]
merge_method: slerp
base_model: mindy-labs/mindy-7b-v2
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
Slerp 2:
```yaml
slices:
- sources:
- model: previous/model/path
layer_range: [0, 32]
- model: rishiraj/CatPPT
layer_range: [0, 32]
merge_method: slerp
base_model: previous/model/path
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
<!-- original-model-card end -->
|
TheBloke/Mixtral_7Bx2_MoE-GGUF | TheBloke | 2023-12-23T12:31:53Z | 624 | 23 | transformers | [
"transformers",
"gguf",
"mixtral",
"base_model:cloudyu/Mixtral_7Bx2_MoE",
"base_model:quantized:cloudyu/Mixtral_7Bx2_MoE",
"license:cc-by-nc-4.0",
"region:us"
] | null | 2023-12-23T12:08:27Z | ---
base_model: cloudyu/Mixtral_7Bx2_MoE
inference: false
license: cc-by-nc-4.0
model_creator: hai
model_name: Mixtral 7Bx2 MoE
model_type: mixtral
prompt_template: '{prompt}
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Mixtral 7Bx2 MoE - GGUF
- Model creator: [hai](https://huggingface.co/cloudyu)
- Original model: [Mixtral 7Bx2 MoE](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE)
<!-- description start -->
## Description
This repo contains GGUF format model files for [hai's Mixtral 7Bx2 MoE](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF)
* [hai's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Unknown
```
{prompt}
```
<!-- prompt-template end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [mixtral_7bx2_moe.Q2_K.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q2_K.gguf) | Q2_K | 2 | 4.36 GB| 6.86 GB | smallest, significant quality loss - not recommended for most purposes |
| [mixtral_7bx2_moe.Q3_K_M.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q3_K_M.gguf) | Q3_K_M | 3 | 5.68 GB| 8.18 GB | very small, high quality loss |
| [mixtral_7bx2_moe.Q4_0.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q4_0.gguf) | Q4_0 | 4 | 7.28 GB| 9.78 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [mixtral_7bx2_moe.Q4_K_M.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q4_K_M.gguf) | Q4_K_M | 4 | 7.30 GB| 9.80 GB | medium, balanced quality - recommended |
| [mixtral_7bx2_moe.Q5_0.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q5_0.gguf) | Q5_0 | 5 | 8.87 GB| 11.37 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [mixtral_7bx2_moe.Q5_K_M.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q5_K_M.gguf) | Q5_K_M | 5 | 8.88 GB| 11.38 GB | large, very low quality loss - recommended |
| [mixtral_7bx2_moe.Q6_K.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q6_K.gguf) | Q6_K | 6 | 10.57 GB| 13.07 GB | very large, extremely low quality loss |
| [mixtral_7bx2_moe.Q8_0.gguf](https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF/blob/main/mixtral_7bx2_moe.Q8_0.gguf) | Q8_0 | 8 | 13.69 GB| 16.19 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/Mixtral_7Bx2_MoE-GGUF and below it, a specific filename to download, such as: mixtral_7bx2_moe.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/Mixtral_7Bx2_MoE-GGUF mixtral_7bx2_moe.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/Mixtral_7Bx2_MoE-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/Mixtral_7Bx2_MoE-GGUF mixtral_7bx2_moe.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m mixtral_7bx2_moe.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "{prompt}"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variables in PowerShell, follow this format; eg for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./mixtral_7bx2_moe.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"{prompt}", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./mixtral_7bx2_moe.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
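As a brief sketch of the llama-cpp-python route (assuming a recent `langchain-community` release; older versions import `LlamaCpp` from `langchain.llms` instead, and the model path below is just an example pointing at one of the GGUF files from this repo):
```python
from langchain_community.llms import LlamaCpp

# Path to a GGUF file downloaded from this repo (example path).
llm = LlamaCpp(
    model_path="./mixtral_7bx2_moe.Q4_K_M.gguf",
    n_ctx=32768,      # max sequence length supported by this model
    n_gpu_layers=35,  # set to 0 if you have no GPU acceleration
    temperature=0.7,
)

print(llm.invoke("Write a short story about llamas."))
```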
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: hai's Mixtral 7Bx2 MoE
# Mixtral MOE 2x7B
MoE of the following models :
* [rwitz2/go-bruins-v2.1.1](https://huggingface.co/rwitz2/go-bruins-v2.1.1)
* [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
* [mncai/mistral-7b-dpo-v6](https://huggingface.co/mncai/mistral-7b-dpo-v6)
GPU code example:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

## v2 models
model_path = "cloudyu/Mixtral_7Bx2_MoE"

tokenizer = AutoTokenizer.from_pretrained(model_path, use_default_system_prompt=False)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float32, device_map='auto', local_files_only=False, load_in_4bit=True
)
print(model)

prompt = input("please input prompt:")
while len(prompt) > 0:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to("cuda")
    generation_output = model.generate(
        input_ids=input_ids, max_new_tokens=500, repetition_penalty=1.2
    )
    print(tokenizer.decode(generation_output[0]))
    prompt = input("please input prompt:")
```
CPU code example:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

## v2 models
model_path = "cloudyu/Mixtral_7Bx2_MoE"

tokenizer = AutoTokenizer.from_pretrained(model_path, use_default_system_prompt=False)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float32, device_map='cpu', local_files_only=False
)
print(model)

prompt = input("please input prompt:")
while len(prompt) > 0:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    generation_output = model.generate(
        input_ids=input_ids, max_new_tokens=500, repetition_penalty=1.2
    )
    print(tokenizer.decode(generation_output[0]))
    prompt = input("please input prompt:")
```
<!-- original-model-card end -->
|
TheBloke/bun_mistral_7b_v2-GPTQ | TheBloke | 2023-12-23T12:31:00Z | 23 | 2 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"CoT",
"en",
"base_model:aloobun/bun_mistral_7b_v2",
"base_model:quantized:aloobun/bun_mistral_7b_v2",
"license:cc",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"gptq",
"region:us"
] | text-generation | 2023-12-23T12:02:11Z | ---
base_model: aloobun/bun_mistral_7b_v2
inference: false
language:
- en
license: cc
model_creator: wh0ois
model_name: Bun Mistral 7B v2
model_type: mistral
prompt_template: '{prompt}
'
quantized_by: TheBloke
tags:
- CoT
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Bun Mistral 7B v2 - GPTQ
- Model creator: [wh0ois](https://huggingface.co/aloobun)
- Original model: [Bun Mistral 7B v2](https://huggingface.co/aloobun/bun_mistral_7b_v2)
<!-- description start -->
## Description
This repo contains GPTQ model files for [wh0ois's Bun Mistral 7B v2](https://huggingface.co/aloobun/bun_mistral_7b_v2).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/bun_mistral_7b_v2-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GGUF)
* [wh0ois's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/aloobun/bun_mistral_7b_v2)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Unknown
```
{prompt}
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-compatible clients start -->
## Known compatible clients / servers
GPTQ models are currently supported on Linux (NVidia/AMD) and Windows (NVidia only). macOS users: please use GGUF models.
These GPTQ models are known to work in the following inference servers/webuis.
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [KoboldAI United](https://github.com/henk717/koboldai)
- [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui)
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
This may not be a complete list; if you know of others, please let me know!
<!-- README_GPTQ.md-compatible clients end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama and Mistral models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.16 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.57 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 7.52 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 7.68 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 8.17 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.29 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/bun_mistral_7b_v2-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/bun_mistral_7b_v2-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `bun_mistral_7b_v2-GPTQ`:
```shell
mkdir bun_mistral_7b_v2-GPTQ
huggingface-cli download TheBloke/bun_mistral_7b_v2-GPTQ --local-dir bun_mistral_7b_v2-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir bun_mistral_7b_v2-GPTQ
huggingface-cli download TheBloke/bun_mistral_7b_v2-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir bun_mistral_7b_v2-GPTQ --local-dir-use-symlinks False
```
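The same branch download can also be done from Python with `huggingface_hub.snapshot_download` (a minimal sketch; the target folder name is just an example, and `local_dir_use_symlinks` mirrors the CLI flag used above):
```python
from huggingface_hub import snapshot_download

# Fetch the gptq-4bit-32g-actorder_True branch into a local folder.
snapshot_download(
    repo_id="TheBloke/bun_mistral_7b_v2-GPTQ",
    revision="gptq-4bit-32g-actorder_True",
    local_dir="bun_mistral_7b_v2-GPTQ",
    local_dir_use_symlinks=False,
)
```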
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows for interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder and it's harder to know where your disk space is being used, and to clear it up if/when you want to remove a download model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir bun_mistral_7b_v2-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/bun_mistral_7b_v2-GPTQ --local-dir bun_mistral_7b_v2-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/bun_mistral_7b_v2-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/bun_mistral_7b_v2-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `bun_mistral_7b_v2-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
- Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/bun_mistral_7b_v2-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''{prompt}
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## Python code example: inference from this GPTQ model
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install --upgrade transformers optimum
# If using PyTorch 2.1 + CUDA 12.x:
pip3 install --upgrade auto-gptq
# or, if using PyTorch 2.1 + CUDA 11.x:
pip3 install --upgrade auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
```
If you are using PyTorch 2.0, you will need to install AutoGPTQ from source. Likewise if you have problems with the pre-built wheels, you should try building from source:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.5.1
pip3 install .
```
### Example Python code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/bun_mistral_7b_v2-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Write a story about llamas"
system_message = "You are a story writing assistant"
prompt_template=f'''{prompt}
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with Transformers. For non-Mistral models, AutoGPTQ can also be used directly.
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama architecture models (including Mistral, Yi, DeepSeek, SOLAR, etc) in 4-bit. Please see the Provided Files table above for per-file compatibility.
For a list of clients/servers, please see "Known compatible clients / servers", above.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: wh0ois's Bun Mistral 7B v2
Fine-tuned from mistralai/Mistral-7B-v0.1 for CoT reasoning.
gguf : [aloobun/bun_mistral_7b_v2-GGUF](https://huggingface.co/aloobun/bun_mistral_7b_v2-GGUF)
Fine-tuning language models is like tuning the strings of an AI banjo in the cosmic saloon of the digital frontier. We're not just slinging code; it's a harmonious quest to shape the minds of silicon wanderers, crafting binary ballads and electronic echoes. Picture it as cybernetic bardic magic, where we, the tech sorcerers, weave algorithms with strands of imagination. But, in this cosmic hoedown, there's a twist โ as we twang the strings of artificial intelligence, we're also seeding the algorithms with a bit of human stardust, adding quirks and quirksome biases. So, as we two-step into this frontier of creation, are we summoning AI troubadours of the future or just conjuring interstellar jesters, spinning tales of silicon whimsy and digital campfire banter?
|
TheBloke/bun_mistral_7b_v2-AWQ | TheBloke | 2023-12-23T12:20:12Z | 8 | 2 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"CoT",
"en",
"base_model:aloobun/bun_mistral_7b_v2",
"base_model:quantized:aloobun/bun_mistral_7b_v2",
"license:cc",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"awq",
"region:us"
] | text-generation | 2023-12-23T12:02:11Z | ---
base_model: aloobun/bun_mistral_7b_v2
inference: false
language:
- en
license: cc
model_creator: wh0ois
model_name: Bun Mistral 7B v2
model_type: mistral
prompt_template: '{prompt}
'
quantized_by: TheBloke
tags:
- CoT
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Bun Mistral 7B v2 - AWQ
- Model creator: [wh0ois](https://huggingface.co/aloobun)
- Original model: [Bun Mistral 7B v2](https://huggingface.co/aloobun/bun_mistral_7b_v2)
<!-- description start -->
## Description
This repo contains AWQ model files for [wh0ois's Bun Mistral 7B v2](https://huggingface.co/aloobun/bun_mistral_7b_v2).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
### About AWQ
AWQ is an efficient, accurate and blazing-fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ, it offers faster Transformers-based inference with equivalent or better quality compared to the most commonly used GPTQ settings.
AWQ models are currently supported on Linux and Windows, with NVidia GPUs only. macOS users: please use GGUF models instead.
It is supported by:
- [Text Generation Webui](https://github.com/oobabooga/text-generation-webui) - using Loader: AutoAWQ
- [vLLM](https://github.com/vllm-project/vllm) - version 0.2.2 or later for support for all model types.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later, from any code or client that supports Transformers
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) - for use from Python code
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/bun_mistral_7b_v2-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/bun_mistral_7b_v2-GGUF)
* [wh0ois's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/aloobun/bun_mistral_7b_v2)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Unknown
```
{prompt}
```
<!-- prompt-template end -->
<!-- README_AWQ.md-provided-files start -->
## Provided files, and AWQ parameters
I currently release 128g GEMM models only. The addition of group_size 32 models, and GEMV kernel models, is being actively considered.
Models are released as sharded safetensors files.
| Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
| ------ | ---- | -- | ----------- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/bun_mistral_7b_v2-AWQ/tree/main) | 4 | 128 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.15 GB
<!-- README_AWQ.md-provided-files end -->
<!-- README_AWQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/bun_mistral_7b_v2-AWQ`.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `bun_mistral_7b_v2-AWQ`
7. Select **Loader: AutoAWQ**.
8. Click Load, and the model will load and is now ready for use.
9. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
10. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_AWQ.md-text-generation-webui end -->
<!-- README_AWQ.md-use-from-vllm start -->
## Multi-user inference server: vLLM
Documentation on installing and using vLLM [can be found here](https://vllm.readthedocs.io/en/latest/).
- Please ensure you are using vLLM version 0.2 or later.
- When using vLLM as a server, pass the `--quantization awq` parameter.
For example:
```shell
python3 -m vllm.entrypoints.api_server --model TheBloke/bun_mistral_7b_v2-AWQ --quantization awq --dtype auto
```
- When using vLLM from Python code, again set `quantization=awq`.
For example:
```python
from vllm import LLM, SamplingParams
prompts = [
"Tell me about AI",
"Write a story about llamas",
"What is 291 - 150?",
"How much wood would a woodchuck chuck if a woodchuck could chuck wood?",
]
prompt_template = '''{prompt}
'''
prompts = [prompt_template.format(prompt=prompt) for prompt in prompts]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model="TheBloke/bun_mistral_7b_v2-AWQ", quantization="awq", dtype="auto")
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
<!-- README_AWQ.md-use-from-vllm end -->
<!-- README_AWQ.md-use-from-tgi start -->
## Multi-user inference server: Hugging Face Text Generation Inference (TGI)
Use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/bun_mistral_7b_v2-AWQ --port 3000 --quantize awq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires [huggingface-hub](https://github.com/huggingface/huggingface_hub) 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''{prompt}
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: ", response)
```
<!-- README_AWQ.md-use-from-tgi end -->
<!-- README_AWQ.md-use-from-python start -->
## Inference from Python code using Transformers
### Install the necessary packages
- Requires: [Transformers](https://huggingface.co/docs/transformers) 4.35.0 or later.
- Requires: [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) 0.1.6 or later.
```shell
pip3 install --upgrade "autoawq>=0.1.6" "transformers>=4.35.0"
```
Note that if you are using PyTorch 2.0.1, the above AutoAWQ command will automatically upgrade you to PyTorch 2.1.0.
If you are using CUDA 11.8 and wish to continue using PyTorch 2.0.1, instead run this command:
```shell
pip3 install https://github.com/casper-hansen/AutoAWQ/releases/download/v0.1.6/autoawq-0.1.6+cu118-cp310-cp310-linux_x86_64.whl
```
If you have problems installing [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y autoawq
git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
```
### Transformers example code (requires Transformers 4.35.0 and later)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer
model_name_or_path = "TheBloke/bun_mistral_7b_v2-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoModelForCausalLM.from_pretrained(
model_name_or_path,
low_cpu_mem_usage=True,
device_map="cuda:0"
)
# Using the text streamer to stream output one token at a time
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
prompt = "Tell me about AI"
prompt_template=f'''{prompt}
'''
# Convert prompt to tokens
tokens = tokenizer(
prompt_template,
return_tensors='pt'
).input_ids.cuda()
generation_params = {
"do_sample": True,
"temperature": 0.7,
"top_p": 0.95,
"top_k": 40,
"max_new_tokens": 512,
"repetition_penalty": 1.1
}
# Generate streamed output, visible one token at a time
generation_output = model.generate(
tokens,
streamer=streamer,
**generation_params
)
# Generation without a streamer, which will include the prompt in the output
generation_output = model.generate(
tokens,
**generation_params
)
# Get the tokens from the output, decode them, print them
token_output = generation_output[0]
text_output = tokenizer.decode(token_output)
print("model.generate output: ", text_output)
# Inference is also possible via Transformers' pipeline
from transformers import pipeline
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
**generation_params
)
pipe_output = pipe(prompt_template)[0]['generated_text']
print("pipeline output: ", pipe_output)
```
<!-- README_AWQ.md-use-from-python end -->
<!-- README_AWQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with:
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui) using `Loader: AutoAWQ`.
- [vLLM](https://github.com/vllm-project/vllm) version 0.2.0 and later.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) version 1.1.0 and later.
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later.
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) version 0.1.1 and later.
<!-- README_AWQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: wh0ois's Bun Mistral 7B v2
Fine-tuned from mistralai/Mistral-7B-v0.1 for CoT reasoning.
gguf : [aloobun/bun_mistral_7b_v2-GGUF](https://huggingface.co/aloobun/bun_mistral_7b_v2-GGUF)
Fine-tuning language models is like tuning the strings of an AI banjo in the cosmic saloon of the digital frontier. We're not just slinging code; it's a harmonious quest to shape the minds of silicon wanderers, crafting binary ballads and electronic echoes. Picture it as cybernetic bardic magic, where we, the tech sorcerers, weave algorithms with strands of imagination. But, in this cosmic hoedown, there's a twist โ as we twang the strings of artificial intelligence, we're also seeding the algorithms with a bit of human stardust, adding quirks and quirksome biases. So, as we two-step into this frontier of creation, are we summoning AI troubadours of the future or just conjuring interstellar jesters, spinning tales of silicon whimsy and digital campfire banter?
|
bdsaglam/llama-2-7b-chat-hf-kg-cons-multi-peft-1703317593 | bdsaglam | 2023-12-23T12:17:02Z | 0 | 0 | peft | [
"peft",
"region:us"
] | null | 2023-12-23T12:16:51Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
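For reference, a minimal sketch of an equivalent `BitsAndBytesConfig` object (assuming the standard `transformers` API; the values simply mirror the list above):
```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
```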
### Framework versions
- PEFT 0.4.0
|
Shamik/whisper-tiny-polyAI-minds14 | Shamik | 2023-12-23T12:15:19Z | 4 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"en",
"dataset:PolyAI/minds14",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-12-23T11:40:36Z | ---
language:
- en
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_trainer
datasets:
- PolyAI/minds14
metrics:
- wer
model-index:
- name: Whisper Tiny finetuned on PolyAI Minds14 English US
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Speech Transcription in English from e-banking domain.
type: PolyAI/minds14
config: en-US
split: train
args: en-US
metrics:
- name: Wer
type: wer
value: 0.3822590938098277
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Tiny finetuned on PolyAI Minds14 English US
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the Speech Transcription in English from e-banking domain dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8668
- Wer Ortho: 0.4009
- Wer: 0.3823
## Model description
More information needed
## Intended uses & limitations
More information needed
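As a minimal usage sketch (assuming the standard `transformers` ASR pipeline; the audio file name below is a placeholder):
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Shamik/whisper-tiny-polyAI-minds14",
)

# Replace with a real (ideally 16 kHz) audio file; this path is a placeholder.
print(asr("banking_call_sample.wav")["text"])
```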
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 50
- training_steps: 400
- mixed_precision_training: Native AMP
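For reference, a rough `Seq2SeqTrainingArguments` equivalent of the list above (a sketch only; `output_dir` is an example and data/model setup is omitted):
```python
from transformers import Seq2SeqTrainingArguments

# Rough equivalent of the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-polyAI-minds14",  # example output directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="constant_with_warmup",
    warmup_steps=50,
    max_steps=400,
    fp16=True,  # "Native AMP" mixed precision
)
```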
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|
| 0.3501 | 3.57 | 100 | 0.7134 | 0.4568 | 0.4212 |
| 0.044 | 7.14 | 200 | 0.7639 | 0.4096 | 0.3746 |
| 0.0048 | 10.71 | 300 | 0.8265 | 0.4109 | 0.3854 |
| 0.0021 | 14.29 | 400 | 0.8668 | 0.4009 | 0.3823 |
### Framework versions
- Transformers 4.36.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.15.0
|
ljvmiranda921/xx_lat_sigtyp_trf | ljvmiranda921 | 2023-12-23T12:06:36Z | 2 | 0 | spacy | [
"spacy",
"token-classification",
"multilingual",
"model-index",
"region:us"
] | token-classification | 2023-11-30T11:39:12Z | ---
tags:
- spacy
- token-classification
language:
- multilingual
model-index:
- name: xx_lat_sigtyp_trf
results:
- task:
name: TAG
type: token-classification
metrics:
- name: TAG (XPOS) Accuracy
type: accuracy
value: 0.8715722514
- task:
name: POS
type: token-classification
metrics:
- name: POS (UPOS) Accuracy
type: accuracy
value: 0.9493681767
- task:
name: MORPH
type: token-classification
metrics:
- name: Morph (UFeats) Accuracy
type: accuracy
value: 0.8817587897
- task:
name: LEMMA
type: token-classification
metrics:
- name: Lemma Accuracy
type: accuracy
value: 0.922276898
- task:
name: UNLABELED_DEPENDENCIES
type: token-classification
metrics:
- name: Unlabeled Attachment Score (UAS)
type: f_score
value: 0.7655232685
- task:
name: LABELED_DEPENDENCIES
type: token-classification
metrics:
- name: Labeled Attachment Score (LAS)
type: f_score
value: 0.7003277256
- task:
name: SENTS
type: token-classification
metrics:
- name: Sentences F-Score
type: f_score
value: 0.8818681319
---
| Feature | Description |
| --- | --- |
| **Name** | `xx_lat_sigtyp_trf` |
| **Version** | `0.1.0` |
| **spaCy** | `>=3.6.1,<3.7.0` |
| **Default Pipeline** | `transformer`, `parser`, `trainable_lemmatizer`, `tagger`, `morphologizer` |
| **Components** | `transformer`, `parser`, `trainable_lemmatizer`, `tagger`, `morphologizer` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | n/a |
| **License** | n/a |
| **Author** | [n/a]() |
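A minimal usage sketch (assuming the pipeline package has been installed locally, e.g. via `pip install` of the released wheel; the Latin sentence is just an example):
```python
import spacy

# The pipeline package must be installed first (e.g. pip install the released wheel).
nlp = spacy.load("xx_lat_sigtyp_trf")

doc = nlp("Gallia est omnis divisa in partes tres")
for token in doc:
    # Surface form, UPOS, fine-grained tag, morphology, lemma and dependency relation.
    print(token.text, token.pos_, token.tag_, token.morph, token.lemma_, token.dep_, token.head.text)
```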
### Label Scheme
<details>
<summary>View label scheme (3696 labels for 3 components)</summary>
| Component | Labels |
| --- | --- |
| **`parser`** | `ROOT`, `acl`, `acl:relcl`, `advcl`, `advcl:abs`, `advcl:cmp`, `advmod`, `advmod:emph`, `advmod:lmod`, `advmod:neg`, `advmod:tmod`, `amod`, `appos`, `aux`, `aux:pass`, `case`, `cc`, `ccomp`, `conj`, `cop`, `csubj`, `csubj:pass`, `dep`, `det`, `discourse`, `dislocated`, `fixed`, `flat:foreign`, `flat:name`, `mark`, `nmod`, `nsubj`, `nsubj:outer`, `nsubj:pass`, `nummod`, `obj`, `obl`, `obl:agent`, `obl:arg`, `orphan`, `parataxis`, `punct`, `vocative`, `xcomp` |
| **`tagger`** | `---------`, `---------__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `---------__Aspect=Imp\|Tense=Pres\|VerbForm=Inf\|Voice=Act`, `---------__Aspect=Imp\|Tense=Pres\|VerbForm=Inf\|Voice=Pass`, `---------__Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `---------__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `---------__Case=Acc\|Gender=Masc\|Number=Sing`, `---------__Case=Dat\|Gender=Fem\|Number=Sing`, `---------__Case=Dat\|Gender=Masc\|Number=Sing`, `---------__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `---------__Case=Nom\|Gender=Fem\|Number=Sing`, `---------__Case=Nom\|Gender=Masc\|Number=Sing`, `---------__NumType=Card\|PronType=Ind`, `---------__PronType=Dem`, `---pnp---__Aspect=Imp\|Tense=Pres\|VerbForm=Inf\|Voice=Pass`, `--p---ma-__Case=Acc\|Gender=Masc\|Number=Plur`, `-2spia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `-3plsa---__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `-3sria---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `A-`, `A-__Case=Abl\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Abl\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `A-__Case=Abl\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `A-__Case=Abl\|Degree=Cmp\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Abl\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `A-__Case=Abl\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `A-__Case=Abl\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `A-__Case=Abl\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `A-__Case=Abl\|Degree=Cmp\|Number=Plur`, `A-__Case=Abl\|Degree=Cmp\|Number=Sing`, `A-__Case=Abl\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur`, `A-__Case=Abl\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Abl\|Degree=Pos\|Gender=Fem\|Number=Plur`, `A-__Case=Abl\|Degree=Pos\|Gender=Fem\|Number=Sing`, `A-__Case=Abl\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur`, `A-__Case=Abl\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Abl\|Degree=Pos\|Gender=Masc\|Number=Plur`, `A-__Case=Abl\|Degree=Pos\|Gender=Masc\|Number=Sing`, `A-__Case=Abl\|Degree=Pos\|Gender=Neut\|Number=Plur`, `A-__Case=Abl\|Degree=Pos\|Gender=Neut\|Number=Sing`, `A-__Case=Abl\|Degree=Pos\|Number=Plur`, `A-__Case=Abl\|Degree=Pos\|Number=Sing`, `A-__Case=Abl\|Degree=Sup\|Gender=Fem\|Number=Plur`, `A-__Case=Abl\|Degree=Sup\|Gender=Fem\|Number=Sing`, `A-__Case=Abl\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Abl\|Degree=Sup\|Gender=Masc\|Number=Plur`, `A-__Case=Abl\|Degree=Sup\|Gender=Masc\|Number=Sing`, `A-__Case=Abl\|Degree=Sup\|Gender=Neut\|Number=Plur`, `A-__Case=Abl\|Degree=Sup\|Gender=Neut\|Number=Sing`, `A-__Case=Abl\|Degree=Sup\|Number=Plur`, `A-__Case=Acc\|Degree=Cmp\|Gender=Fem,Masc\|Number=Plur`, `A-__Case=Acc\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Acc\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `A-__Case=Acc\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `A-__Case=Acc\|Degree=Cmp\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Acc\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `A-__Case=Acc\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `A-__Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `A-__Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `A-__Case=Acc\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur`, `A-__Case=Acc\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur`, `A-__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `A-__Case=Acc\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur`, 
`A-__Case=Acc\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur`, `A-__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `A-__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur`, `A-__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `A-__Case=Acc\|Degree=Sup\|Gender=Fem\|Number=Plur`, `A-__Case=Acc\|Degree=Sup\|Gender=Fem\|Number=Sing`, `A-__Case=Acc\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Acc\|Degree=Sup\|Gender=Masc\|Number=Plur`, `A-__Case=Acc\|Degree=Sup\|Gender=Masc\|Number=Sing`, `A-__Case=Acc\|Degree=Sup\|Gender=Neut\|Number=Plur`, `A-__Case=Acc\|Degree=Sup\|Gender=Neut\|Number=Sing`, `A-__Case=Dat\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `A-__Case=Dat\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `A-__Case=Dat\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `A-__Case=Dat\|Degree=Cmp\|Number=Plur`, `A-__Case=Dat\|Degree=Cmp\|Number=Sing`, `A-__Case=Dat\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur`, `A-__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `A-__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing`, `A-__Case=Dat\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur`, `A-__Case=Dat\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur`, `A-__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `A-__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `A-__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `A-__Case=Dat\|Degree=Pos\|Number=Plur`, `A-__Case=Dat\|Degree=Pos\|Number=Sing`, `A-__Case=Dat\|Degree=Sup\|Gender=Fem\|Number=Sing`, `A-__Case=Dat\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Dat\|Degree=Sup\|Gender=Masc\|Number=Plur`, `A-__Case=Dat\|Degree=Sup\|Gender=Masc\|Number=Sing`, `A-__Case=Gen\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Gen\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `A-__Case=Gen\|Degree=Cmp\|Gender=Masc,Neut\|Number=Plur`, `A-__Case=Gen\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `A-__Case=Gen\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `A-__Case=Gen\|Degree=Cmp\|Number=Plur`, `A-__Case=Gen\|Degree=Cmp\|Number=Sing`, `A-__Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Plur`, `A-__Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Sing`, `A-__Case=Gen\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur`, `A-__Case=Gen\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Plur`, `A-__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing`, `A-__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur`, `A-__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing`, `A-__Case=Gen\|Degree=Pos\|Number=Plur`, `A-__Case=Gen\|Degree=Pos\|Number=Sing`, `A-__Case=Gen\|Degree=Sup\|Gender=Fem\|Number=Sing`, `A-__Case=Gen\|Degree=Sup\|Gender=Masc,Neut\|Number=Plur`, `A-__Case=Gen\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Gen\|Degree=Sup\|Gender=Masc\|Number=Plur`, `A-__Case=Gen\|Degree=Sup\|Gender=Masc\|Number=Sing`, `A-__Case=Gen\|Degree=Sup\|Gender=Neut\|Number=Sing`, `A-__Case=Nom\|Degree=Cmp\|Gender=Fem,Masc\|Number=Plur`, `A-__Case=Nom\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Nom\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `A-__Case=Nom\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `A-__Case=Nom\|Degree=Cmp\|Gender=Masc,Neut\|Number=Plur`, `A-__Case=Nom\|Degree=Cmp\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `A-__Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `A-__Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `A-__Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `A-__Case=Nom\|Degree=Cmp\|Number=Sing`, 
`A-__Case=Nom\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur`, `A-__Case=Nom\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur`, `A-__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `A-__Case=Nom\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur`, `A-__Case=Nom\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing`, `A-__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `A-__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `A-__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur`, `A-__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `A-__Case=Nom\|Degree=Pos\|Number=Plur`, `A-__Case=Nom\|Degree=Pos\|Number=Sing`, `A-__Case=Nom\|Degree=Sup\|Gender=Fem\|Number=Plur`, `A-__Case=Nom\|Degree=Sup\|Gender=Fem\|Number=Sing`, `A-__Case=Nom\|Degree=Sup\|Gender=Masc\|Number=Plur`, `A-__Case=Nom\|Degree=Sup\|Gender=Masc\|Number=Sing`, `A-__Case=Nom\|Degree=Sup\|Gender=Neut\|Number=Plur`, `A-__Case=Nom\|Degree=Sup\|Gender=Neut\|Number=Sing`, `A-__Case=Voc\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `A-__Case=Voc\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur`, `A-__Case=Voc\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing`, `A-__Case=Voc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `A-__Case=Voc\|Degree=Pos\|Gender=Masc\|Number=Plur`, `A-__Case=Voc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `A-__Case=Voc\|Degree=Sup\|Gender=Masc\|Number=Plur`, `A-__Case=Voc\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ__Case=Abl\|Gender=Fem\|Number=Sing`, `ADJ__Case=Acc\|Gender=Fem\|Number=Sing`, `ADJ__Case=Acc\|Gender=Masc\|Number=Sing`, `ADJ__Case=Gen\|Gender=Fem\|Number=Plur`, `ADJ__Case=Gen\|Gender=Fem\|Number=Sing`, `ADJ__Case=Nom\|Gender=Fem\|Number=Sing`, `ADP__AdpType=Post`, `C-`, `Df`, `Df__Degree=Cmp`, `Df__Degree=Pos`, `Df__Degree=Sup`, `Df__Polarity=Neg`, `Dq__PronType=Rel`, `Du__PronType=Int`, `F-`, `G-`, `I-`, `Ma`, `Ma__Case=Abl\|Gender=Fem,Masc\|Number=Plur`, `Ma__Case=Abl\|Gender=Fem\|Number=Plur`, `Ma__Case=Abl\|Gender=Fem\|Number=Sing`, `Ma__Case=Abl\|Gender=Masc,Neut\|Number=Plur`, `Ma__Case=Abl\|Gender=Masc,Neut\|Number=Sing`, `Ma__Case=Abl\|Gender=Masc\|Number=Plur`, `Ma__Case=Abl\|Gender=Masc\|Number=Sing`, `Ma__Case=Abl\|Gender=Neut\|Number=Plur`, `Ma__Case=Abl\|Gender=Neut\|Number=Sing`, `Ma__Case=Abl\|Number=Plur`, `Ma__Case=Acc\|Gender=Fem,Masc\|Number=Plur`, `Ma__Case=Acc\|Gender=Fem\|Number=Plur`, `Ma__Case=Acc\|Gender=Fem\|Number=Sing`, `Ma__Case=Acc\|Gender=Masc,Neut\|Number=Sing`, `Ma__Case=Acc\|Gender=Masc\|Number=Plur`, `Ma__Case=Acc\|Gender=Masc\|Number=Sing`, `Ma__Case=Acc\|Gender=Neut\|Number=Plur`, `Ma__Case=Acc\|Gender=Neut\|Number=Sing`, `Ma__Case=Acc\|Number=Plur`, `Ma__Case=Dat\|Gender=Fem\|Number=Plur`, `Ma__Case=Dat\|Gender=Fem\|Number=Sing`, `Ma__Case=Dat\|Gender=Masc\|Number=Plur`, `Ma__Case=Dat\|Gender=Masc\|Number=Sing`, `Ma__Case=Dat\|Gender=Neut\|Number=Plur`, `Ma__Case=Dat\|Number=Plur`, `Ma__Case=Gen\|Gender=Fem\|Number=Plur`, `Ma__Case=Gen\|Gender=Fem\|Number=Sing`, `Ma__Case=Gen\|Gender=Masc,Neut\|Number=Plur`, `Ma__Case=Gen\|Gender=Masc\|Number=Plur`, `Ma__Case=Gen\|Gender=Masc\|Number=Sing`, `Ma__Case=Gen\|Gender=Neut\|Number=Plur`, `Ma__Case=Gen\|Gender=Neut\|Number=Sing`, `Ma__Case=Gen\|Number=Plur`, `Ma__Case=Gen\|Number=Sing`, `Ma__Case=Nom\|Gender=Fem,Masc\|Number=Plur`, `Ma__Case=Nom\|Gender=Fem\|Number=Plur`, `Ma__Case=Nom\|Gender=Fem\|Number=Sing`, `Ma__Case=Nom\|Gender=Masc,Neut\|Number=Plur`, `Ma__Case=Nom\|Gender=Masc\|Number=Plur`, `Ma__Case=Nom\|Gender=Masc\|Number=Sing`, `Ma__Case=Nom\|Gender=Neut\|Number=Plur`, `Ma__Case=Nom\|Gender=Neut\|Number=Sing`, 
`Ma__Case=Nom\|Number=Plur`, `Mo__Case=Abl\|Gender=Fem,Masc\|Number=Plur`, `Mo__Case=Abl\|Gender=Fem\|Number=Plur`, `Mo__Case=Abl\|Gender=Fem\|Number=Sing`, `Mo__Case=Abl\|Gender=Masc,Neut\|Number=Sing`, `Mo__Case=Abl\|Gender=Masc\|Number=Plur`, `Mo__Case=Abl\|Gender=Masc\|Number=Sing`, `Mo__Case=Abl\|Gender=Neut\|Number=Plur`, `Mo__Case=Abl\|Gender=Neut\|Number=Sing`, `Mo__Case=Abl\|Number=Plur`, `Mo__Case=Acc\|Gender=Fem,Masc\|Number=Plur`, `Mo__Case=Acc\|Gender=Fem\|Number=Plur`, `Mo__Case=Acc\|Gender=Fem\|Number=Sing`, `Mo__Case=Acc\|Gender=Masc,Neut\|Number=Sing`, `Mo__Case=Acc\|Gender=Masc\|Number=Plur`, `Mo__Case=Acc\|Gender=Masc\|Number=Sing`, `Mo__Case=Acc\|Gender=Neut\|Number=Plur`, `Mo__Case=Acc\|Gender=Neut\|Number=Sing`, `Mo__Case=Dat\|Gender=Fem\|Number=Plur`, `Mo__Case=Dat\|Gender=Masc\|Number=Plur`, `Mo__Case=Dat\|Gender=Masc\|Number=Sing`, `Mo__Case=Gen\|Gender=Fem\|Number=Sing`, `Mo__Case=Gen\|Gender=Masc,Neut\|Number=Plur`, `Mo__Case=Gen\|Gender=Masc,Neut\|Number=Sing`, `Mo__Case=Gen\|Gender=Masc\|Number=Sing`, `Mo__Case=Gen\|Gender=Neut\|Number=Sing`, `Mo__Case=Nom\|Gender=Fem\|Number=Plur`, `Mo__Case=Nom\|Gender=Fem\|Number=Sing`, `Mo__Case=Nom\|Gender=Masc\|Number=Plur`, `Mo__Case=Nom\|Gender=Masc\|Number=Sing`, `Mo__Case=Nom\|Gender=Neut\|Number=Plur`, `Mo__Case=Nom\|Gender=Neut\|Number=Sing`, `Nb`, `Nb__Case=Abl\|Gender=Fem,Masc\|Number=Plur`, `Nb__Case=Abl\|Gender=Fem,Masc\|Number=Sing`, `Nb__Case=Abl\|Gender=Fem\|Number=Plur`, `Nb__Case=Abl\|Gender=Fem\|Number=Sing`, `Nb__Case=Abl\|Gender=Masc,Neut\|Number=Plur`, `Nb__Case=Abl\|Gender=Masc\|Number=Plur`, `Nb__Case=Abl\|Gender=Masc\|Number=Sing`, `Nb__Case=Abl\|Gender=Neut\|Number=Plur`, `Nb__Case=Abl\|Gender=Neut\|Number=Sing`, `Nb__Case=Abl\|Number=Plur`, `Nb__Case=Abl\|Number=Sing`, `Nb__Case=Acc\|Gender=Fem,Masc\|Number=Plur`, `Nb__Case=Acc\|Gender=Fem,Masc\|Number=Sing`, `Nb__Case=Acc\|Gender=Fem\|Number=Plur`, `Nb__Case=Acc\|Gender=Fem\|Number=Sing`, `Nb__Case=Acc\|Gender=Masc,Neut\|Number=Sing`, `Nb__Case=Acc\|Gender=Masc\|Number=Plur`, `Nb__Case=Acc\|Gender=Masc\|Number=Sing`, `Nb__Case=Acc\|Gender=Neut\|Number=Plur`, `Nb__Case=Acc\|Gender=Neut\|Number=Sing`, `Nb__Case=Dat\|Gender=Fem,Masc\|Number=Plur`, `Nb__Case=Dat\|Gender=Fem\|Number=Plur`, `Nb__Case=Dat\|Gender=Fem\|Number=Sing`, `Nb__Case=Dat\|Gender=Masc\|Number=Plur`, `Nb__Case=Dat\|Gender=Masc\|Number=Sing`, `Nb__Case=Dat\|Gender=Neut\|Number=Plur`, `Nb__Case=Dat\|Gender=Neut\|Number=Sing`, `Nb__Case=Gen\|Gender=Fem,Masc\|Number=Plur`, `Nb__Case=Gen\|Gender=Fem,Masc\|Number=Sing`, `Nb__Case=Gen\|Gender=Fem\|Number=Plur`, `Nb__Case=Gen\|Gender=Fem\|Number=Sing`, `Nb__Case=Gen\|Gender=Masc\|Number=Plur`, `Nb__Case=Gen\|Gender=Masc\|Number=Sing`, `Nb__Case=Gen\|Gender=Neut\|Number=Plur`, `Nb__Case=Gen\|Gender=Neut\|Number=Sing`, `Nb__Case=Nom\|Gender=Fem,Masc\|Number=Plur`, `Nb__Case=Nom\|Gender=Fem,Masc\|Number=Sing`, `Nb__Case=Nom\|Gender=Fem\|Number=Plur`, `Nb__Case=Nom\|Gender=Fem\|Number=Sing`, `Nb__Case=Nom\|Gender=Masc\|Number=Plur`, `Nb__Case=Nom\|Gender=Masc\|Number=Sing`, `Nb__Case=Nom\|Gender=Neut\|Number=Plur`, `Nb__Case=Nom\|Gender=Neut\|Number=Sing`, `Nb__Case=Voc\|Gender=Fem\|Number=Plur`, `Nb__Case=Voc\|Gender=Fem\|Number=Sing`, `Nb__Case=Voc\|Gender=Masc\|Number=Plur`, `Nb__Case=Voc\|Gender=Masc\|Number=Sing`, `Nb__Case=Voc\|Gender=Neut\|Number=Plur`, `Nb__Case=Voc\|Gender=Neut\|Number=Sing`, `Ne`, `Ne__Case=Abl\|Gender=Fem\|Number=Plur`, `Ne__Case=Abl\|Gender=Fem\|Number=Sing`, `Ne__Case=Abl\|Gender=Masc\|Number=Plur`, 
`Ne__Case=Abl\|Gender=Masc\|Number=Sing`, `Ne__Case=Abl\|Gender=Neut\|Number=Plur`, `Ne__Case=Abl\|Gender=Neut\|Number=Sing`, `Ne__Case=Acc\|Gender=Fem,Masc\|Number=Plur`, `Ne__Case=Acc\|Gender=Fem,Masc\|Number=Sing`, `Ne__Case=Acc\|Gender=Fem\|Number=Plur`, `Ne__Case=Acc\|Gender=Fem\|Number=Sing`, `Ne__Case=Acc\|Gender=Masc\|Number=Plur`, `Ne__Case=Acc\|Gender=Masc\|Number=Sing`, `Ne__Case=Acc\|Gender=Neut\|Number=Plur`, `Ne__Case=Acc\|Gender=Neut\|Number=Sing`, `Ne__Case=Acc\|Number=Sing`, `Ne__Case=Dat\|Gender=Fem\|Number=Plur`, `Ne__Case=Dat\|Gender=Fem\|Number=Sing`, `Ne__Case=Dat\|Gender=Masc\|Number=Plur`, `Ne__Case=Dat\|Gender=Masc\|Number=Sing`, `Ne__Case=Dat\|Gender=Neut\|Number=Plur`, `Ne__Case=Gen\|Gender=Fem\|Number=Plur`, `Ne__Case=Gen\|Gender=Fem\|Number=Sing`, `Ne__Case=Gen\|Gender=Masc,Neut\|Number=Sing`, `Ne__Case=Gen\|Gender=Masc\|Number=Plur`, `Ne__Case=Gen\|Gender=Masc\|Number=Sing`, `Ne__Case=Gen\|Gender=Neut\|Number=Sing`, `Ne__Case=Nom\|Gender=Fem\|Number=Plur`, `Ne__Case=Nom\|Gender=Fem\|Number=Sing`, `Ne__Case=Nom\|Gender=Masc\|Number=Plur`, `Ne__Case=Nom\|Gender=Masc\|Number=Sing`, `Ne__Case=Nom\|Gender=Neut\|Number=Plur`, `Ne__Case=Nom\|Gender=Neut\|Number=Sing`, `Ne__Case=Voc\|Gender=Fem\|Number=Plur`, `Ne__Case=Voc\|Gender=Fem\|Number=Sing`, `Ne__Case=Voc\|Gender=Masc\|Number=Sing`, `PART__PartType=Int`, `PROPN__Case=Abl\|Gender=Masc\|Number=Sing`, `PROPN__Case=Acc\|Gender=Masc\|Number=Sing`, `PROPN__Case=Gen\|Gender=Fem\|Number=Plur`, `PROPN__Case=Gen\|Gender=Masc\|Number=Sing`, `PROPN__Case=Nom\|Gender=Masc\|Number=Sing`, `PUNCT`, `Pc__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Rcp`, `Pd__Case=Abl\|Gender=Fem,Masc\|Number=Plur`, `Pd__Case=Abl\|Gender=Fem,Masc\|Number=Sing`, `Pd__Case=Abl\|Gender=Fem\|Number=Plur`, `Pd__Case=Abl\|Gender=Fem\|Number=Sing`, `Pd__Case=Abl\|Gender=Masc,Neut\|Number=Sing`, `Pd__Case=Abl\|Gender=Masc\|Number=Plur`, `Pd__Case=Abl\|Gender=Masc\|Number=Sing`, `Pd__Case=Abl\|Gender=Neut\|Number=Plur`, `Pd__Case=Abl\|Gender=Neut\|Number=Sing`, `Pd__Case=Abl\|Number=Plur`, `Pd__Case=Abl\|Number=Sing`, `Pd__Case=Acc\|Gender=Fem,Masc\|Number=Sing`, `Pd__Case=Acc\|Gender=Fem\|Number=Plur`, `Pd__Case=Acc\|Gender=Fem\|Number=Sing`, `Pd__Case=Acc\|Gender=Masc,Neut\|Number=Sing`, `Pd__Case=Acc\|Gender=Masc\|Number=Plur`, `Pd__Case=Acc\|Gender=Masc\|Number=Sing`, `Pd__Case=Acc\|Gender=Neut\|Number=Plur`, `Pd__Case=Acc\|Gender=Neut\|Number=Sing`, `Pd__Case=Dat\|Gender=Fem\|Number=Plur`, `Pd__Case=Dat\|Gender=Fem\|Number=Sing`, `Pd__Case=Dat\|Gender=Masc,Neut\|Number=Sing`, `Pd__Case=Dat\|Gender=Masc\|Number=Plur`, `Pd__Case=Dat\|Gender=Masc\|Number=Sing`, `Pd__Case=Dat\|Gender=Neut\|Number=Plur`, `Pd__Case=Dat\|Gender=Neut\|Number=Sing`, `Pd__Case=Dat\|Number=Plur`, `Pd__Case=Dat\|Number=Sing`, `Pd__Case=Gen\|Gender=Fem\|Number=Plur`, `Pd__Case=Gen\|Gender=Fem\|Number=Sing`, `Pd__Case=Gen\|Gender=Masc,Neut\|Number=Plur`, `Pd__Case=Gen\|Gender=Masc\|Number=Plur`, `Pd__Case=Gen\|Gender=Masc\|Number=Sing`, `Pd__Case=Gen\|Gender=Neut\|Number=Plur`, `Pd__Case=Gen\|Gender=Neut\|Number=Sing`, `Pd__Case=Gen\|Number=Sing`, `Pd__Case=Nom\|Gender=Fem,Masc\|Number=Plur`, `Pd__Case=Nom\|Gender=Fem,Masc\|Number=Sing`, `Pd__Case=Nom\|Gender=Fem\|Number=Plur`, `Pd__Case=Nom\|Gender=Fem\|Number=Sing`, `Pd__Case=Nom\|Gender=Masc,Neut\|Number=Sing`, `Pd__Case=Nom\|Gender=Masc\|Number=Plur`, `Pd__Case=Nom\|Gender=Masc\|Number=Sing`, `Pd__Case=Nom\|Gender=Neut\|Number=Plur`, `Pd__Case=Nom\|Gender=Neut\|Number=Sing`, `Pd__Case=Voc\|Gender=Fem\|Number=Sing`, 
`Pi__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Int`, `Pi__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Int`, `Pi__Case=Abl\|Gender=Masc,Neut\|Number=Sing\|PronType=Int`, `Pi__Case=Abl\|Gender=Masc\|Number=Plur\|PronType=Int`, `Pi__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Int`, `Pi__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Int`, `Pi__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Int`, `Pi__Case=Abl\|Number=Plur\|PronType=Int`, `Pi__Case=Acc\|Gender=Fem,Masc\|Number=Plur\|PronType=Int`, `Pi__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Int`, `Pi__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Int`, `Pi__Case=Acc\|Gender=Masc,Neut\|Number=Sing\|PronType=Int`, `Pi__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Int`, `Pi__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Int`, `Pi__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Int`, `Pi__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `Pi__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Int`, `Pi__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Int`, `Pi__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Int`, `Pi__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Int`, `Pi__Case=Dat\|Number=Plur\|PronType=Int`, `Pi__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Int`, `Pi__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Int`, `Pi__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Int`, `Pi__Case=Gen\|Number=Sing\|PronType=Int`, `Pi__Case=Nom\|Gender=Fem,Masc\|Number=Plur\|PronType=Int`, `Pi__Case=Nom\|Gender=Fem,Masc\|Number=Sing\|PronType=Int`, `Pi__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Int`, `Pi__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Int`, `Pi__Case=Nom\|Gender=Masc,Neut\|Number=Sing\|PronType=Int`, `Pi__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Int`, `Pi__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `Pi__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Int`, `Pi__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `Pk__Case=Abl\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Abl\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Abl\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Abl\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Abl\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Abl\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Abl\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Fem,Masc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Fem,Masc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Acc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Dat\|Gender=Fem,Masc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Dat\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Dat\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Dat\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Dat\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, 
`Pk__Case=Dat\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Dat\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Gen\|Gender=Masc,Neut\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Gen\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Gen\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pk__Case=Gen\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `Pp__Case=Abl\|Gender=Fem,Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem,Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem,Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem,Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem,Neut\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Abl\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Abl\|Gender=Masc,Neut\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Abl\|Gender=Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Abl\|Gender=Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Abl\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Abl\|Gender=Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Abl\|Gender=Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Abl\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Abl\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Abl\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Abl\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Abl\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem,Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem,Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem,Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem,Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Acc\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Acc\|Gender=Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Acc\|Gender=Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Acc\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Acc\|Gender=Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Acc\|Gender=Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Acc\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Acc\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Acc\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Acc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Acc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Acc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Acc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem,Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem,Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem,Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem,Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem\|Number=Plur\|Person=2\|PronType=Prs`, 
`Pp__Case=Dat\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Dat\|Gender=Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Gender=Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Dat\|Gender=Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Gender=Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Dat\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Dat\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Dat\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Dat\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Gender=Fem,Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Gen\|Gender=Fem,Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Gen\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc,Neut\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc,Neut\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc,Neut\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Gen\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Gender=Neut\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Gen\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Gen\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Gen\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem,Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem,Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem,Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem,Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Nom\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Nom\|Gender=Masc\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Nom\|Gender=Masc\|Number=Plur\|Person=2\|PronType=Prs`, `Pp__Case=Nom\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Nom\|Gender=Masc\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Nom\|Gender=Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pp__Case=Nom\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Nom\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `Pp__Case=Nom\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `Pp__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `Pp__Case=Nom\|Number=Plur\|Person=2\|PronType=Prs`, 
`Pp__Case=Nom\|Number=Sing\|Person=1\|PronType=Prs`, `Pp__Case=Voc\|Gender=Fem,Masc\|Number=Sing\|Person=2\|PronType=Prs`, `Pr__Case=Abl\|Gender=Fem,Masc\|Number=Plur\|PronType=Rel`, `Pr__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Rel`, `Pr__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Rel`, `Pr__Case=Abl\|Gender=Masc,Neut\|Number=Sing\|PronType=Rel`, `Pr__Case=Abl\|Gender=Masc\|Number=Plur\|PronType=Rel`, `Pr__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Rel`, `Pr__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Rel`, `Pr__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Rel`, `Pr__Case=Abl\|Number=Plur\|PronType=Rel`, `Pr__Case=Acc\|Gender=Fem,Masc\|Number=Sing\|PronType=Rel`, `Pr__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Rel`, `Pr__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Rel`, `Pr__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Rel`, `Pr__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Rel`, `Pr__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Rel`, `Pr__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Rel`, `Pr__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Rel`, `Pr__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Rel`, `Pr__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Rel`, `Pr__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Rel`, `Pr__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Rel`, `Pr__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Rel`, `Pr__Case=Dat\|Number=Plur\|PronType=Rel`, `Pr__Case=Dat\|Number=Sing\|PronType=Rel`, `Pr__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Rel`, `Pr__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Rel`, `Pr__Case=Gen\|Gender=Masc,Neut\|Number=Plur\|PronType=Rel`, `Pr__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Rel`, `Pr__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Rel`, `Pr__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Rel`, `Pr__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Rel`, `Pr__Case=Gen\|Number=Sing\|PronType=Rel`, `Pr__Case=Nom\|Gender=Fem,Masc\|Number=Plur\|PronType=Rel`, `Pr__Case=Nom\|Gender=Fem,Masc\|Number=Sing\|PronType=Rel`, `Pr__Case=Nom\|Gender=Fem,Neut\|Number=Plur\|PronType=Rel`, `Pr__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Rel`, `Pr__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Rel`, `Pr__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Rel`, `Pr__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Rel`, `Pr__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Rel`, `Pr__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Rel`, `Ps__Case=Abl\|Gender=Fem,Masc\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Fem\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Gender=Fem\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Fem\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Gender=Fem\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Masc,Neut\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Masc,Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Gender=Masc,Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Masc\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Gender=Masc\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Masc\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Gender=Masc\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Neut\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Gender=Neut\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Gender=Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Gender=Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Abl\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Abl\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Acc\|Gender=Fem\|Number=Plur\|Person=1\|Poss=Yes`, 
`Ps__Case=Acc\|Gender=Fem\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Acc\|Gender=Fem\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Acc\|Gender=Fem\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Acc\|Gender=Masc,Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Acc\|Gender=Masc,Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Acc\|Gender=Masc\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Acc\|Gender=Masc\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Acc\|Gender=Masc\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Acc\|Gender=Masc\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Acc\|Gender=Neut\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Acc\|Gender=Neut\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Acc\|Gender=Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Acc\|Gender=Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Dat\|Gender=Fem\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Dat\|Gender=Fem\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Dat\|Gender=Fem\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Dat\|Gender=Fem\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Dat\|Gender=Masc,Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Dat\|Gender=Masc,Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Dat\|Gender=Masc\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Dat\|Gender=Masc\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Dat\|Gender=Masc\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Dat\|Gender=Masc\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Dat\|Gender=Neut\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Dat\|Gender=Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Dat\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Dat\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Fem\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Fem\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Fem\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Fem\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc,Neut\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc,Neut\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc,Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc,Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Masc\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Neut\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Neut\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Gender=Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Gen\|Gender=Neut\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Gen\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Fem\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Fem\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Nom\|Gender=Fem\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Fem\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Nom\|Gender=Masc,Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Masc\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Masc\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Nom\|Gender=Masc\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Masc\|Number=Sing\|Person=2\|Poss=Yes`, `Ps__Case=Nom\|Gender=Neut\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Neut\|Number=Plur\|Person=2\|Poss=Yes`, `Ps__Case=Nom\|Gender=Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Nom\|Gender=Neut\|Number=Sing\|Person=2\|Poss=Yes`, 
`Ps__Case=Voc\|Gender=Fem\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Voc\|Gender=Masc\|Number=Plur\|Person=1\|Poss=Yes`, `Ps__Case=Voc\|Gender=Masc\|Number=Sing\|Person=1\|Poss=Yes`, `Ps__Case=Voc\|Gender=Neut\|Number=Sing\|Person=1\|Poss=Yes`, `Pt__Case=Abl\|Gender=Fem,Masc\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Fem\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Fem\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Masc,Neut\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Masc,Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Masc\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Masc\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Neut\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Gender=Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Abl\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Acc\|Gender=Fem\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Acc\|Gender=Fem\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Acc\|Gender=Masc,Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Acc\|Gender=Masc\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Acc\|Gender=Masc\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Acc\|Gender=Neut\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Acc\|Gender=Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Dat\|Gender=Fem\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Dat\|Gender=Fem\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Dat\|Gender=Masc,Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Dat\|Gender=Masc\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Dat\|Gender=Masc\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Dat\|Gender=Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Dat\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Fem\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Fem\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Masc,Neut\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Masc,Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Masc\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Masc\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Neut\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Gender=Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Gen\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Nom\|Gender=Fem\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Nom\|Gender=Masc\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Nom\|Gender=Masc\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Nom\|Gender=Neut\|Number=Plur\|Person=3\|Poss=Yes\|Reflex=Yes`, `Pt__Case=Nom\|Gender=Neut\|Number=Sing\|Person=3\|Poss=Yes\|Reflex=Yes`, `Px`, `Px__Case=Abl\|Gender=Fem,Masc\|Number=Plur`, `Px__Case=Abl\|Gender=Fem,Neut\|Number=Plur`, `Px__Case=Abl\|Gender=Fem\|Number=Plur`, `Px__Case=Abl\|Gender=Fem\|Number=Sing`, `Px__Case=Abl\|Gender=Masc,Neut\|Number=Plur`, `Px__Case=Abl\|Gender=Masc,Neut\|Number=Sing`, `Px__Case=Abl\|Gender=Masc\|Number=Plur`, `Px__Case=Abl\|Gender=Masc\|Number=Sing`, `Px__Case=Abl\|Gender=Neut\|Number=Plur`, `Px__Case=Abl\|Gender=Neut\|Number=Sing`, `Px__Case=Abl\|Number=Plur`, `Px__Case=Abl\|Number=Sing`, 
`Px__Case=Acc\|Gender=Fem,Masc\|Number=Plur`, `Px__Case=Acc\|Gender=Fem,Masc\|Number=Sing`, `Px__Case=Acc\|Gender=Fem\|Number=Plur`, `Px__Case=Acc\|Gender=Fem\|Number=Sing`, `Px__Case=Acc\|Gender=Masc,Neut\|Number=Sing`, `Px__Case=Acc\|Gender=Masc\|Number=Plur`, `Px__Case=Acc\|Gender=Masc\|Number=Sing`, `Px__Case=Acc\|Gender=Neut\|Number=Plur`, `Px__Case=Acc\|Gender=Neut\|Number=Sing`, `Px__Case=Dat\|Gender=Fem,Masc\|Number=Plur`, `Px__Case=Dat\|Gender=Fem,Masc\|Number=Sing`, `Px__Case=Dat\|Gender=Fem\|Number=Plur`, `Px__Case=Dat\|Gender=Fem\|Number=Sing`, `Px__Case=Dat\|Gender=Masc\|Number=Plur`, `Px__Case=Dat\|Gender=Masc\|Number=Sing`, `Px__Case=Dat\|Gender=Neut\|Number=Plur`, `Px__Case=Dat\|Gender=Neut\|Number=Sing`, `Px__Case=Dat\|Number=Plur`, `Px__Case=Dat\|Number=Sing`, `Px__Case=Gen\|Gender=Fem,Masc\|Number=Sing`, `Px__Case=Gen\|Gender=Fem\|Number=Plur`, `Px__Case=Gen\|Gender=Fem\|Number=Sing`, `Px__Case=Gen\|Gender=Masc\|Number=Plur`, `Px__Case=Gen\|Gender=Masc\|Number=Sing`, `Px__Case=Gen\|Gender=Neut\|Number=Plur`, `Px__Case=Gen\|Gender=Neut\|Number=Sing`, `Px__Case=Gen\|Number=Plur`, `Px__Case=Gen\|Number=Sing`, `Px__Case=Nom\|Gender=Fem,Masc\|Number=Plur`, `Px__Case=Nom\|Gender=Fem,Masc\|Number=Sing`, `Px__Case=Nom\|Gender=Fem\|Number=Plur`, `Px__Case=Nom\|Gender=Fem\|Number=Sing`, `Px__Case=Nom\|Gender=Masc,Neut\|Number=Sing`, `Px__Case=Nom\|Gender=Masc\|Number=Plur`, `Px__Case=Nom\|Gender=Masc\|Number=Sing`, `Px__Case=Nom\|Gender=Neut\|Number=Plur`, `Px__Case=Nom\|Gender=Neut\|Number=Sing`, `Px__Case=Nom\|Number=Sing`, `Px__Case=Voc\|Gender=Fem\|Number=Plur`, `Px__Case=Voc\|Gender=Masc\|Number=Plur`, `Px__Case=Voc\|Gender=Masc\|Number=Sing`, `R-`, `V-`, `V-__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, 
`V-__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Fem,Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Abl\|Gender=Fem,Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Masc,Neut\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Masc,Neut\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Abl\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Acc\|Gender=Masc,Neut\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Dat\|Gender=Neut\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Dat\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Gen\|Gender=Fem\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Gen\|Gender=Fem\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Gen\|Gender=Masc,Neut\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, 
`V-__Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Nom\|Gender=Fem,Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Act`, `V-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Sing\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Case=Voc\|Gender=Masc\|Number=Plur\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `V-__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `V-__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `V-__Aspect=Perf\|Tense=Past\|VerbForm=Inf\|Voice=Act`, `V-__Aspect=Perf\|Tense=Past\|VerbForm=Inf\|Voice=Pass`, `V-__Case=Abl\|Gender=Fem\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Fem\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Abl\|Gender=Fem\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Fem\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Fem\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Abl\|Gender=Masc,Neut\|Number=Sing\|VerbForm=Gdv`, 
`V-__Case=Abl\|Gender=Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Masc\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Abl\|Gender=Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Masc\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Abl\|Gender=Neut\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Neut\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Abl\|Gender=Neut\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Neut\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Gender=Neut\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Abl\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Abl\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Abl\|VerbForm=Ger`, `V-__Case=Abl\|VerbForm=Sup`, `V-__Case=Acc\|Gender=Fem,Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Fem,Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Fem\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Fem\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Pass`, `V-__Case=Acc\|Gender=Fem\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Acc\|Gender=Fem\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Fem\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Acc\|Gender=Masc,Neut\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Acc\|Gender=Masc\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Acc\|Gender=Masc\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Masc\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Pass`, `V-__Case=Acc\|Gender=Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Acc\|Gender=Neut\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Neut\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Acc\|Gender=Neut\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Neut\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Acc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Acc\|VerbForm=Ger`, `V-__Case=Acc\|VerbForm=Sup`, `V-__Case=Dat\|Gender=Fem\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Gender=Fem\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Gender=Fem\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Gender=Fem\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Dat\|Gender=Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Gender=Masc\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Dat\|Gender=Masc\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Gender=Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Gender=Neut\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Gender=Neut\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Dat\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, 
`V-__Case=Dat\|VerbForm=Ger`, `V-__Case=Gen\|Gender=Fem,Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Fem\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Fem\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Fem\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Gen\|Gender=Fem\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Fem\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Gen\|Gender=Masc,Neut\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Masc,Neut\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Gen\|Gender=Masc,Neut\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Gen\|Gender=Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Masc\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Gen\|Gender=Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Masc\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Gen\|Gender=Neut\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Neut\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Gen\|Gender=Neut\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Gen\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Gen\|VerbForm=Ger`, `V-__Case=Nom\|Gender=Fem,Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Fem,Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Fem\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Fem\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Nom\|Gender=Fem\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Fem\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Nom\|Gender=Masc\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Nom\|Gender=Masc\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Nom\|Gender=Neut\|Number=Plur\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Neut\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Gdv`, `V-__Case=Nom\|Gender=Neut\|Number=Sing\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Neut\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Gdv`, `V-__Case=Nom\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Nom\|VerbForm=Ger`, `V-__Case=Voc\|Gender=Fem\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Voc\|Gender=Masc\|Number=Plur\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Case=Voc\|Gender=Masc\|Number=Sing\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `V-__Mood=Imp\|Number=Plur\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Imp\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Imp\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, 
`V-__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Imp\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Plur\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, 
`V-__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `V-__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `V-__Tense=Pres\|VerbForm=Inf\|Voice=Act`, `V-__Tense=Pres\|VerbForm=Inf\|Voice=Pass`, `a-p----b-__Case=Abl\|Gender=Fem\|Number=Plur`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Ord`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Con`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Rel`, `a-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Tot`, `a-p---fac__Case=Acc\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `a-p---fap__Case=Acc\|Gender=Fem\|Number=Plur`, `a-p---fas__Case=Acc\|Degree=Abs\|Gender=Fem\|Number=Plur`, `a-p---fas__Case=Acc\|Degree=Abs\|Gender=Fem\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur`, `a-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Dem`, `a-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Ind`, `a-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Tot`, `a-p---fd-__Case=Dat\|Gender=Fem\|Number=Plur`, `a-p---fd-__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Con`, `a-p---fd-__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Tot`, `a-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur`, `a-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|NumType=Ord`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Con`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `a-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Tot`, `a-p---fnc__Case=Nom\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `a-p---fns__Case=Nom\|Degree=Abs\|Gender=Fem\|Number=Plur`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, 
`a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Con`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `a-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Tot`, `a-p---mac__Case=Acc\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `a-p---mac__Case=Acc\|Degree=Cmp\|Gender=Masc\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---map__Case=Acc\|Gender=Masc\|Number=Plur`, `a-p---mas__Case=Acc\|Degree=Abs\|Gender=Masc\|Number=Plur`, `a-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur`, `a-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|PronType=Dem`, `a-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|PronType=Ind`, `a-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|PronType=Tot`, `a-p---mbc__Case=Abl\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `a-p---mbc__Case=Abl\|Degree=Cmp\|Gender=Masc\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---mbs__Case=Abl\|Degree=Abs\|Gender=Masc\|Number=Plur`, `a-p---mbs__Case=Abl\|Degree=Abs\|Gender=Masc\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---md-__Case=Dat\|Gender=Masc\|Number=Plur`, `a-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Con`, `a-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Tot`, `a-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur`, `a-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Con`, `a-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Tot`, `a-p---mgc__Case=Gen\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `a-p---mgp__Case=Gen\|Gender=Masc\|Number=Plur`, `a-p---mn-__Case=Nom\|Gender=Fem\|Number=Plur`, `a-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur`, `a-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Con`, `a-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `a-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Tot`, `a-p---mns__Case=Nom\|Degree=Abs\|Gender=Masc\|Number=Plur`, `a-p---na-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Tot`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Con`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, 
`a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Rel`, `a-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Tot`, `a-p---nac__Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `a-p---nac__Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---nap__Case=Acc\|Gender=Neut\|Number=Plur`, `a-p---nas__Case=Acc\|Degree=Abs\|Gender=Neut\|Number=Plur`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|NumType=Ord`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Con`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Dem`, `a-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Tot`, `a-p---nbs__Case=Abl\|Degree=Abs\|Gender=Neut\|Number=Plur`, `a-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur`, `a-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---ng-__Case=Gen\|Gender=Neut\|Number=Plur`, `a-p---ng-__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Tot`, `a-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur`, `a-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|NumForm=Word\|NumType=Card`, `a-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Con`, `a-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `a-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Tot`, `a-p---nnc__Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `a-p---nnc__Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Plur\|NumType=Card\|PronType=Ind`, `a-s----a-__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s----g-__Case=Gen\|Gender=Fem\|Number=Sing`, `a-s----n-__Case=Nom\|Gender=Neut\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|NumType=Ord`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Con`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `a-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Tot`, `a-s---fap__Case=Acc\|Gender=Fem\|Number=Sing`, `a-s---fap__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Con`, `a-s---fas__Case=Acc\|Degree=Abs\|Gender=Fem\|Number=Sing`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|NumType=Ord`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Con`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Dem`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Ind`, 
`a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Rel`, `a-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Tot`, `a-s---fbc__Case=Abl\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `a-s---fbp__Case=Abl\|Gender=Fem\|Number=Sing`, `a-s---fbs__Case=Abl\|Degree=Abs\|Gender=Fem\|Number=Sing`, `a-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing`, `a-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `a-s---fds__Case=Dat\|Degree=Abs\|Gender=Fem\|Number=Sing`, `a-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing`, `a-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Con`, `a-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Tot`, `a-s---fgc__Case=Gen\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `a-s---fgp__Case=Gen\|Gender=Fem\|Number=Sing`, `a-s---fgs__Case=Gen\|Degree=Abs\|Gender=Fem\|Number=Sing`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Con`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Rel`, `a-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Tot`, `a-s---fn-__Case=Nom\|Gender=Neut\|Number=Sing`, `a-s---fnc__Case=Nom\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `a-s---fnp__Case=Nom\|Gender=Fem\|Number=Sing`, `a-s---fns__Case=Nom\|Degree=Abs\|Gender=Fem\|Number=Sing`, `a-s---fns__Case=Nom\|Degree=Abs\|Gender=Fem\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s---fv-__Case=Voc\|Gender=Fem\|Number=Sing`, `a-s---fv-__Case=Voc\|Gender=Fem\|Number=Sing\|PronType=Rel`, `a-s---fvs__Case=Voc\|Degree=Abs\|Gender=Fem\|Number=Sing`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Ord`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Con`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `a-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Tot`, 
`a-s---mac__Case=Acc\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `a-s---map__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Ord`, `a-s---mas__Case=Acc\|Degree=Abs\|Gender=Masc\|Number=Sing`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|NumType=Ord`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Con`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Ind`, `a-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Tot`, `a-s---mbc__Case=Abl\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `a-s---mbs__Case=Abl\|Degree=Abs\|Gender=Masc\|Number=Sing`, `a-s---md-__Case=Dat\|Gender=Masc\|Number=Sing`, `a-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Con`, `a-s---mdc__Case=Dat\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `a-s---mds__Case=Dat\|Degree=Abs\|Gender=Masc\|Number=Sing`, `a-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing`, `a-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Con`, `a-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Tot`, `a-s---mgc__Case=Gen\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `a-s---mgp__Case=Gen\|Gender=Masc\|Number=Sing`, `a-s---mgs__Case=Gen\|Degree=Abs\|Gender=Masc\|Number=Sing`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Ord`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Con`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Rel`, `a-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Tot`, `a-s---mnc__Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `a-s---mnp__Case=Nom\|Gender=Masc\|Number=Sing`, `a-s---mns__Case=Nom\|Degree=Abs\|Gender=Masc\|Number=Sing`, `a-s---mv-__Case=Voc\|Gender=Masc\|Number=Sing`, `a-s---mvs__Case=Voc\|Degree=Abs\|Gender=Masc\|Number=Sing`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Ord`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, 
`a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Con`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Rel`, `a-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Tot`, `a-s---nac__Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `a-s---nac__Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s---nap__Case=Acc\|Gender=Neut\|Number=Sing`, `a-s---nb-__Case=Abl\|Gender=Fem\|Number=Sing`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|NumType=Ord`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Con`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Dem`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Ind`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Rel`, `a-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Tot`, `a-s---nbc__Case=Abl\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `a-s---nbc__Case=Abl\|Degree=Cmp\|Gender=Neut\|Number=Sing\|NumType=Ord`, `a-s---nbs__Case=Abl\|Degree=Abs\|Gender=Neut\|Number=Sing`, `a-s---nd-__Case=Dat\|Gender=Neut\|Number=Sing`, `a-s---nd-__Case=Dat\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `a-s---ndp__Case=Dat\|Gender=Neut\|Number=Sing`, `a-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing`, `a-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|NumType=Ord`, `a-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Con`, `a-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Dem`, `a-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `a-s---ngc__Case=Gen\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `a-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing`, `a-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|NumType=Ord`, `a-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `a-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Con`, `a-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Rel`, `a-s---nnc__Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `a-s---nnc__Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Sing\|NumType=Card\|PronType=Ind`, `a-s---nns__Case=Nom\|Degree=Abs\|Gender=Neut\|Number=Sing`, `c--------`, `c--------__PronType=Rel`, `d--------`, `d--------__AdvType=Loc`, `d--------__AdvType=Tim`, `d--------__Compound=Yes\|PronType=Rcp`, `d--------__Degree=Cmp`, `d--------__Polarity=Neg`, `d--------__PronType=Dem`, `d--------__PronType=Rel`, `d-------c__Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Sing`, 
`d-------c__Degree=Cmp`, `d-------s__Degree=Abs`, `d-s---fn-`, `e--------`, `m--------`, `m--------__NumForm=Roman\|NumType=Card`, `m--------__NumForm=Word\|NumType=Card`, `m-----fa-__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Dist`, `m-----fa-__NumForm=Roman\|NumType=Card`, `m-----fn-__Case=Nom\|Gender=Fem\|NumForm=Word\|NumType=Card`, `m-----mg-__NumForm=Roman\|NumType=Card`, `m-----mn-__Case=Nom\|Gender=Masc\|NumForm=Word\|NumType=Card`, `m-----na-__Case=Acc\|Gender=Neut\|NumForm=Word\|NumType=Card`, `m-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Dist`, `m-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|NumType=Ord`, `m-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Tot`, `m-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Dist`, `m-p---ma-__NumForm=Roman\|NumType=Card`, `m-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|NumType=Dist`, `m-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Tot`, `m-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Dist`, `m-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Dist`, `m-p---na-__Case=Acc\|Gender=Neut\|Number=Plur`, `m-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Dist`, `m-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Ord`, `m-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|NumForm=Word\|NumType=Card`, `m-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|NumType=Ord`, `m-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|NumForm=Word\|NumType=Card`, `m-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|NumType=Ord`, `m-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|NumForm=Word\|NumType=Card`, `m-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|NumType=Ord`, `m-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|NumForm=Word\|NumType=Card`, `m-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Ord`, `m-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|NumForm=Word\|NumType=Card`, `m-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|NumType=Ord`, `m-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|NumForm=Word\|NumType=Card`, `m-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Ord`, `m-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|NumForm=Word\|NumType=Card`, `m-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|NumForm=Word\|NumType=Card`, `m-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|NumType=Ord`, `m-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|NumType=Ord`, `n--------__PronType=Dem`, `n-p----b-__Case=Abl\|Gender=Masc\|Number=Plur`, `n-p---f--__Case=Nom\|Gender=Fem\|Number=Plur`, `n-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur`, `n-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur`, `n-p---fd-__Case=Dat\|Gender=Fem\|Number=Plur`, `n-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur`, `n-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur`, `n-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Rel`, `n-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur`, `n-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Con`, `n-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Rel`, `n-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur`, 
`n-p---md-__Case=Dat\|Gender=Masc\|Number=Plur`, `n-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Rel`, `n-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur`, `n-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Con`, `n-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur`, `n-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Con`, `n-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Rel`, `n-p---mv-__Case=Voc\|Gender=Masc\|Number=Plur`, `n-p---na-__Case=Abl\|Gender=Fem\|Number=Sing`, `n-p---na-__Case=Acc\|Gender=Neut\|Number=Plur`, `n-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Rel`, `n-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Tot`, `n-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur`, `n-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|NumType=Card\|PronType=Ind`, `n-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Con`, `n-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur`, `n-p---ng-__Case=Gen\|Gender=Neut\|Number=Plur`, `n-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur`, `n-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Tot`, `n-p---nv-__Case=Voc\|Gender=Neut\|Number=Plur`, `n-s----a-__Case=Acc\|Gender=Masc\|Number=Sing`, `n-s----b-__Case=Acc\|Gender=Neut\|Number=Sing`, `n-s----n-__Case=Nom\|Gender=Masc\|Number=Sing`, `n-s---f--__Gender=Fem\|Number=Sing`, `n-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing`, `n-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Rel`, `n-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing`, `n-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Rel`, `n-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing`, `n-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing`, `n-s---fl-__Case=Loc\|Gender=Fem\|Number=Sing`, `n-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing`, `n-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Rel`, `n-s---fn-__Degree=Cmp`, `n-s---fv-__Case=Voc\|Gender=Fem\|Number=Sing`, `n-s---m--__Case=Nom\|Gender=Masc\|Number=Sing`, `n-s---ma-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `n-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing`, `n-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Rel`, `n-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing`, `n-s---md-__Case=Dat\|Gender=Masc\|Number=Sing`, `n-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing`, `n-s---ml-__Case=Loc\|Gender=Masc\|Number=Sing`, `n-s---mn-__Case=Nom\|Gender=Fem\|Number=Sing`, `n-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing`, `n-s---mv-__Case=Voc\|Gender=Masc\|Number=Sing`, `n-s---n--__Gender=Neut\|Number=Sing`, `n-s---n--__Gender=Neut\|Number=Sing\|Polarity=Neg\|PronType=Ind`, `n-s---na-__Case=Acc\|Gender=Neut\|Number=Sing`, `n-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Polarity=Neg\|PronType=Ind`, `n-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Con`, `n-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Rel`, `n-s---nb-__Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `n-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing`, `n-s---nb-__Case=Acc\|Gender=Neut\|Number=Sing`, `n-s---nd-__Case=Dat\|Gender=Neut\|Number=Sing`, `n-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing`, `n-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing`, `n-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Rel`, `p--------__NumType=Card\|PronType=Ind`, `p-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs`, 
`p-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `p-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `p-p---fa-__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Rel`, `p-p---fa-__Case=Acc\|Number=Plur\|Person=2\|PronType=Prs`, `p-p---fa-__Case=Acc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Dem`, `p-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Ind`, `p-p---fb-__Case=Abl\|Gender=Fem\|Number=Plur\|PronType=Rel`, `p-p---fd-__Case=Dat\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---fd-__Case=Dat\|Gender=Fem\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---fd-__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Dem`, `p-p---fd-__Case=Dat\|Number=Plur\|Person=1\|PronType=Prs`, `p-p---fd-__Case=Dat\|Number=Plur\|Person=2\|PronType=Prs`, `p-p---fd-__Case=Dat\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Dem`, `p-p---fg-__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Rel`, `p-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `p-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `p-p---fn-__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Rel`, `p-p---fn-__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Con`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `p-p---ma-__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Rel`, `p-p---ma-__Case=Acc\|Number=Plur\|Person=1\|PronType=Prs`, `p-p---ma-__Case=Acc\|Number=Plur\|Person=2\|PronType=Prs`, `p-p---ma-__Case=Acc\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, 
`p-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|PronType=Dem`, `p-p---mb-__Case=Abl\|Gender=Masc\|Number=Plur\|PronType=Ind`, `p-p---mb-__Case=Abl\|Number=Plur\|Person=1\|PronType=Prs`, `p-p---mb-__Case=Abl\|Number=Plur\|Person=2\|PronType=Prs`, `p-p---mb-__Case=Abl\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-p---mb-__Case=Abl\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Dem`, `p-p---md-__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Rel`, `p-p---md-__Case=Dat\|Number=Plur\|Person=1\|PronType=Prs`, `p-p---md-__Case=Dat\|Number=Plur\|Person=2\|PronType=Prs`, `p-p---md-__Case=Dat\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `p-p---mg-__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Rel`, `p-p---mg-__Case=Gen\|Number=Plur\|Person=1\|PronType=Prs`, `p-p---mg-__Case=Gen\|Number=Plur\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Con`, `p-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `p-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `p-p---mn-__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Rel`, `p-p---mn-__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `p-p---mn-__Case=Nom\|Number=Plur\|Person=2\|PronType=Prs`, `p-p---mv-__Case=Voc\|Number=Plur\|Person=2\|PronType=Prs`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `p-p---na-__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Rel`, `p-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Dem`, `p-p---nb-__Case=Abl\|Gender=Neut\|Number=Plur\|PronType=Rel`, 
`p-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Dem`, `p-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `p-p---nd-__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Rel`, `p-p---ng-__Case=Gen\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|Person=3\|PronType=Prs`, `p-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `p-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `p-p---nn-__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Rel`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `p-s---fa-__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Rel`, `p-s---fa-__Case=Acc\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---fa-__Case=Acc\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---fa-__Case=Acc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-s---fa-__PronType=Rel`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Con`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Dem`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Ind`, `p-s---fb-__Case=Abl\|Gender=Fem\|Number=Sing\|PronType=Rel`, `p-s---fb-__Case=Abl\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---fb-__Degree=Cmp`, `p-s---fb-__PronType=Ind`, `p-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Dem`, `p-s---fd-__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Rel`, `p-s---fd-__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---fd-__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---fd-__Case=Dat\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, 
`p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Dem`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Int`, `p-s---fg-__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Rel`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `p-s---fn-__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Rel`, `p-s---fn-__Case=Nom\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---fn-__Case=Nom\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---fv-__Case=Voc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `p-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `p-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Int`, `p-s---ma-__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Rel`, `p-s---ma-__Case=Acc\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---ma-__Case=Acc\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---ma-__Case=Acc\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Dem`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Ind`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Int`, `p-s---mb-__Case=Abl\|Gender=Masc\|Number=Sing\|PronType=Rel`, `p-s---mb-__Case=Abl\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---mb-__Case=Abl\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---mb-__Case=Abl\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-s---mb-__Case=Abl\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, 
`p-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Dem`, `p-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `p-s---md-__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Rel`, `p-s---md-__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---md-__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---md-__Case=Dat\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Dem`, `p-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `p-s---mg-__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Rel`, `p-s---mg-__Case=Gen\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---mg-__Case=Gen\|Number=Sing\|Person=3\|PronType=Prs\|Reflex=Yes`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Con`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `p-s---mn-__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Rel`, `p-s---mn-__Case=Nom\|Number=Sing\|Person=1\|PronType=Prs`, `p-s---mn-__Case=Nom\|Number=Sing\|Person=2\|PronType=Prs`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|Polarity=Neg\|PronType=Ind`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Con`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `p-s---na-__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Rel`, `p-s---na-__PronType=Rel`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Dem`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Ind`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Int`, `p-s---nb-__Case=Abl\|Gender=Neut\|Number=Sing\|PronType=Rel`, `p-s---nd-__Case=Dat\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---nd-__Case=Dat\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---nd-__Case=Dat\|Gender=Neut\|Number=Sing\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, 
`p-s---nd-__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Dem`, `p-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `p-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Dem`, `p-s---ng-__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Rel`, `p-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `p-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|Person=3\|PronType=Prs`, `p-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `p-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `p-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `p-s---nn-__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Rel`, `r--------`, `u--------`, `v---d--b-__Aspect=Prosp\|Case=Abl\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v---d--g-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v---g--g-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v---na---__Aspect=Imp\|Tense=Pres\|VerbForm=Inf\|Voice=Act`, `v---s----__Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Sup\|Voice=Act`, `v--fna---__Aspect=Imp\|Tense=Fut\|VerbForm=Inf`, `v--pna---__Aspect=Imp\|Tense=Pres\|VerbForm=Inf`, `v--pna---__Aspect=Imp\|Tense=Pres\|VerbForm=Inf\|Voice=Act`, `v--pnd---__Aspect=Imp\|Tense=Pres\|VerbForm=Inf\|Voice=Act`, `v--pnp---__Aspect=Imp\|Tense=Pres\|VerbForm=Inf\|Voice=Pass`, `v--ppamn-__Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v--rna---__Aspect=Perf\|Tense=Past\|VerbForm=Inf`, `v--rna---__Aspect=Perf\|Tense=Past\|VerbForm=Inf\|Voice=Act`, `v-p-g-fa-__Aspect=Prosp\|Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-p-g-ma-__Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-p-g-mb-__Aspect=Prosp\|Case=Abl\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-p-g-mn-__Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-p-g-na-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-p-g-nd-__Aspect=Prosp\|Case=Dat\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-p-g-nn-__Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-pfpama-__Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pfpamn-__Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pfpana-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pfpann-__Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part`, `v-ppgpfa-__Aspect=Prosp\|Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-ppgpfg-__Aspect=Prosp\|Case=Gen\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-ppgpna-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-ppgpnn-__Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-ppp-fa-__Aspect=Imp\|Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-fb-__Aspect=Imp\|Case=Abl\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-fn-__Aspect=Imp\|Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-ma-__Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, 
`v-ppp-mb-__Aspect=Imp\|Case=Abl\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-md-__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-mg-__Aspect=Imp\|Case=Gen\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-mn-__Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-na-__Aspect=Imp\|Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-ppp-nb-__Aspect=Imp\|Case=Abl\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppa-d-__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppafa-__Aspect=Imp\|Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppafb-__Aspect=Imp\|Case=Abl\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppafg-__Aspect=Imp\|Case=Gen\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppafn-__Aspect=Imp\|Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppama-__Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppamd-__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppamg-__Aspect=Imp\|Case=Gen\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppamn-__Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppana-__Aspect=Imp\|Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppanb-__Aspect=Imp\|Case=Abl\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppann-__Aspect=Imp\|Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppdma-__Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppdmb-__Aspect=Imp\|Case=Abl\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-pppdmd-__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-prpafb-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-prpafb-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prpdfn-__Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-prpdma-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-prpdmd-__Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-prpdmn-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `v-prppf--__Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppfa-__Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppfb-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppfd-__Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppfds__Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppfn-__Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppma-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppma-__Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part`, `v-prppmb-__Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppmd-__Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppmg-__Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppmn-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppmn-__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part`, `v-prppmv-__Aspect=Perf\|Case=Voc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Pass`, 
`v-prppna-__Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppnb-__Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppnd-__Aspect=Perf\|Case=Dat\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppng-__Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppnn-__Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Pass`, `v-prppnn-__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part`, `v-s-d--a-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-d--g-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-fa-__Aspect=Prosp\|Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-fb-__Aspect=Prosp\|Case=Abl\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-fg-__Aspect=Prosp\|Case=Gen\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-fn-__Aspect=Prosp\|Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-ma-__Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-na-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-ng-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-g-nn-__Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-s-gamb-__Aspect=Prosp\|Case=Abl\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-sfpafa-__Aspect=Prosp\|Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part`, `v-sfpafn-__Aspect=Prosp\|Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sfpafv-__Aspect=Prosp\|Case=Voc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sfpama-__Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sfpamn-__Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sfpana-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sfpang-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sp-ang-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spdana-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spdang-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpfg-__Aspect=Prosp\|Case=Gen\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpma-__Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpmg-__Aspect=Prosp\|Case=Gen\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpmn-__Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpna-__Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpnb-__Aspect=Prosp\|Case=Abl\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpng-__Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spgpnn-__Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-spp-fa-__Aspect=Imp\|Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-fb-__Aspect=Imp\|Case=Abl\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-fd-__Aspect=Imp\|Case=Dat\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-fn-__Aspect=Imp\|Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-fn-__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part`, 
`v-spp-ma-__Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-mb-__Aspect=Imp\|Case=Abl\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-md-__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-mg-__Aspect=Imp\|Case=Gen\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-mn-__Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-spp-nb-__Aspect=Imp\|Case=Abl\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppafa-__Aspect=Imp\|Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppafb-__Aspect=Imp\|Case=Abl\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppafd-__Aspect=Imp\|Case=Dat\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppafg-__Aspect=Imp\|Case=Gen\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppafn-__Aspect=Imp\|Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppafn-__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part`, `v-sppama-__Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppamb-__Aspect=Imp\|Case=Abl\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppamd-__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppamdp__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppamg-__Aspect=Imp\|Case=Gen\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppamn-__Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppamnc__Aspect=Imp\|Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppana-__Aspect=Imp\|Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppanb-__Aspect=Imp\|Case=Abl\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppang-__Aspect=Imp\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppdma-__Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppdmb-__Aspect=Imp\|Case=Abl\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppdmd-__Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-sppdmn-__Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdfa-__Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdfd-__Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdfn-__Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdfn-__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part`, `v-srpdma-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdmb-__Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdmn-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdmnp__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srpdnn-__Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `v-srppfa-__Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppfb-__Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppfd-__Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppfg-__Aspect=Perf\|Case=Gen\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppfn-__Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppma-__Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, 
`v-srppmb-__Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppmd-__Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppmg-__Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppmn-__Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppmn-__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part`, `v-srppmnc__Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Sing\|VerbForm=Part`, `v-srppna-__Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppnb-__Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppnd-__Aspect=Perf\|Case=Dat\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppng-__Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppnn-__Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Pass`, `v-srppnn-__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part`, `v1pfia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Fut\|VerbForm=Fin`, `v1pfia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v1piia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1pisa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin`, `v1pisa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1plia---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `v1ppia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin`, `v1ppia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1ppid---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1ppip---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v1ppsa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1ppsd---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1pria---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1prsa---__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1sfia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin`, `v1sfia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v1sfid---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v1sfip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `v1siia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin`, `v1siia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1siip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v1sisa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin`, `v1sisa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1sisd---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1sisp---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v1slia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Pqp\|VerbForm=Fin`, `v1slia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, 
`v1slsa---__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `v1spia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin`, `v1spia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1spid---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1spip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v1spsa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin`, `v1spsa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1spsd---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v1spsp---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v1sr-a---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1sria---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin`, `v1sria---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1srsa---__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v1stia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v2pfia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v2ppia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2ppip---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v2ppma---__Aspect=Imp\|Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2ppmp---__Aspect=Imp\|Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v2ppsa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2ppsp---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v2pria---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v2sfia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin`, `v2sfia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v2sfid---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v2sfip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `v2sfma---__Aspect=Imp\|Mood=Imp\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin`, `v2sfma---__Aspect=Imp\|Mood=Imp\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v2siia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin`, `v2siia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v2sisa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v2sisp---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v2slsa---__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `v2spia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin`, `v2spia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2spid---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2spip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v2spma---__Aspect=Imp\|Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, 
`v2spmd---__Aspect=Imp\|Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2spsa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin`, `v2spsa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2spsd---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v2spsp---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v2sria---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v2srsa---__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v2stia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v3pfia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin`, `v3pfia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v3pfid---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v3pfip---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `v3pfma---__Aspect=Imp\|Mood=Imp\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin`, `v3piia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3piia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3piid---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3piip---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v3pisa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3pisa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3pisd---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3pisp---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v3plia---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Pqp\|VerbForm=Fin`, `v3plia---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `v3plsa---__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `v3ppia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin`, `v3ppia---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3ppid---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3ppip---__Aspect=Imp\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v3ppsa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin`, `v3ppsa---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3ppsd---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3ppsp---__Aspect=Imp\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v3pria---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3pria---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3prsa---__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3prsa---__Aspect=Perf\|Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3ptia---__Aspect=Perf\|Mood=Ind\|Number=Plur\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v3s-ia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, 
`v3s-sa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3sfia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin`, `v3sfia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v3sfid---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `v3sfip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `v3si-p---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v3siia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3siia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3siid---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3siip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v3sisa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3sisa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3sisd---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3sisp---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `v3slia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Pqp\|VerbForm=Fin`, `v3slia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `v3slsa---__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `v3spia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin`, `v3spia---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3spid---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3spip---__Aspect=Imp\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v3spsa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin`, `v3spsa---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3spsd---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `v3spsp---__Aspect=Imp\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `v3sria---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3sria---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3srsa---__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin`, `v3srsa---__Aspect=Perf\|Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `v3stia---__Aspect=Perf\|Mood=Ind\|Number=Sing\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act` |
| **`morphologizer`** | `POS=ADV`, `POS=ADV\|Polarity=Neg`, `POS=ADP`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Mood=Imp\|Number=Sing\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=NOUN`, `POS=SCONJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=NOUN`, `POS=PUNCT`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Ind`, `Aspect=Imp\|POS=VERB\|Tense=Pres\|VerbForm=Inf\|Voice=Pass`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADJ`, `AdvType=Tim\|POS=ADV`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, `Case=Abl\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Aspect=Imp\|POS=VERB\|Tense=Pres\|VerbForm=Inf\|Voice=Act`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Rel`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `POS=CCONJ`, `Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=NOUN`, `POS=VERB\|Tense=Pres\|VerbForm=Inf\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Voc\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON`, `Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Acc\|Gender=Fem\|NumType=Dist\|Number=Plur\|POS=ADJ`, 
`Case=Dat\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Rel`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Tot`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Acc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Rel`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `POS=INTJ`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Degree=Pos\|POS=ADV`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PROPN`, 
`Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET`, `Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Rel`, `Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `POS=NOUN`, `Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `POS=PROPN`, `POS=ADV\|PronType=Rel`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Gen\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Abl\|Degree=Pos\|Number=Plur\|POS=ADJ`, `Degree=Sup\|POS=ADV`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Cmp\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|POS=VERB\|Tense=Past\|VerbForm=Inf\|Voice=Act`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, 
`Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Abl\|Degree=Pos\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET`, `POS=ADV\|PronType=Int`, `Case=Abl\|Number=Plur\|POS=DET`, `Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Acc\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|POS=AUX\|Tense=Past\|VerbForm=Inf\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `POS=AUX\|Tense=Pres\|VerbForm=Inf\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, 
`Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Aspect=Perf\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Degree=Cmp\|POS=ADV`, `Case=Acc\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Masc,Neut\|Number=Plur\|POS=NOUN`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=DET`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=NUM`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Mood=Imp\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `POS=VERB\|Tense=Pres\|VerbForm=Inf\|Voice=Pass`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Rel`, `Mood=Imp\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=ADJ`, `POS=PART`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET`, 
`Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|Polarity=Neg\|PronType=Ind`, `Case=Dat\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Nom\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Case=Dat\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|POS=VERB\|VerbForm=Ger`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=NUM`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Abl\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Number=Sing\|POS=DET`, `Case=Abl\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Voc\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Con`, `Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=NUM`, `Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=NUM`, `Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Gen\|POS=VERB\|VerbForm=Ger`, `Case=Dat\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Rel`, 
`Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Masc\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET`, `POS=NUM`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON`, `Aspect=Perf\|Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Imp\|Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Int`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=DET`, `Case=Gen\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Rel`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|POS=VERB\|VerbForm=Ger`, `Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Rel`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `AdpType=Post\|POS=ADP`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, 
`Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Neut\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=NOUN`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=DET`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Voc\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Mood=Imp\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Nom\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Abl\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `POS=ADJ`, `Mood=Imp\|Number=Sing\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Abl\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, 
`Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Neut\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Nom\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Abl\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `NumForm=Word\|NumType=Card\|POS=NUM`, `Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `AdvType=Loc\|POS=ADV`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=DET`, `Case=Abl\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Nom\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Imp\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Voc\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Nom\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Abl\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Dat\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Dat\|Number=Sing\|POS=DET`, `Case=Abl\|Gender=Fem,Neut\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=NUM`, 
`Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Aspect=Imp\|Mood=Imp\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Aspect=Perf\|POS=AUX\|Tense=Past\|VerbForm=Inf`, `Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, `Aspect=Imp\|Case=Abl\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Neut\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Neut\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Tot`, `Mood=Imp\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Mood=Imp\|Number=Plur\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `POS=VERB`, `Case=Nom\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Degree=Cmp\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Abl\|Gender=Fem,Masc\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=NOUN`, 
`Case=Abl\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Voc\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Abl\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Nom\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Cmp\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Degree=Cmp\|Number=Sing\|POS=ADJ`, `Case=Dat\|Degree=Pos\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Acc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Fem,Masc\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=NUM`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Gen\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Voc\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Aspect=Perf\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PROPN`, 
`Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Degree=Abs\|POS=ADV`, `Case=Acc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Dat\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Voc\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Gen\|Gender=Fem,Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Dat\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Dat\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, `Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `POS=SCONJ\|PronType=Rel`, `Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Dat\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=DET`, `Case=Dat\|Number=Plur\|POS=DET`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, 
`Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=NUM`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Voc\|Gender=Masc\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Rcp`, `POS=X`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Neut\|NumForm=Word\|NumType=Card\|Number=Sing\|POS=NUM`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Tot`, `Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Number=Plur\|POS=DET`, `Case=Abl\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Acc\|POS=VERB\|VerbForm=Sup`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Aspect=Imp\|Case=Gen\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Aspect=Imp\|Case=Dat\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Rel`, `Case=Abl\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Voc\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, 
`Case=Gen\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Voc\|Degree=Abs\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Fem,Masc\|Number=Sing\|POS=PRON`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Masc\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Abl\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Con`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Fem\|NumForm=Word\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Tot`, `Aspect=Imp\|Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Tot`, `Aspect=Perf\|Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Nom\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Aspect=Imp\|POS=AUX\|Tense=Pres\|VerbForm=Inf`, `Case=Dat\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Aspect=Imp\|Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Abl\|Gender=Fem\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Acc\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Aspect=Imp\|Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET`, 
`Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Aspect=Perf\|Case=Abl\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Gen\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Dat\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Abl\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Degree=Cmp\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Perf\|Case=Gen\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=NUM`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Degree=Abs\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Loc\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Number=Plur\|POS=NUM`, `Case=Abl\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Dat\|POS=VERB\|VerbForm=Ger`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|NumType=Ord\|Number=Plur\|POS=ADJ`, `Case=Acc\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADJ\|VerbForm=Part`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, 
`Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `POS=ADV\|PronType=Ind`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Neut\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Nom\|Gender=Masc,Neut\|Number=Sing\|POS=PRON`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin`, `Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Degree=Abs\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin`, `Case=Gen\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Degree=Pos\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Abl\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Loc\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Mood=Imp\|Number=Plur\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Dat\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, 
`Case=Nom\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Voc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADJ\|VerbForm=Part`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Dat\|Degree=Cmp\|Number=Sing\|POS=ADJ`, `Case=Abl\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `POS=DET`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Rel`, `Case=Acc\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=PRON`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Tot`, `Case=Acc\|Gender=Masc\|NumForm=Word\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Gen\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Tot`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Nom\|Degree=Pos\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Aspect=Imp\|Case=Gen\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Abl\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Abl\|Degree=Cmp\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Fem\|NumType=Ord\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Imp\|Number=Plur\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Dat\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Fem\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, 
`Case=Nom\|Number=Sing\|POS=DET`, `Case=Abl\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET`, `Aspect=Imp\|Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Masc\|NumType=Dist\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=NUM`, `Aspect=Prosp\|Case=Abl\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Fem,Neut\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Degree=Abs\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Abl\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Con`, `NumForm=Roman\|NumType=Card\|POS=NUM`, `Case=Nom\|Degree=Abs\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|POS=VERB\|VerbForm=Sup`, `Case=Abl\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Degree=Abs\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Abl\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Abl\|Degree=Cmp\|Number=Plur\|POS=ADJ`, `Case=Gen\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Nom\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, 
`Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `POS=DET\|PronType=Dem`, `Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Compound=Yes\|POS=PRON\|PronType=Rcp`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Con`, `Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Voc\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Gen\|Number=Plur\|POS=NUM`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Abl\|Gender=Fem,Neut\|Number=Plur\|POS=DET`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=PRON`, `Case=Dat\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Case=Gen\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Degree=Pos\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Aspect=Perf\|Case=Abl\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Abl\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Degree=Pos\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=NOUN`, `Case=Gen\|Number=Sing\|POS=NUM`, `Case=Abl\|Number=Plur\|POS=NUM`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Con`, `Gender=Neut\|Number=Sing\|POS=PRON\|Polarity=Neg\|PronType=Ind`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Dat\|Number=Plur\|POS=PRON\|PronType=Rel`, `Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Abl\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, 
`Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Dat\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Voc\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Gen\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `NumType=Card\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Voc\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Nom\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Degree=Sup\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Pass`, `Case=Voc\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Voc\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Con`, `Case=Dat\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `POS=AUX\|Tense=Pres\|VerbForm=Inf\|Voice=Pass`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Degree=Abs\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, 
`Aspect=Prosp\|Case=Gen\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Acc\|Gender=Neut\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Degree=Abs\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin`, `Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Acc\|Degree=Abs\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Degree=Abs\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Nom\|Gender=Masc\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Abl\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Voc\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Abl\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Con`, `Case=Dat\|Gender=Masc,Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Nom\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Gen\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Acc\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=PRON`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Gen\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Degree=Cmp\|Gender=Neut\|NumType=Ord\|Number=Sing\|POS=ADJ`, 
`Case=Dat\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin`, `Aspect=Imp\|Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Con`, `Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Prosp\|Case=Acc\|Gender=Fem\|Number=Sing\|POS=AUX\|VerbForm=Part`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc,Neut\|Number=Plur\|POS=NUM`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Aspect=Prosp\|Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Fut\|VerbForm=Fin`, `Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Degree=Abs\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=PRON`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Aspect=Prosp\|Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Prosp\|Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Tot`, 
`Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=NOUN`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Con`, `Case=Abl\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Voc\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Rel`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Rel`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Rel`, `Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin`, `Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Tot`, `Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Voc\|Gender=Masc\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Voc\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Ind`, `Aspect=Perf\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Dat\|Number=Sing\|POS=PRON`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pqp\|VerbForm=Fin`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Aspect=Perf\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Dat\|Degree=Cmp\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=NUM`, 
`Case=Nom\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Dat\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Mood=Imp\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Con`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Masc\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Prosp\|Case=Abl\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Masc\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|POS=VERB\|Tense=Past\|VerbForm=Inf\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Nom\|Gender=Masc,Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Tot`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Tot`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, 
`Aspect=Imp\|Mood=Imp\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Dat\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Aspect=Imp\|Case=Gen\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pqp\|VerbForm=Fin`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Masc,Neut\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Aspect=Prosp\|Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Ind`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Con`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Ind`, `Aspect=Imp\|POS=AUX\|Tense=Fut\|VerbForm=Inf`, `Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADJ\|VerbForm=Part`, `Case=Abl\|Gender=Neut\|NumType=Ord\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Fem,Masc\|Number=Sing\|POS=NOUN`, `Case=Abl\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Masc\|NumForm=Word\|NumType=Card\|Number=Sing\|POS=NUM`, `Aspect=Prosp\|Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Gen\|Degree=Sup\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Degree=Cmp\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Con`, `Case=Abl\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, 
`Aspect=Prosp\|Case=Gen\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Nom\|Degree=Abs\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Prosp\|Case=Dat\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=NOUN`, `Aspect=Prosp\|Case=Voc\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Gen\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=NUM`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Abl\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Abl\|Gender=Fem,Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Rel`, `Case=Dat\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Degree=Cmp\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Aspect=Perf\|Case=Dat\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Aspect=Perf\|Case=Gen\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Gen\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Degree=Cmp\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Case=Gen\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Aspect=Imp\|Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `POS=PART\|Polarity=Neg`, `POS=PART\|PartType=Int`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Masc\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Pass`, `Case=Dat\|Degree=Abs\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Degree=Abs\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Abl\|Degree=Abs\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Case=Gen\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, 
`Case=Gen\|Gender=Masc\|NumType=Dist\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=PRON`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Aspect=Imp\|Case=Abl\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=NUM`, `Case=Dat\|Degree=Pos\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Nom\|Gender=Masc\|NumType=Dist\|Number=Plur\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Voc\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Fem\|NumType=Ord\|Number=Plur\|POS=ADJ`, `Case=Dat\|Number=Sing\|POS=PRON\|PronType=Rel`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Gen\|Gender=Fem,Masc\|Number=Plur\|POS=NOUN`, `Case=Abl\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Number=Plur\|POS=NOUN`, `Case=Abl\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Aspect=Imp\|Case=Abl\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Acc\|Degree=Abs\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Aspect=Imp\|Case=Abl\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Gender=Fem\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Neut\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Rel`, `Aspect=Imp\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, `Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Voc\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Aspect=Perf\|Case=Voc\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Voc\|Degree=Pos\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, 
`Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Nom\|Gender=Neut\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Tot`, `Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Voc\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Rel`, `Case=Abl\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Aspect=Imp\|Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Nom\|Degree=Cmp\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=NUM`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Tot`, `Aspect=Perf\|Case=Dat\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET`, `Aspect=Perf\|Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Abl\|Gender=Masc,Neut\|Number=Plur\|POS=DET\|Person=2\|Poss=Yes`, `Case=Acc\|Gender=Fem\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Abl\|Gender=Fem\|NumForm=Word\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Voc\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Degree=Cmp\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Masc\|NumForm=Word\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Con`, `Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Acc\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Gen\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Aspect=Perf\|Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Pass`, `Case=Abl\|Degree=Abs\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|PronType=Int`, 
`Aspect=Perf\|Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Masc\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Nom\|Degree=Cmp\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Acc\|Degree=Cmp\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Nom\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Aspect=Perf\|Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Con`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Tot`, `Aspect=Perf\|Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Con`, `Case=Abl\|Gender=Masc,Neut\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Nom\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET\|Person=1\|Poss=Yes`, `Case=Abl\|Number=Plur\|POS=PRON`, `Case=Gen\|Gender=Fem,Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Abl\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Rel`, `Case=Nom\|Gender=Fem\|NumForm=Word\|NumType=Card\|POS=NUM`, `Case=Gen\|Gender=Neut\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Abl\|Gender=Masc,Neut\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Aspect=Prosp\|Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Sup\|Voice=Act`, `Case=Acc\|Degree=Abs\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Dat\|Number=Plur\|POS=PRON`, `Aspect=Imp\|Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Aspect=Imp\|Mood=Imp\|Number=Sing\|POS=VERB\|Person=2\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Voc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Rel`, `Case=Dat\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, 
`Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Gen\|Gender=Neut\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Degree=Cmp\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Aspect=Imp\|Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Fem\|NumType=Ord\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Voc\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Gen\|Gender=Masc\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Aspect=Prosp\|Case=Gen\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Neut\|NumForm=Word\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Abl\|Gender=Fem\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Nom\|Degree=Abs\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Gen\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Degree=Sup\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ\|VerbForm=Part`, `Case=Gen\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Aspect=Imp\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|PronType=Rel`, `Aspect=Perf\|Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Pass`, `Case=Gen\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Neut\|NumType=Dist\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=VERB\|VerbForm=Gdv`, `Case=Acc\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=PROPN`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Nom\|Degree=Pos\|Number=Plur\|POS=ADJ`, `Case=Abl\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Case=Abl\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, 
`Case=Abl\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Int`, `Aspect=Perf\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pqp\|VerbForm=Fin`, `Case=Abl\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADJ\|VerbForm=Part`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADJ\|VerbForm=Part`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Voc\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Aspect=Perf\|Case=Dat\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Abl\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Abl\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=ADJ\|VerbForm=Part`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Gen\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin`, `Case=Abl\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Number=Sing\|POS=PROPN`, `Aspect=Prosp\|Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Number=Sing\|POS=PRON`, `Case=Voc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Gen\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Voc\|Gender=Fem\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Neut\|NumForm=Word\|NumType=Card\|POS=NUM`, `Case=Dat\|Gender=Neut\|NumForm=Word\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Gen\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Neut\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Acc\|Gender=Fem,Masc\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Number=Plur\|POS=NUM`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Gen\|Degree=Abs\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Mood=Imp\|Number=Sing\|POS=AUX\|Person=3\|Tense=Fut\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=PRON\|PronType=Rel`, `Case=Gen\|Degree=Abs\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `POS=DET\|PronType=Rel`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=VERB\|VerbForm=Gdv`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Aspect=Imp\|Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Fut\|VerbForm=Fin`, 
`Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=PRON`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Voc\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Acc\|Degree=Cmp\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Gen\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Abl\|Gender=Masc,Neut\|Number=Plur\|POS=DET`, `Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Voc\|Degree=Pos\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs\|Reflex=Yes`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Degree=Abs\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Nom\|Gender=Masc\|NumForm=Word\|NumType=Card\|POS=NUM`, `Case=Nom\|POS=VERB\|VerbForm=Ger`, `Case=Abl\|Gender=Fem\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Dat\|Degree=Pos\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Aspect=Prosp\|Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Number=Plur\|POS=PRON`, `Aspect=Perf\|Case=Dat\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Dat\|Degree=Abs\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Case=Nom\|Gender=Fem,Masc\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Voc\|Gender=Neut\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Voc\|Gender=Fem\|Number=Sing\|POS=DET\|Person=1\|Poss=Yes`, `Case=Gen\|Gender=Neut\|Number=Plur\|Number[psor]=Sing\|POS=DET\|Person[psor]=2\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Aspect=Perf\|Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, `Case=Voc\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Aspect=Imp\|Case=Gen\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc,Neut\|Number=Sing\|POS=DET\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Gen\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, 
`Aspect=Imp\|Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem,Masc\|Number=Plur\|POS=PROPN`, `Aspect=Perf\|Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, `Case=Gen\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=2\|Tense=Pqp\|VerbForm=Fin\|Voice=Act`, `Aspect=Perf\|Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Masc\|NumType=Dist\|Number=Plur\|POS=ADJ`, `Case=Dat\|Number=Sing\|POS=PRON\|Person=3\|PronType=Prs`, `Aspect=Perf\|Case=Voc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Pass`, `Case=Abl\|Gender=Masc\|Number=Plur\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Pass`, `Case=Gen\|Gender=Masc,Neut\|Number=Plur\|POS=PRON\|Person=3\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Ind`, `Aspect=Perf\|Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Dat\|Gender=Fem,Masc\|Number=Plur\|POS=DET`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Tot`, `Case=Voc\|Degree=Abs\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Voc\|Gender=Masc\|Number=Plur\|POS=VERB\|Tense=Pres\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=AUX\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Con`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Case=Acc\|Degree=Cmp\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET\|PronType=Ind`, `Aspect=Prosp\|Case=Gen\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Loc\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Abl\|Gender=Neut\|Number=Plur\|POS=PRON\|Person=1\|Poss=Yes`, `Case=Nom\|Gender=Fem,Masc\|Number=Plur\|POS=PRON\|PronType=Rel`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Fut\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Sing\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, `Case=Abl\|Gender=Masc\|Number=Sing\|Number[psor]=Plur\|POS=DET\|Person[psor]=1\|Poss=Yes\|PronType=Prs`, `Case=Abl\|Gender=Fem,Masc\|Number=Plur\|POS=ADJ`, `Aspect=Perf\|Case=Abl\|Gender=Fem\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|Person=2\|Poss=Yes`, `Case=Gen\|Degree=Cmp\|Gender=Masc,Neut\|Number=Plur\|POS=ADJ`, `Aspect=Imp\|Mood=Imp\|Number=Sing\|POS=AUX\|Person=2\|Tense=Fut\|VerbForm=Fin`, `Aspect=Perf\|Case=Dat\|Number=Plur\|POS=VERB\|Tense=Past\|VerbForm=Part\|Voice=Pass`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|Person[psor]=3\|Poss=Yes\|PronType=Prs`, `Case=Gen\|Gender=Masc,Neut\|Number=Sing\|POS=PRON\|Person=3\|Poss=Yes\|Reflex=Yes`, `Aspect=Perf\|Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Degree=Cmp\|Gender=Fem,Masc\|Number=Sing\|POS=ADJ`, `Aspect=Perf\|Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin`, 
`Aspect=Prosp\|Case=Nom\|Gender=Neut\|Number=Plur\|POS=AUX\|VerbForm=Part`, `Aspect=Prosp\|Case=Abl\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Pass` |
</details>
### Accuracy
| Type | Score |
| --- | --- |
| `DEP_UAS` | 76.55 |
| `DEP_LAS` | 70.03 |
| `SENTS_P` | 84.77 |
| `SENTS_R` | 91.89 |
| `SENTS_F` | 88.19 |
| `LEMMA_ACC` | 92.23 |
| `TAG_ACC` | 87.16 |
| `POS_ACC` | 94.94 |
| `MORPH_ACC` | 88.18 |
| `TRANSFORMER_LOSS` | 940948.79 |
| `PARSER_LOSS` | 160659.59 |
| `TRAINABLE_LEMMATIZER_LOSS` | 34462.11 |
| `TAGGER_LOSS` | 46847.23 |
| `MORPHOLOGIZER_LOSS` | 45717.51 | |
spellingdragon/whisper-tiny-en-ft | spellingdragon | 2023-12-23T12:04:10Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"en",
"dataset:PolyAI/minds14",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-12-17T15:45:52Z | ---
language:
- en
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_trainer
datasets:
- PolyAI/minds14
metrics:
- wer
model-index:
- name: Whisper tiny En finetuned
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: PolyAI/minds14
type: PolyAI/minds14
metrics:
- name: Wer
type: wer
value: 0.25975953688585424
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper tiny En finetuned
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the PolyAI/minds14 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2284
- Wer Ortho: 0.2631
- Wer: 0.2598
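A minimal usage sketch with the `transformers` ASR pipeline (the audio file name below is a placeholder for any local recording):

```python
# Sketch: transcribe an audio file with the fine-tuned Whisper checkpoint.
# "sample.wav" is a placeholder path, not a file shipped with this repo.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="spellingdragon/whisper-tiny-en-ft",
)

result = asr("sample.wav")
print(result["text"])
```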
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 50
- training_steps: 100
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|
| 1.8773 | 1.79 | 50 | 0.6960 | 0.3737 | 0.3375 |
| 0.2966 | 3.57 | 100 | 0.2284 | 0.2631 | 0.2598 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
johnptorr/distilbert-base-uncased-finetuned-emotion | johnptorr | 2023-12-23T11:51:02Z | 5 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T11:01:03Z | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1556
- Accuracy: 0.934
- F1: 0.9342
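A minimal usage sketch with the `transformers` text-classification pipeline (the example sentence is illustrative, and the label names depend on how the classifier head was configured):

```python
# Sketch: classify a sentence with the fine-tuned emotion classifier.
# The input sentence is illustrative; label names come from the model config.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="johnptorr/distilbert-base-uncased-finetuned-emotion",
)

print(classifier("I can't wait to see you again!"))
```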
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.1696 | 1.0 | 250 | 0.1732 | 0.932 | 0.9317 |
| 0.1095 | 2.0 | 500 | 0.1556 | 0.934 | 0.9342 |
### Framework versions
- Transformers 4.16.2
- Pytorch 2.1.0+cu121
- Datasets 1.16.1
- Tokenizers 0.15.0
|
Memori707/gpt2-finetuned-wikitext2 | Memori707 | 2023-12-23T11:47:02Z | 3 | 0 | transformers | [
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T11:26:47Z | ---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: Memori707/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Memori707/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4927
- Validation Loss: 6.3423
- Epoch: 1
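A minimal generation sketch using the TensorFlow weights in this repo (the prompt is illustrative):

```python
# Sketch: generate a continuation with the fine-tuned GPT-2 TensorFlow checkpoint.
# The prompt is an illustrative placeholder.
from transformers import AutoTokenizer, TFAutoModelForCausalLM

model_id = "Memori707/gpt2-finetuned-wikitext2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The history of the city begins", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```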
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3142 | 6.7619 | 0 |
| 6.4927 | 6.3423 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
dadashzadeh/roberta-sentiment-persian | dadashzadeh | 2023-12-23T11:46:28Z | 6 | 1 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"fa",
"dataset:hezarai/sentiment-dksf",
"base_model:HooshvareLab/roberta-fa-zwnj-base",
"base_model:finetune:HooshvareLab/roberta-fa-zwnj-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-22T21:54:28Z | ---
license: apache-2.0
base_model: HooshvareLab/roberta-fa-zwnj-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-base-Sentiment-persian
results: []
datasets:
- hezarai/sentiment-dksf
language:
- fa
pipeline_tag: text-classification
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-Sentiment-persian
This model is a fine-tuned version of [HooshvareLab/roberta-fa-zwnj-base](https://huggingface.co/HooshvareLab/roberta-fa-zwnj-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4350
- Accuracy: 0.8298
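A minimal usage sketch that scores a Persian sentence with the fine-tuned classifier (the example text is illustrative; the label mapping comes from the model config):

```python
# Sketch: score a Persian sentence with the fine-tuned sentiment classifier.
# The example text is illustrative; id2label comes from the model config.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "dadashzadeh/roberta-sentiment-persian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("این محصول واقعا عالی بود", return_tensors="pt")  # "This product was really great"
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)[0]
for idx, p in enumerate(probs.tolist()):
    print(model.config.id2label[idx], round(p, 3))
```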
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.5814 | 1.0 | 3576 | 0.4350 | 0.8298 |
| 0.433 | 2.0 | 7152 | 0.4646 | 0.8307 |
| 0.313 | 3.0 | 10728 | 0.5334 | 0.8458 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0 |
robinsmits/open_llama_7b_alpaca_clean_dutch_qlora | robinsmits | 2023-12-23T11:36:34Z | 58 | 1 | peft | [
"peft",
"llama",
"alpaca",
"Transformers",
"text-generation-inference",
"text-generation",
"nl",
"dataset:BramVanroy/alpaca-cleaned-dutch",
"arxiv:2312.12852",
"base_model:openlm-research/open_llama_7b",
"base_model:adapter:openlm-research/open_llama_7b",
"license:cc-by-nc-4.0",
"region:us"
] | text-generation | 2023-07-01T20:59:05Z | ---
language:
- nl
license: cc-by-nc-4.0
library_name: peft
tags:
- llama
- alpaca
- Transformers
- text-generation-inference
datasets:
- BramVanroy/alpaca-cleaned-dutch
inference: false
base_model: openlm-research/open_llama_7b
pipeline_tag: text-generation
---
# open_llama_7b_alpaca_clean_dutch_qlora
## Model description
This adapter model is a fine-tuned version of [openlm-research/open_llama_7b](https://huggingface.co/openlm-research/open_llama_7b).
Finetuning was performed on the Dutch [BramVanroy/alpaca-cleaned-dutch](https://www.huggingface.co/datasets/BramVanroy/alpaca-cleaned-dutch) dataset, which contains 52K records of instruction-following data translated from English to Dutch.
See [openlm-research/open_llama_7b](https://huggingface.co/openlm-research/open_llama_7b) for all information about the base model.
## Model usage
A basic example of how to use the finetuned model:
```python
import torch
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "robinsmits/open_llama_7b_alpaca_clean_dutch_qlora"
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast = False, add_eos_token = True)
config = PeftConfig.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path, load_in_8bit = True, device_map = "auto")
model = PeftModel.from_pretrained(model, model_name)
prompt = "### Instructie:\nWat zijn de drie belangrijkste softwareonderdelen die worden gebruikt bij webontwikkeling?\n\n### Antwoord:\n"
inputs = tokenizer(prompt, return_tensors = "pt", truncation = True).input_ids.cuda()
sample = model.generate(input_ids = inputs, max_new_tokens = 512, num_beams = 2, early_stopping = True, eos_token_id = tokenizer.eos_token_id)
output = tokenizer.decode(sample[0], skip_special_tokens = True)
print(output.split(prompt)[1])
```
The prompt and generated output for the above-mentioned example are similar to the output shown below.
```
### Instructie:
Wat zijn de drie belangrijkste softwareonderdelen die worden gebruikt bij webontwikkeling?
### Antwoord:
</br>
De drie belangrijkste softwareonderdelen die worden gebruikt bij webontwikkeling zijn HTML, CSS en JavaScript.
```
For more extensive usage and many generated samples (both good and bad), see the following [Inference Notebook](https://github.com/RobinSmits/Dutch-LLMs/blob/main/Open_Llama_7B_Alpaca_Clean_Dutch_Inference.ipynb).
## Intended uses & limitations
The open_llama_7b model was primarily trained on the English language. Part of the dataset was a Wikipedia dump containing pages in 20 languages, and Dutch was one of them.
Given the size of the total dataset and of the Wikipedia part, Dutch very likely made up less than 0.5% of the total data.
The generated output and performance of this model for the Dutch language are very likely not always comparable to those of the various Open-Llama models that have been finetuned on English Alpaca datasets.
The primary intention of this model is to explore and research the use of the Dutch language in combination with an Open LLM model.
## Training and evaluation data
This model was trained on the [BramVanroy/alpaca-cleaned-dutch](https://www.huggingface.co/datasets/BramVanroy/alpaca-cleaned-dutch) dataset.
Based on the dataset license, only non-commercial use is allowed. Commercial use is strictly forbidden.
```
@misc{vanroy2023language,
title={Language Resources for {Dutch} Large Language Modelling},
author={Bram Vanroy},
year={2023},
eprint={2312.12852},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Training procedure
This model was finetuned with a QLoRA setup on a Google Colab A100 GPU in about 6.5 hours.
The notebook used for training can be found here: [Training Notebook](https://github.com/RobinSmits/Dutch-LLMs/blob/main/Open_Llama_7B_Alpaca_Clean_Dutch_Qlora.ipynb)
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 64
- training_steps: 1536
The following `bitsandbytes` quantization config was used during training (a code sketch of these settings follows the list):
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
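Expressed in code, these settings correspond roughly to the sketch below (an illustration only; the authoritative script is in the linked Training Notebook):

```python
# Rough sketch of the 4-bit quantisation settings listed above.
# Illustration only: the exact code used for finetuning is in the linked Training Notebook.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base_model = AutoModelForCausalLM.from_pretrained(
    "openlm-research/open_llama_7b",
    quantization_config=bnb_config,
    device_map="auto",
)
```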
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.1240 | 1.0 | 768 | 1.1227 |
| 1.0177 | 2.0 | 1536 | 1.0645 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
- PEFT 0.4.0.dev0 |
robinsmits/polylm_1.7b_ft_alpaca_clean_dutch | robinsmits | 2023-12-23T11:30:45Z | 16 | 0 | peft | [
"peft",
"tensorboard",
"generated_from_trainer",
"alpaca",
"Transformers",
"PolyLM",
"text-generation-inference",
"text-generation",
"nl",
"dataset:BramVanroy/alpaca-cleaned-dutch",
"arxiv:2307.06018",
"arxiv:2312.12852",
"base_model:DAMO-NLP-MT/polylm-1.7b",
"base_model:adapter:DAMO-NLP-MT/polylm-1.7b",
"license:cc-by-nc-4.0",
"region:us"
] | text-generation | 2023-07-21T13:52:11Z | ---
language:
- nl
license: cc-by-nc-4.0
library_name: peft
tags:
- generated_from_trainer
- alpaca
- Transformers
- PolyLM
- text-generation-inference
datasets:
- BramVanroy/alpaca-cleaned-dutch
inference: false
base_model: DAMO-NLP-MT/polylm-1.7b
pipeline_tag: text-generation
model-index:
- name: polylm_1.7b_ft_alpaca_clean_dutch
results: []
---
# polylm_1.7b_ft_alpaca_clean_dutch
## Model description
This adapter model is a fine-tuned version of [DAMO-NLP-MT/polylm-1.7b](https://huggingface.co/DAMO-NLP-MT/polylm-1.7b).
It achieves the following results on the evaluation set:
- Loss: 1.8483
Finetuning was performed on the Dutch [BramVanroy/alpaca-cleaned-dutch](https://www.huggingface.co/datasets/BramVanroy/alpaca-cleaned-dutch) dataset, which contains 52K records of instruction-following data translated from English to Dutch.
See [DAMO-NLP-MT/polylm-1.7b](https://huggingface.co/DAMO-NLP-MT/polylm-1.7b) for all information about the base model.
## Model usage
A basic example of how to use the finetuned model:
```python
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "robinsmits/polylm_1.7b_ft_alpaca_clean_dutch"
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast = False, legacy = False)
model = AutoPeftModelForCausalLM.from_pretrained(model_name, device_map = "auto", load_in_4bit = True, torch_dtype = torch.bfloat16)
prompt = "### Instructie:\nWat zijn de drie belangrijkste softwareonderdelen die worden gebruikt bij webontwikkeling?\n\n### Antwoord:\n"
inputs = tokenizer(prompt, return_tensors = "pt")
sample = model.generate(input_ids = inputs.input_ids.cuda(),
attention_mask = inputs.attention_mask.cuda(),
max_new_tokens = 128,
do_sample = True,
top_p = 0.85,
top_k = 50,
temperature = 0.5,
repetition_penalty = 1.2,
length_penalty = -1.0,
num_return_sequences = 1,
pad_token_id = tokenizer.eos_token_id,
forced_eos_token_id = tokenizer.eos_token_id)
output = tokenizer.decode(sample[0], skip_special_tokens = True)
print(output.split(prompt)[1])
```
The prompt and generated output for the above-mentioned example are similar to the output shown below.
```
### Instructie:
Wat zijn de drie belangrijkste softwareonderdelen die worden gebruikt bij webontwikkeling?
### Antwoord:
De drie belangrijkste softwareonderdelen die worden gebruikt in webontwikkeling zijn HTML, CSS en Javascript.HTML is het hoofdbestand voor alle inhoud op een website.CSS is het hoofdbestand voor decoraties en scripts om te gebruiken zoals JavaScript en PHP.Javascript wordt meestal gebruikt om verschillende functies uit te voeren of het script te manipuleren.Het laatste bestand maakt het mogelijk om code te schrijven dat aan uw website gekoppeld kan worden door middel van enkele woorden. Daarnaast kunnen er ook andere bestanden nodig zijn als gevolg van gebruik van meerdere servers.Een voorbeeld hiervan zou zijn wanneer u bijvoorbeeld een blog-website
```
For more extensive usage and many generated samples (both good and bad), see the following [Inference Notebook](https://github.com/RobinSmits/Dutch-LLMs/blob/main/PolyLM_1_7B_Alpaca_Clean_Dutch_Inference.ipynb).
## Intended uses & limitations
The PolyLM-1.7B model was trained on 18 languages. The primary focus was to create a multi-lingual Open LLM.
Dutch was one of those 18 languages. For training the model, a diverse combination of multilingual datasets was used.
The generated output and performance of this model for the Dutch language are very likely not always comparable to those of the various Open-Llama models that have been finetuned on English Alpaca datasets.
The primary intention of this finetuned model is to explore and research the use of the Dutch language in combination with an Open LLM model.
## Bias, Risks, and Limitations
The information below is copied from the base model's [official model card](https://arxiv.org/pdf/2307.06018.pdf):
This also applies to the finetuned model.
> Our contributions are fully methodological: adding the support of multilingualism to LLM during training and SFT phases. It is unavoidable that PolyLM might exhibit several common deficiencies of language models, e.g. hallucination and toxicity. PolyLM should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application.
## Training and evaluation data
This model was trained on the [BramVanroy/alpaca-cleaned-dutch](https://www.huggingface.co/datasets/BramVanroy/alpaca-cleaned-dutch) dataset.
The dataset is the Dutch translation of the English Alpaca Cleaned instruction dataset.
Based on the dataset license, only non-commercial use is allowed. Commercial use is strictly forbidden.
```
@misc{vanroy2023language,
title={Language Resources for {Dutch} Large Language Modelling},
author={Bram Vanroy},
year={2023},
eprint={2312.12852},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Training procedure
This model was finetuned with a QLoRA setup on a Google Colab A100 GPU in about 1.5 hours.
The notebook used for training can be found here: [Training Notebook](https://github.com/RobinSmits/Dutch-LLMs/blob/main/PolyLM_1_7B_Alpaca_Clean_Dutch_Qlora.ipynb)
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 64
- num_epochs: 2
The following bitsandbytes quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.1248 | 0.16 | 128 | 2.1129 |
| 2.0512 | 0.33 | 256 | 2.0347 |
| 1.9983 | 0.49 | 384 | 1.9948 |
| 1.9557 | 0.66 | 512 | 1.9655 |
| 1.9583 | 0.82 | 640 | 1.9386 |
| 1.916 | 0.99 | 768 | 1.9177 |
| 1.8671 | 1.15 | 896 | 1.9019 |
| 1.8626 | 1.32 | 1024 | 1.8885 |
| 1.8321 | 1.48 | 1152 | 1.8762 |
| 1.8596 | 1.65 | 1280 | 1.8631 |
| 1.843 | 1.81 | 1408 | 1.8539 |
| 1.8333 | 1.98 | 1536 | 1.8483 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
- PEFT 0.4.0 |
TheBloke/SAM-GPTQ | TheBloke | 2023-12-23T11:30:25Z | 18 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"en",
"base_model:SuperAGI/SAM",
"base_model:quantized:SuperAGI/SAM",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"gptq",
"region:us"
] | text-generation | 2023-12-23T11:01:54Z | ---
base_model: SuperAGI/SAM
inference: false
language:
- en
license: apache-2.0
model_creator: SuperAGI
model_name: SAM
model_type: mistral
prompt_template: '[INST] {prompt} [/INST]
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# SAM - GPTQ
- Model creator: [SuperAGI](https://huggingface.co/SuperAGI)
- Original model: [SAM](https://huggingface.co/SuperAGI/SAM)
<!-- description start -->
## Description
This repo contains GPTQ model files for [SuperAGI's SAM](https://huggingface.co/SuperAGI/SAM).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/SAM-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/SAM-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/SAM-GGUF)
* [SuperAGI's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/SuperAGI/SAM)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Mistral
```
[INST] {prompt} [/INST]
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-compatible clients start -->
## Known compatible clients / servers
GPTQ models are currently supported on Linux (NVidia/AMD) and Windows (NVidia only). macOS users: please use GGUF models.
These GPTQ models are known to work in the following inference servers/webuis.
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [KoboldAI United](https://github.com/henk717/koboldai)
- [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui)
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
This may not be a complete list; if you know of others, please let me know!
<!-- README_GPTQ.md-compatible clients end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama and Mistral models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/SAM-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.16 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/SAM-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.57 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/SAM-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 7.52 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/SAM-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 7.68 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/SAM-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 8.17 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/SAM-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.29 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/SAM-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/SAM-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `SAM-GPTQ`:
```shell
mkdir SAM-GPTQ
huggingface-cli download TheBloke/SAM-GPTQ --local-dir SAM-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir SAM-GPTQ
huggingface-cli download TheBloke/SAM-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir SAM-GPTQ --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows for interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder and it's harder to know where your disk space is being used, and to clear it up if/when you want to remove a downloaded model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir SAM-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/SAM-GPTQ --local-dir SAM-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/SAM-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/SAM-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/SAM-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `SAM-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
- Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/SAM-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''[INST] {prompt} [/INST]
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## Python code example: inference from this GPTQ model
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install --upgrade transformers optimum
# If using PyTorch 2.1 + CUDA 12.x:
pip3 install --upgrade auto-gptq
# or, if using PyTorch 2.1 + CUDA 11.x:
pip3 install --upgrade auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
```
If you are using PyTorch 2.0, you will need to install AutoGPTQ from source. Likewise if you have problems with the pre-built wheels, you should try building from source:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.5.1
pip3 install .
```
### Example Python code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/SAM-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Write a story about llamas"
system_message = "You are a story writing assistant"
prompt_template=f'''[INST] {prompt} [/INST]
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with Transformers. For non-Mistral models, AutoGPTQ can also be used directly.
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama architecture models (including Mistral, Yi, DeepSeek, SOLAR, etc) in 4-bit. Please see the Provided Files table above for per-file compatibility.
For a list of clients/servers, please see "Known compatible clients / servers", above.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: SuperAGI's SAM
# Model Card
SAM (Small Agentic Model) is a 7B model that demonstrates impressive reasoning abilities despite its smaller size. SAM-7B has outperformed existing SoTA models on various reasoning benchmarks, including GSM8k and ARC-C.
For full details of this model please read our [release blog post](https://superagi.com/introducing-sam-small-agentic-model/).
# Key Contributions
- SAM-7B outperforms GPT 3.5, Orca, and several other 70B models on multiple reasoning benchmarks, including ARC-C and GSM8k.
- Interestingly, despite being trained on a 97% smaller dataset, SAM-7B surpasses Orca-13B on GSM8k.
- All responses in our fine-tuning dataset are generated by open-source models without any assistance from state-of-the-art models like GPT-3.5 or GPT-4.
## Training
- Trained by: SuperAGI Team
- Hardware: NVIDIA 6 x H100 SxM (80GB)
- Model used: Mistral 7B
- Duration of finetuning: 4 hours
- Number of epochs: 1
- Batch size: 16
- Learning Rate: 2e-5
- Warmup Ratio: 0.1
- Optimizer: AdamW
- Scheduler: Cosine
## Example Prompt
The template used to build a prompt for the Instruct model is defined as follows:
```
<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
```
Note that `<s>` and `</s>` are special tokens for beginning of string (BOS) and end of string (EOS), while `[INST]` and `[/INST]` are regular strings.
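A small sketch of assembling such a multi-turn prompt string (the conversation turns are illustrative placeholders):

```python
# Sketch: build a multi-turn prompt following the template above.
# The turns are illustrative placeholders. In practice <s> and </s> are
# usually added by the tokenizer as special tokens rather than typed literally.
def build_prompt(turns, next_instruction):
    """turns: list of (instruction, answer) pairs already exchanged."""
    prompt = "<s>"
    for instruction, answer in turns:
        prompt += f" [INST] {instruction} [/INST] {answer}</s>"
    prompt += f" [INST] {next_instruction} [/INST]"
    return prompt

print(build_prompt(
    [("Break the task 'plan a birthday party' into steps.", "1. Set a date. 2. Invite guests. 3. Arrange food.")],
    "Which step should be done first?",
))
```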
## Evaluation
These benchmarks show that our model has improved reasoning compared to Orca 2-7B, Orca 2-13B and GPT-3.5.
Despite its smaller size, it shows better multi-hop reasoning, as illustrated below:
<img src = "https://superagi.com/wp-content/uploads/2023/12/image-932.png" alt="Reasoning Benchmark Performance" width="700">
Note: a temperature of 0.3 is suggested for optimal performance.
## Run the model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "SuperAGI/SAM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
text = "Can elephants fly?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Limitations
SAM is a demonstration that better reasoning can be induced using less, but higher-quality, data generated with open-source LLMs.
The model is not suitable for conversations and simple Q&A; it performs better at task breakdown and reasoning.
It does not have any moderation mechanisms. Therefore, the model is not suitable for production usage, as it doesn't have guardrails for toxicity, societal bias, and language limitations. We would love to collaborate with the community to build safer and better models.
## The SuperAGI AI Team
Anmol Gautam, Arkajit Datta, Rajat Chawla, Ayush Vatsal, Sukrit Chatterjee, Adarsh Jha, Abhijeet Sinha, Rakesh Krishna, Adarsh Deep, Ishaan Bhola, Mukunda NS, Nishant Gaurav.
|
TheBloke/SAM-AWQ | TheBloke | 2023-12-23T11:02:06Z | 9 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"en",
"base_model:SuperAGI/SAM",
"base_model:quantized:SuperAGI/SAM",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"awq",
"region:us"
] | text-generation | 2023-12-23T10:06:33Z | ---
base_model: SuperAGI/SAM
inference: false
language:
- en
license: apache-2.0
model_creator: SuperAGI
model_name: SAM
model_type: mistral
prompt_template: '[INST] {prompt} [/INST]
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# SAM - AWQ
- Model creator: [SuperAGI](https://huggingface.co/SuperAGI)
- Original model: [SAM](https://huggingface.co/SuperAGI/SAM)
<!-- description start -->
## Description
This repo contains AWQ model files for [SuperAGI's SAM](https://huggingface.co/SuperAGI/SAM).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
### About AWQ
AWQ is an efficient, accurate and blazing-fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ, it offers faster Transformers-based inference with equivalent or better quality compared to the most commonly used GPTQ settings.
AWQ models are currently supported on Linux and Windows, with NVidia GPUs only. macOS users: please use GGUF models instead.
It is supported by:
- [Text Generation Webui](https://github.com/oobabooga/text-generation-webui) - using Loader: AutoAWQ
- [vLLM](https://github.com/vllm-project/vllm) - version 0.2.2 or later for support for all model types.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later, from any code or client that supports Transformers
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) - for use from Python code
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/SAM-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/SAM-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/SAM-GGUF)
* [SuperAGI's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/SuperAGI/SAM)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Mistral
```
[INST] {prompt} [/INST]
```
<!-- prompt-template end -->
<!-- README_AWQ.md-provided-files start -->
## Provided files, and AWQ parameters
I currently release 128g GEMM models only. The addition of group_size 32 models, and GEMV kernel models, is being actively considered.
Models are released as sharded safetensors files.
| Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
| ------ | ---- | -- | ----------- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/SAM-AWQ/tree/main) | 4 | 128 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 4096 | 4.15 GB
<!-- README_AWQ.md-provided-files end -->
<!-- README_AWQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/SAM-AWQ`.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `SAM-AWQ`
7. Select **Loader: AutoAWQ**.
8. Click Load, and the model will load and is now ready for use.
9. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
10. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_AWQ.md-text-generation-webui end -->
<!-- README_AWQ.md-use-from-vllm start -->
## Multi-user inference server: vLLM
Documentation on installing and using vLLM [can be found here](https://vllm.readthedocs.io/en/latest/).
- Please ensure you are using vLLM version 0.2 or later.
- When using vLLM as a server, pass the `--quantization awq` parameter.
For example:
```shell
python3 -m vllm.entrypoints.api_server --model TheBloke/SAM-AWQ --quantization awq --dtype auto
```
- When using vLLM from Python code, again set `quantization=awq`.
For example:
```python
from vllm import LLM, SamplingParams
prompts = [
"Tell me about AI",
"Write a story about llamas",
"What is 291 - 150?",
"How much wood would a woodchuck chuck if a woodchuck could chuck wood?",
]
prompt_template='''[INST] {prompt} [/INST]
'''
prompts = [prompt_template.format(prompt=prompt) for prompt in prompts]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model="TheBloke/SAM-AWQ", quantization="awq", dtype="auto")
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
<!-- README_AWQ.md-use-from-vllm start -->
<!-- README_AWQ.md-use-from-tgi start -->
## Multi-user inference server: Hugging Face Text Generation Inference (TGI)
Use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/SAM-AWQ --port 3000 --quantize awq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires [huggingface-hub](https://github.com/huggingface/huggingface_hub) 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''[INST] {prompt} [/INST]
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: ", response)
```
<!-- README_AWQ.md-use-from-tgi end -->
<!-- README_AWQ.md-use-from-python start -->
## Inference from Python code using Transformers
### Install the necessary packages
- Requires: [Transformers](https://huggingface.co/docs/transformers) 4.35.0 or later.
- Requires: [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) 0.1.6 or later.
```shell
pip3 install --upgrade "autoawq>=0.1.6" "transformers>=4.35.0"
```
Note that if you are using PyTorch 2.0.1, the above AutoAWQ command will automatically upgrade you to PyTorch 2.1.0.
If you are using CUDA 11.8 and wish to continue using PyTorch 2.0.1, instead run this command:
```shell
pip3 install https://github.com/casper-hansen/AutoAWQ/releases/download/v0.1.6/autoawq-0.1.6+cu118-cp310-cp310-linux_x86_64.whl
```
If you have problems installing [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y autoawq
git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
```
### Transformers example code (requires Transformers 4.35.0 and later)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer
model_name_or_path = "TheBloke/SAM-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoModelForCausalLM.from_pretrained(
model_name_or_path,
low_cpu_mem_usage=True,
device_map="cuda:0"
)
# Using the text streamer to stream output one token at a time
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
prompt = "Tell me about AI"
prompt_template=f'''[INST] {prompt} [/INST]
'''
# Convert prompt to tokens
tokens = tokenizer(
prompt_template,
return_tensors='pt'
).input_ids.cuda()
generation_params = {
"do_sample": True,
"temperature": 0.7,
"top_p": 0.95,
"top_k": 40,
"max_new_tokens": 512,
"repetition_penalty": 1.1
}
# Generate streamed output, visible one token at a time
generation_output = model.generate(
tokens,
streamer=streamer,
**generation_params
)
# Generation without a streamer, which will include the prompt in the output
generation_output = model.generate(
tokens,
**generation_params
)
# Get the tokens from the output, decode them, print them
token_output = generation_output[0]
text_output = tokenizer.decode(token_output)
print("model.generate output: ", text_output)
# Inference is also possible via Transformers' pipeline
from transformers import pipeline
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
**generation_params
)
pipe_output = pipe(prompt_template)[0]['generated_text']
print("pipeline output: ", pipe_output)
```
<!-- README_AWQ.md-use-from-python end -->
<!-- README_AWQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with:
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui) using `Loader: AutoAWQ`.
- [vLLM](https://github.com/vllm-project/vllm) version 0.2.0 and later.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) version 1.1.0 and later.
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later.
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) version 0.1.1 and later.
<!-- README_AWQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: SuperAGI's SAM
# Model Card
SAM (Small Agentic Model) is a 7B model that demonstrates impressive reasoning abilities despite its smaller size. SAM-7B has outperformed existing SoTA models on various reasoning benchmarks, including GSM8k and ARC-C.
For full details of this model please read our [release blog post](https://superagi.com/introducing-sam-small-agentic-model/).
# Key Contributions
- SAM-7B outperforms GPT 3.5, Orca, and several other 70B models on multiple reasoning benchmarks, including ARC-C and GSM8k.
- Interestingly, despite being trained on a 97% smaller dataset, SAM-7B surpasses Orca-13B on GSM8k.
- All responses in our fine-tuning dataset are generated by open-source models without any assistance from state-of-the-art models like GPT-3.5 or GPT-4.
## Training
- Trained by: SuperAGI Team
- Hardware: NVIDIA 6 x H100 SxM (80GB)
- Model used: Mistral 7B
- Duration of finetuning: 4 hours
- Number of epochs: 1
- Batch size: 16
- Learning Rate: 2e-5
- Warmup Ratio: 0.1
- Optimizer: AdamW
- Scheduler: Cosine
## Example Prompt
The template used to build a prompt for the Instruct model is defined as follows:
```
<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
```
Note that `<s>` and `</s>` are special tokens for beginning of string (BOS) and end of string (EOS), while `[INST]` and `[/INST]` are regular strings.
## Evaluation
These benchmarks show that our model has improved reasoning compared to Orca 2-7B, Orca 2-13B and GPT-3.5.
Despite its smaller size, SAM-7B shows better multi-hop reasoning, as shown below:
<img src = "https://superagi.com/wp-content/uploads/2023/12/image-932.png" alt="Reasoning Benchmark Performance" width="700">
Note: Temperature=0.3 is suggested for optimal performance.
## Run the model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "SuperAGI/SAM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
text = "Can elephants fly?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
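To apply the suggested Temperature=0.3 from the Evaluation section, sampling can be enabled in `generate` (an illustrative tweak to the snippet above):
```python
# Enable sampling with the suggested temperature of 0.3 (illustrative)
outputs = model.generate(**inputs, do_sample=True, temperature=0.3, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```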
## Limitations
SAM is a demonstration that better reasoning can be induced using less, but higher-quality, data generated with open-source LLMs.
The model is not suitable for conversations and simple Q&A; it performs better at task breakdown and reasoning.
It does not have any moderation mechanisms. Therefore, the model is not suitable for production usage as it doesn't have guardrails for toxicity, societal bias, and language limitations. We would love to collaborate with the community to build safer and better models.
## The SuperAGI AI Team
Anmol Gautam, Arkajit Datta, Rajat Chawla, Ayush Vatsal, Sukrit Chatterjee, Adarsh Jha, Abhijeet Sinha, Rakesh Krishna, Adarsh Deep, Ishaan Bhola, Mukunda NS, Nishant Gaurav.
|
TheBloke/SAM-GGUF | TheBloke | 2023-12-23T11:02:03Z | 111 | 6 | transformers | [
"transformers",
"gguf",
"mistral",
"en",
"base_model:SuperAGI/SAM",
"base_model:quantized:SuperAGI/SAM",
"license:apache-2.0",
"region:us"
] | null | 2023-12-23T10:06:33Z | ---
base_model: SuperAGI/SAM
inference: false
language:
- en
license: apache-2.0
model_creator: SuperAGI
model_name: SAM
model_type: mistral
prompt_template: '[INST] {prompt} [/INST]
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# SAM - GGUF
- Model creator: [SuperAGI](https://huggingface.co/SuperAGI)
- Original model: [SAM](https://huggingface.co/SuperAGI/SAM)
<!-- description start -->
## Description
This repo contains GGUF format model files for [SuperAGI's SAM](https://huggingface.co/SuperAGI/SAM).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/SAM-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/SAM-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/SAM-GGUF)
* [SuperAGI's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/SuperAGI/SAM)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Mistral
```
[INST] {prompt} [/INST]
```
<!-- prompt-template end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [sam.Q2_K.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q2_K.gguf) | Q2_K | 2 | 3.08 GB| 5.58 GB | smallest, significant quality loss - not recommended for most purposes |
| [sam.Q3_K_S.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q3_K_S.gguf) | Q3_K_S | 3 | 3.17 GB| 5.67 GB | very small, high quality loss |
| [sam.Q3_K_M.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q3_K_M.gguf) | Q3_K_M | 3 | 3.52 GB| 6.02 GB | very small, high quality loss |
| [sam.Q3_K_L.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q3_K_L.gguf) | Q3_K_L | 3 | 3.82 GB| 6.32 GB | small, substantial quality loss |
| [sam.Q4_0.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q4_0.gguf) | Q4_0 | 4 | 4.11 GB| 6.61 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [sam.Q4_K_S.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q4_K_S.gguf) | Q4_K_S | 4 | 4.14 GB| 6.64 GB | small, greater quality loss |
| [sam.Q4_K_M.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q4_K_M.gguf) | Q4_K_M | 4 | 4.37 GB| 6.87 GB | medium, balanced quality - recommended |
| [sam.Q5_0.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q5_0.gguf) | Q5_0 | 5 | 5.00 GB| 7.50 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [sam.Q5_K_S.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q5_K_S.gguf) | Q5_K_S | 5 | 5.00 GB| 7.50 GB | large, low quality loss - recommended |
| [sam.Q5_K_M.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q5_K_M.gguf) | Q5_K_M | 5 | 5.13 GB| 7.63 GB | large, very low quality loss - recommended |
| [sam.Q6_K.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q6_K.gguf) | Q6_K | 6 | 5.94 GB| 8.44 GB | very large, extremely low quality loss |
| [sam.Q8_0.gguf](https://huggingface.co/TheBloke/SAM-GGUF/blob/main/sam.Q8_0.gguf) | Q8_0 | 8 | 7.70 GB| 10.20 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/SAM-GGUF and below it, a specific filename to download, such as: sam.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/SAM-GGUF sam.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/SAM-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/SAM-GGUF sam.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
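For example, in a Windows Command Prompt the two steps might look like this (same filename as above; illustrative):
```shell
set HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download TheBloke/SAM-GGUF sam.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```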
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m sam.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "[INST] {prompt} [/INST]"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
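For example, a minimal interactive invocation might look like this (other flags carried over from the command above):
```shell
./main -ngl 35 -m sam.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -i -ins
```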
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; e.g. for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./sam.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"[INST] {prompt} [/INST]", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./sam.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
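As a minimal sketch combining LangChain with llama-cpp-python (assumes both packages are installed; the model path and parameter values are illustrative):
```python
from langchain.llms import LlamaCpp

# Point LangChain's LlamaCpp wrapper at the downloaded GGUF file (illustrative path/values).
llm = LlamaCpp(
    model_path="./sam.Q4_K_M.gguf",
    n_ctx=32768,        # max sequence length; longer contexts need more resources
    n_gpu_layers=35,    # set to 0 if you have no GPU acceleration
    temperature=0.7,
    max_tokens=512,
)

# Use the same [INST] ... [/INST] prompt format as above.
print(llm("[INST] Write a story about llamas [/INST]"))
```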
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, 阿明, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjäreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: SuperAGI's SAM
# Model Card
SAM (Small Agentic Model) is a 7B model that demonstrates impressive reasoning abilities despite its smaller size. SAM-7B has outperformed existing SoTA models on various reasoning benchmarks, including GSM8k and ARC-C.
For full details of this model please read our [release blog post](https://superagi.com/introducing-sam-small-agentic-model/).
# Key Contributions
- SAM-7B outperforms GPT 3.5, Orca, and several other 70B models on multiple reasoning benchmarks, including ARC-C and GSM8k.
- Interestingly, despite being trained on a 97% smaller dataset, SAM-7B surpasses Orca-13B on GSM8k.
- All responses in our fine-tuning dataset are generated by open-source models without any assistance from state-of-the-art models like GPT-3.5 or GPT-4.
## Training
- Trained by: SuperAGI Team
- Hardware: NVIDIA 6 x H100 SxM (80GB)
- Model used: Mistral 7B
- Duration of finetuning: 4 hours
- Number of epochs: 1
- Batch size: 16
- Learning Rate: 2e-5
- Warmup Ratio: 0.1
- Optimizer: AdamW
- Scheduler: Cosine
## Example Prompt
The template used to build a prompt for the Instruct model is defined as follows:
```
<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
```
Note that `<s>` and `</s>` are special tokens for beginning of string (BOS) and end of string (EOS) while [INST] and [/INST] are regular strings.
## Evaluation
These benchmarks show that our model has improved reasoning compared to Orca 2-7B, Orca 2-13B and GPT-3.5.
Despite its smaller size, SAM-7B shows better multi-hop reasoning, as shown below:
<img src = "https://superagi.com/wp-content/uploads/2023/12/image-932.png" alt="Reasoning Benchmark Performance" width="700">
Note: Temperature=0.3 is suggested for optimal performance.
## Run the model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "SuperAGI/SAM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
text = "Can elephants fly?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Limitations
SAM is a demonstration that better reasoning can be induced using less, but higher-quality, data generated with open-source LLMs.
The model is not suitable for conversations and simple Q&A; it performs better at task breakdown and reasoning.
It does not have any moderation mechanisms. Therefore, the model is not suitable for production usage as it doesn't have guardrails for toxicity, societal bias, and language limitations. We would love to collaborate with the community to build safer and better models.
## The SuperAGI AI Team
Anmol Gautam, Arkajit Datta, Rajat Chawla, Ayush Vatsal, Sukrit Chatterjee, Adarsh Jha, Abhijeet Sinha, Rakesh Krishna, Adarsh Deep, Ishaan Bhola, Mukunda NS, Nishant Gaurav.
<!-- original-model-card end -->
|
TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ | TheBloke | 2023-12-23T11:01:39Z | 20 | 2 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"finetune",
"dpo",
"Instruct",
"augmentation",
"german",
"conversational",
"en",
"de",
"dataset:argilla/distilabel-math-preference-dpo",
"base_model:VAGOsolutions/SauerkrautLM-SOLAR-Instruct",
"base_model:quantized:VAGOsolutions/SauerkrautLM-SOLAR-Instruct",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"gptq",
"region:us"
] | text-generation | 2023-12-23T09:56:28Z | ---
base_model: VAGOsolutions/SauerkrautLM-SOLAR-Instruct
datasets:
- argilla/distilabel-math-preference-dpo
inference: false
language:
- en
- de
library_name: transformers
license: cc-by-nc-4.0
model_creator: VAGO solutions
model_name: SauerkrautLM SOLAR Instruct
model_type: solar
pipeline_tag: text-generation
prompt_template: '### User:
{prompt}
### Assistant:
'
quantized_by: TheBloke
tags:
- finetune
- dpo
- Instruct
- augmentation
- german
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# SauerkrautLM SOLAR Instruct - GPTQ
- Model creator: [VAGO solutions](https://huggingface.co/VAGOsolutions)
- Original model: [SauerkrautLM SOLAR Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)
<!-- description start -->
## Description
This repo contains GPTQ model files for [VAGO solutions's SauerkrautLM SOLAR Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF)
* [VAGO solutions's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: User-Assistant-Newlines
```
### User:
{prompt}
### Assistant:
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-compatible clients start -->
## Known compatible clients / servers
GPTQ models are currently supported on Linux (NVidia/AMD) and Windows (NVidia only). macOS users: please use GGUF models.
These GPTQ models are known to work in the following inference servers/webuis.
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [KoboldAI United](https://github.com/henk717/koboldai)
- [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui)
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
This may not be a complete list; if you know of others, please let me know!
<!-- README_GPTQ.md-compatible clients end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama and Mistral models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 5.98 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 6.59 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 11.01 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 11.25 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 11.99 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 6.18 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `SauerkrautLM-SOLAR-Instruct-GPTQ`:
```shell
mkdir SauerkrautLM-SOLAR-Instruct-GPTQ
huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ --local-dir SauerkrautLM-SOLAR-Instruct-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir SauerkrautLM-SOLAR-Instruct-GPTQ
huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir SauerkrautLM-SOLAR-Instruct-GPTQ --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows for interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder and it's harder to know where your disk space is being used, and to clear it up if/when you want to remove a downloaded model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
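For example (illustrative paths):
```shell
# Use a different cache location via the environment variable
HF_HOME=/mnt/storage/hf_cache huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ --local-dir SauerkrautLM-SOLAR-Instruct-GPTQ

# Or pass the cache directory explicitly
huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ --cache-dir /mnt/storage/hf_cache --local-dir SauerkrautLM-SOLAR-Instruct-GPTQ
```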
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir SauerkrautLM-SOLAR-Instruct-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ --local-dir SauerkrautLM-SOLAR-Instruct-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `SauerkrautLM-SOLAR-Instruct-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
- Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
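As a sketch, those parameters might be passed to the official container like this (the volume path and port mapping are illustrative):
```shell
docker run --gpus all --shm-size 1g -p 3000:3000 -v /path/to/data:/data \
    ghcr.io/huggingface/text-generation-inference:1.1.0 \
    --model-id TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ --port 3000 --quantize gptq \
    --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```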
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''### User:
{prompt}
### Assistant:
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## Python code example: inference from this GPTQ model
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install --upgrade transformers optimum
# If using PyTorch 2.1 + CUDA 12.x:
pip3 install --upgrade auto-gptq
# or, if using PyTorch 2.1 + CUDA 11.x:
pip3 install --upgrade auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
```
If you are using PyTorch 2.0, you will need to install AutoGPTQ from source. Likewise if you have problems with the pre-built wheels, you should try building from source:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.5.1
pip3 install .
```
### Example Python code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Write a story about llamas"
system_message = "You are a story writing assistant"
prompt_template=f'''### User:
{prompt}
### Assistant:
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with Transformers. For non-Mistral models, AutoGPTQ can also be used directly.
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama architecture models (including Mistral, Yi, DeepSeek, SOLAR, etc) in 4-bit. Please see the Provided Files table above for per-file compatibility.
For a list of clients/servers, please see "Known compatible clients / servers", above.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, 阿明, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjäreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: VAGO solutions's SauerkrautLM SOLAR Instruct

## VAGO solutions SauerkrautLM-SOLAR-Instruct
Introducing **SauerkrautLM-SOLAR-Instruct** – our Sauerkraut version of the powerful [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0)!
Aligned with **DPO**
# Table of Contents
1. [Overview of all SauerkrautLM-SOLAR-Instruct models](#all-sauerkrautlm-solar-instruct-models)
2. [Model Details](#model-details)
- [Prompt template](#prompt-template)
- [Training Dataset](#training-dataset)
- [Data Contamination Test](#data-contamination-test-results)
3. [Evaluation](#evaluation)
4. [Disclaimer](#disclaimer)
5. [Contact](#contact)
6. [Collaborations](#collaborations)
7. [Acknowledgement](#acknowledgement)
## All SauerkrautLM-SOLAR-Instruct Models
| Model | HF | GPTQ | GGUF | AWQ |
|-------|-------|-------|-------|-------|
| SauerkrautLM-SOLAR-Instruct | [Link](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct/) | coming soon | coming soon | coming soon |
## Model Details
**SauerkrautLM-SOLAR-Instruct**
- **Model Type:** SauerkrautLM-SOLAR-Instruct is a finetuned Model based on [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0)
- **Language(s):** English, German
- **License:** cc-by-nc-4.0
- **Contact:** [Website](https://vago-solutions.de/#Kontakt) [David Golchinfar](mailto:[email protected])
### Training Dataset:
SauerkrautLM-SOLAR-Instruct was trained with a mix of German data augmentation and translated data.
Aligned through **DPO** with our **new German SauerkrautLM-DPO dataset** based on parts of the SFT SauerkrautLM dataset
as chosen answers and [Sauerkraut-7b-HerO](https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-HerO) as rejected answers. This was augmented with additional **translated parts of [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized)** (our dataset does not contain any TruthfulQA prompts - see the Data Contamination Test Results) and **[argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo).**
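For background, the DPO objective optimised over these chosen (y_w) / rejected (y_l) pairs is the standard one from the DPO paper (included here for reference, not as a detail specific to this model):

$$\mathcal{L}_\mathrm{DPO} = -\,\mathbb{E}_{(x,\,y_w,\,y_l)}\Big[\log \sigma\Big(\beta \log \tfrac{\pi_\theta(y_w \mid x)}{\pi_\mathrm{ref}(y_w \mid x)} - \beta \log \tfrac{\pi_\theta(y_l \mid x)}{\pi_\mathrm{ref}(y_l \mid x)}\Big)\Big]$$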
We found that a simple translation of training data alone can lead to unnatural German phrasings.
Data augmentation techniques were used to ensure grammatical and syntactical correctness and more natural German wording in our training data.
We improved the German language skills of this model. Nevertheless, certain formulations may occur that are not entirely correct.
### Data Contamination Test Results
Some models on the HuggingFace leaderboard had problems with wrong data getting mixed in.
We checked our SauerkrautLM-DPO dataset with a special test [1] on this model as target model and upstage/SOLAR-10.7B-Instruct-v1.0 as reference model.
The HuggingFace team used the same methods [2, 3].
Our results, with `result < 0.1, %:` being well below 0.9, indicate that our dataset is free from contamination.
*The data contamination test results of HellaSwag and Winograde will be added once [1] supports them.*
| Dataset | ARC | MMLU | TruthfulQA | GSM8K |
|------------------------------|-------|-------|-------|-------|
| **SauerkrautLM-DPO** | result < 0.1, %: 0.0 | result < 0.1, %: 0.09 | result < 0.1, %: 0.13 | result < 0.1, %: 0.16 |
[1] https://github.com/swj0419/detect-pretrain-code-contamination
[2] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/474#657f2245365456e362412a06
[3] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/265#657b6debf81f6b44b8966230
### Prompt Template:
```
### User:
Hallo, wie geht es dir?
### Assistant:
Hallo! Es freut mich, dass du mit mir kommunizierst. Ich bin hier, um zu helfen und deine Anfragen zu erfรผllen. Du fragst, wie ich mich fรผhle. Als kรผnstliche Intelligenz habe ich keine eigentlichen Emotionen im Sinne eines Menschen, aber ich funktioniere optimal und bin bereit, Dienste anzubieten.
Wie geht es dir momentan? Kรถnnen wir zusammen etwas interessantes oder hilfreiches erledigen?
```
*Prompt example at temperature 0.5*
```
### User:
Hello, how are you?
### Assistant:
Hi there! I am an AI language model, so I don't have personal feelings or emotions in the traditional sense. However, I can assure you that my systems and processes are functioning well at this moment, allowing me to provide helpful responses for your queries.
How may I assist you today?
```
*Prompt example at temperature 0.5*
## Evaluation
| Metric | Value |
|-----------------------|---------------------------|
| Avg. | 74.21 |
| ARC (25-shot) | 70.82 |
| HellaSwag (10-shot) | 88.63 |
| MMLU (5-shot) | 66.2 |
| TruthfulQA (0-shot) | 71.95 |
| Winogrande (5-shot) | 83.5 |
| GSM8K (5-shot) | 64.14 |
## Disclaimer
We must inform users that despite our best efforts in data cleansing, the possibility of uncensored content slipping through cannot be entirely ruled out.
However, we cannot guarantee consistently appropriate behavior. Therefore, if you encounter any issues or come across inappropriate content, we kindly request that you inform us through the contact information provided.
Additionally, it is essential to understand that the licensing of these models does not constitute legal advice. We are not held responsible for the actions of third parties who utilize our models.
## Contact
If you are interested in customized LLMs for business applications, please get in contact with us via our website or contact us at [Dr. Daryoush Vaziri](mailto:[email protected]). We are also grateful for your feedback and suggestions.
## Collaborations
We are also keenly seeking support and investment for our startup, VAGO solutions, where we continuously advance the development of robust language models designed to address a diverse range of purposes and requirements. If the prospect of collaboratively navigating future challenges excites you, we warmly invite you to reach out to us.
## Acknowledgement
Many thanks to [argilla](https://huggingface.co/datasets/argilla) and [Huggingface](https://huggingface.co) for providing such valuable datasets to the Open-Source community. And of course a big thanks to [upstage](https://huggingface.co/upstage) for providing the open source community with their latest technology!
|
Yhyu13/phi-2-sft-dpo-gpt4_en-ep1 | Yhyu13 | 2023-12-23T11:00:20Z | 71 | 8 | transformers | [
"transformers",
"safetensors",
"phi-msft",
"text-generation",
"custom_code",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-22T17:31:17Z | ---
license: other
license_name: microsoft-research-license
license_link: https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE
---
This is the merged model for LoRA https://huggingface.co/Yhyu13/phi-2-sft-dpo-gpt4_en-ep1-lora
This model is a DPO improvement over the base model https://huggingface.co/Yhyu13/phi-2-sft-alpaca_gpt4_en-ep1, which achieves better results than text-davinci-003 on AlpacaEval as judged by ChatGPT.
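A minimal loading sketch (the prompt format below assumes the Alpaca style used for the SFT data - check the training configs; phi-2-based checkpoints need `trust_remote_code=True`):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Yhyu13/phi-2-sft-dpo-gpt4_en-ep1"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    trust_remote_code=True,  # phi-2 uses custom modelling code
    device_map="auto",       # requires the accelerate package
)

# Assumed Alpaca-style prompt; adjust if the training template differs.
prompt = "### Instruction:\nExplain what DPO is in one sentence.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```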
# AlpacaEval
Quote from this discussion
https://huggingface.co/microsoft/phi-2/discussions/38
Since phi-2 requires remote code, which the HF Open LLM Leaderboard does not accept at the moment,
I ran phi-2 and my DPO model against the AlpacaEval benchmark
https://tatsu-lab.github.io/alpaca_eval/
Here are the results evaluated by ChatGPT: https://github.com/tatsu-lab/alpaca_eval/pull/183
```
win_rate standard_error n_total avg_length
gpt4 73.79 1.54 805 1365
claude 70.37 1.60 805 1082
chatgpt 66.09 1.66 805 811
wizardlm-13b 65.16 1.67 805 985
vicuna-13b 64.10 1.69 805 1037
guanaco-65b 62.36 1.71 805 1249
oasst-rlhf-llama-33b 62.05 1.71 805 1079
alpaca-farm-ppo-human 60.25 1.72 805 803
falcon-40b-instruct 56.52 1.74 805 662
phi-2-alpaca-gpt4-dpo(new) 55.60 1.75 804 4532
phi-2-alpaca-gpt4(new) 54.23 1.75 804 1138
text_davinci_003 50.00 0.00 805 307
alpaca-7b 45.22 1.74 805 396
phi-2(new) 43.79 1.74 805 924
text_davinci_001 28.07 1.56 805 296
```
phi-2-alpaca-gpt4-dpo is only slightly better than my previous SFT model phi-2-alpaca-gpt4 when evaluated by ChatGPT, but the DPO-tuned model outputs significantly longer results! |
esakov-s/bert-base-cased-finetuned-wikitext2 | esakov-s | 2023-12-23T10:57:45Z | 5 | 0 | transformers | [
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2023-12-23T10:37:04Z | ---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: esakov-s/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# esakov-s/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9563
- Validation Loss: 6.8870
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
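A minimal usage sketch (the example sentence is illustrative; `framework="tf"` is passed because the checkpoint was trained with Keras/TensorFlow):
```python
from transformers import pipeline

# Fill-mask inference with the fine-tuned checkpoint (illustrative input)
fill_mask = pipeline(
    "fill-mask",
    model="esakov-s/bert-base-cased-finetuned-wikitext2",
    framework="tf",  # TensorFlow weights
)
print(fill_mask("The capital of France is [MASK]."))
```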
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4349 | 7.0185 | 0 |
| 6.9563 | 6.8870 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Jjateen/Reinforce-CartPole-v1 | Jjateen | 2023-12-23T10:51:36Z | 0 | 0 | null | [
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] | reinforcement-learning | 2023-12-23T10:51:27Z | ---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-CartPole-v1
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
ntc-ai/SDXL-LoRA-slider.paper-mache | ntc-ai | 2023-12-23T10:43:12Z | 60 | 0 | diffusers | [
"diffusers",
"text-to-image",
"stable-diffusion-xl",
"lora",
"template:sd-lora",
"template:sdxl-lora",
"sdxl-sliders",
"ntcai.xyz-sliders",
"concept",
"en",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:mit",
"region:us"
] | text-to-image | 2023-12-23T10:43:08Z |
---
language:
- en
thumbnail: "images/evaluate/paper mache.../paper mache_17_3.0.png"
widget:
- text: paper mache
output:
url: images/paper mache_17_3.0.png
- text: paper mache
output:
url: images/paper mache_19_3.0.png
- text: paper mache
output:
url: images/paper mache_20_3.0.png
- text: paper mache
output:
url: images/paper mache_21_3.0.png
- text: paper mache
output:
url: images/paper mache_22_3.0.png
tags:
- text-to-image
- stable-diffusion-xl
- lora
- template:sd-lora
- template:sdxl-lora
- sdxl-sliders
- ntcai.xyz-sliders
- concept
- diffusers
license: "mit"
inference: false
instance_prompt: "paper mache"
base_model: "stabilityai/stable-diffusion-xl-base-1.0"
---
# ntcai.xyz slider - paper mache (SDXL LoRA)
| Strength: -3 | Strength: 0 | Strength: 3 |
| --- | --- | --- |
| <img src="images/paper mache_17_-3.0.png" width=256 height=256 /> | <img src="images/paper mache_17_0.0.png" width=256 height=256 /> | <img src="images/paper mache_17_3.0.png" width=256 height=256 /> |
| <img src="images/paper mache_19_-3.0.png" width=256 height=256 /> | <img src="images/paper mache_19_0.0.png" width=256 height=256 /> | <img src="images/paper mache_19_3.0.png" width=256 height=256 /> |
| <img src="images/paper mache_20_-3.0.png" width=256 height=256 /> | <img src="images/paper mache_20_0.0.png" width=256 height=256 /> | <img src="images/paper mache_20_3.0.png" width=256 height=256 /> |
## Download
Weights for this model are available in Safetensors format.
## Trigger words
You can apply this LoRA with trigger words for additional effect:
```
paper mache
```
## Use in diffusers
```python
from diffusers import StableDiffusionXLPipeline
from diffusers import EulerAncestralDiscreteScheduler
import torch
pipe = StableDiffusionXLPipeline.from_single_file("https://huggingface.co/martyn/sdxl-turbo-mario-merge-top-rated/blob/main/topRatedTurboxlLCM_v10.safetensors")
pipe.to("cuda")
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)
# Load the LoRA
pipe.load_lora_weights('ntc-ai/SDXL-LoRA-slider.paper-mache', weight_name='paper mache.safetensors', adapter_name="paper mache")
# Activate the LoRA
pipe.set_adapters(["paper mache"], adapter_weights=[2.0])
prompt = "medieval rich kingpin sitting in a tavern, paper mache"
negative_prompt = "nsfw"
width = 512
height = 512
num_inference_steps = 10
guidance_scale = 2
image = pipe(prompt, negative_prompt=negative_prompt, width=width, height=height, guidance_scale=guidance_scale, num_inference_steps=num_inference_steps).images[0]
image.save('result.png')
```
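To move toward the negative direction shown in the Strength: -3 column, a negative adapter weight can be passed (a sketch; the exact strength is a matter of taste):
```python
# Negative weights push generations away from the concept (cf. the -3 column above)
pipe.set_adapters(["paper mache"], adapter_weights=[-3.0])
```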
## Support the Patreon
If you like this model please consider [joining our Patreon](https://www.patreon.com/NTCAI).
By joining our Patreon, you'll gain access to an ever-growing library of over 560+ unique and diverse LoRAs, covering a wide range of styles and genres. You'll also receive early access to new models and updates, exclusive behind-the-scenes content, and the powerful LoRA slider creator, allowing you to craft your own custom LoRAs and experiment with endless possibilities.
Your support on Patreon will allow us to continue developing and refining new models.
## Other resources
- [CivitAI](https://civitai.com/user/ntc) - Follow ntc on Civit for even more LoRAs
- [ntcai.xyz](https://ntcai.xyz) - See ntcai.xyz to find more articles and LoRAs
|
esakov-s/gpt2-finetuned-wikitext2 | esakov-s | 2023-12-23T10:32:10Z | 5 | 0 | transformers | [
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T10:03:43Z | ---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: esakov-s/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# esakov-s/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.5025
- Validation Loss: 6.3546
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
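As a rough illustration only (not the original training script), the optimizer above can be reconstructed with the `AdamWeightDecay` class that Transformers provides for Keras training:
```python
from transformers import AdamWeightDecay

# Hypothetical reconstruction of the optimizer configuration listed above.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)
# model.compile(optimizer=optimizer) and model.fit(...) would then follow as usual.
```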
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3185 | 6.7719 | 0 |
| 6.5025 | 6.3546 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ | TheBloke | 2023-12-23T10:25:36Z | 9 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"finetune",
"dpo",
"Instruct",
"augmentation",
"german",
"conversational",
"en",
"de",
"dataset:argilla/distilabel-math-preference-dpo",
"base_model:VAGOsolutions/SauerkrautLM-SOLAR-Instruct",
"base_model:quantized:VAGOsolutions/SauerkrautLM-SOLAR-Instruct",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"awq",
"region:us"
] | text-generation | 2023-12-23T09:56:28Z | ---
base_model: VAGOsolutions/SauerkrautLM-SOLAR-Instruct
datasets:
- argilla/distilabel-math-preference-dpo
inference: false
language:
- en
- de
library_name: transformers
license: cc-by-nc-4.0
model_creator: VAGO solutions
model_name: SauerkrautLM SOLAR Instruct
model_type: solar
pipeline_tag: text-generation
prompt_template: '### User:
{prompt}
### Assistant:
'
quantized_by: TheBloke
tags:
- finetune
- dpo
- Instruct
- augmentation
- german
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# SauerkrautLM SOLAR Instruct - AWQ
- Model creator: [VAGO solutions](https://huggingface.co/VAGOsolutions)
- Original model: [SauerkrautLM SOLAR Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)
<!-- description start -->
## Description
This repo contains AWQ model files for [VAGO solutions's SauerkrautLM SOLAR Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
### About AWQ
AWQ is an efficient, accurate and blazing-fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ, it offers faster Transformers-based inference with equivalent or better quality compared to the most commonly used GPTQ settings.
AWQ models are currently supported on Linux and Windows, with NVidia GPUs only. macOS users: please use GGUF models instead.
It is supported by:
- [Text Generation Webui](https://github.com/oobabooga/text-generation-webui) - using Loader: AutoAWQ
- [vLLM](https://github.com/vllm-project/vllm) - version 0.2.2 or later, which supports all model types.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later, from any code or client that supports Transformers
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) - for use from Python code
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF)
* [VAGO solutions's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: User-Assistant-Newlines
```
### User:
{prompt}
### Assistant:
```
<!-- prompt-template end -->
<!-- README_AWQ.md-provided-files start -->
## Provided files, and AWQ parameters
I currently release 128g GEMM models only. The addition of group_size 32 models, and GEMV kernel models, is being actively considered.
Models are released as sharded safetensors files.
| Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
| ------ | ---- | -- | ----------- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ/tree/main) | 4 | 128 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 5.96 GB |
<!-- README_AWQ.md-provided-files end -->
<!-- README_AWQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ`.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `SauerkrautLM-SOLAR-Instruct-AWQ`
7. Select **Loader: AutoAWQ**.
8. Click Load, and the model will load and is now ready for use.
9. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
10. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_AWQ.md-text-generation-webui end -->
<!-- README_AWQ.md-use-from-vllm start -->
## Multi-user inference server: vLLM
Documentation on installing and using vLLM [can be found here](https://vllm.readthedocs.io/en/latest/).
- Please ensure you are using vLLM version 0.2 or later.
- When using vLLM as a server, pass the `--quantization awq` parameter.
For example:
```shell
python3 -m vllm.entrypoints.api_server --model TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ --quantization awq --dtype auto
```
- When using vLLM from Python code, again set `quantization=awq`.
For example:
```python
from vllm import LLM, SamplingParams
prompts = [
"Tell me about AI",
"Write a story about llamas",
"What is 291 - 150?",
"How much wood would a woodchuck chuck if a woodchuck could chuck wood?",
]
# Note: this is a plain string, not an f-string; {prompt} is filled in below via .format()
prompt_template='''### User:
{prompt}
### Assistant:
'''
prompts = [prompt_template.format(prompt=prompt) for prompt in prompts]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model="TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ", quantization="awq", dtype="auto")
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
    prompt = output.prompt
    generated_text = output.outputs[0].text
    print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
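If you started the vLLM API server as shown above, it can also be queried over HTTP. A minimal sketch, assuming the default port 8000 and the demo `/generate` endpoint exposed by `vllm.entrypoints.api_server`:
```python
import requests

# Prompt wrapped in this model's User/Assistant template.
prompt = "### User:\nTell me about AI\n\n### Assistant:\n"

# Port and endpoint are assumptions based on vLLM's defaults; adjust them if you
# started the server with different settings.
response = requests.post(
    "http://localhost:8000/generate",
    json={"prompt": prompt, "max_tokens": 256, "temperature": 0.7},
)
print(response.json()["text"][0])
```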
<!-- README_AWQ.md-use-from-vllm end -->
<!-- README_AWQ.md-use-from-tgi start -->
## Multi-user inference server: Hugging Face Text Generation Inference (TGI)
Use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ --port 3000 --quantize awq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires [huggingface-hub](https://github.com/huggingface/huggingface_hub) 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''### User:
{prompt}
### Assistant:
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: ", response)
```
<!-- README_AWQ.md-use-from-tgi end -->
<!-- README_AWQ.md-use-from-python start -->
## Inference from Python code using Transformers
### Install the necessary packages
- Requires: [Transformers](https://huggingface.co/docs/transformers) 4.35.0 or later.
- Requires: [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) 0.1.6 or later.
```shell
pip3 install --upgrade "autoawq>=0.1.6" "transformers>=4.35.0"
```
Note that if you are using PyTorch 2.0.1, the above AutoAWQ command will automatically upgrade you to PyTorch 2.1.0.
If you are using CUDA 11.8 and wish to continue using PyTorch 2.0.1, instead run this command:
```shell
pip3 install https://github.com/casper-hansen/AutoAWQ/releases/download/v0.1.6/autoawq-0.1.6+cu118-cp310-cp310-linux_x86_64.whl
```
If you have problems installing [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y autoawq
git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
```
### Transformers example code (requires Transformers 4.35.0 and later)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer
model_name_or_path = "TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoModelForCausalLM.from_pretrained(
model_name_or_path,
low_cpu_mem_usage=True,
device_map="cuda:0"
)
# Using the text streamer to stream output one token at a time
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
prompt = "Tell me about AI"
prompt_template=f'''### User:
{prompt}
### Assistant:
'''
# Convert prompt to tokens
tokens = tokenizer(
prompt_template,
return_tensors='pt'
).input_ids.cuda()
generation_params = {
"do_sample": True,
"temperature": 0.7,
"top_p": 0.95,
"top_k": 40,
"max_new_tokens": 512,
"repetition_penalty": 1.1
}
# Generate streamed output, visible one token at a time
generation_output = model.generate(
tokens,
streamer=streamer,
**generation_params
)
# Generation without a streamer, which will include the prompt in the output
generation_output = model.generate(
tokens,
**generation_params
)
# Get the tokens from the output, decode them, print them
token_output = generation_output[0]
text_output = tokenizer.decode(token_output)
print("model.generate output: ", text_output)
# Inference is also possible via Transformers' pipeline
from transformers import pipeline
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
**generation_params
)
pipe_output = pipe(prompt_template)[0]['generated_text']
print("pipeline output: ", pipe_output)
```
<!-- README_AWQ.md-use-from-python end -->
<!-- README_AWQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with:
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui) using `Loader: AutoAWQ`.
- [vLLM](https://github.com/vllm-project/vllm) version 0.2.0 and later.
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) version 1.1.0 and later.
- [Transformers](https://huggingface.co/docs/transformers) version 4.35.0 and later.
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) version 0.1.1 and later.
<!-- README_AWQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: VAGO solutions's SauerkrautLM SOLAR Instruct

## VAGO solutions SauerkrautLM-SOLAR-Instruct
Introducing **SauerkrautLM-SOLAR-Instruct** โ our Sauerkraut version of the powerful [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0) !
Aligned with **DPO**
# Table of Contents
1. [Overview of all SauerkrautLM-SOLAR-Instruct models](#all-sauerkrautlm-solar-instruct-models)
2. [Model Details](#model-details)
- [Prompt template](#prompt-template)
- [Training Dataset](#training-dataset)
- [Data Contamination Test](#data-contamination-test-results)
3. [Evaluation](#evaluation)
4. [Disclaimer](#disclaimer)
5. [Contact](#contact)
6. [Collaborations](#collaborations)
7. [Acknowledgement](#acknowledgement)
## All SauerkrautLM-SOLAR-Instruct Models
| Model | HF | GPTQ | GGUF | AWQ |
|-------|-------|-------|-------|-------|
| SauerkrautLM-SOLAR-Instruct | [Link](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct/) | coming soon | coming soon | coming soon |
## Model Details
**SauerkrautLM-SOLAR-Instruct**
- **Model Type:** SauerkrautLM-SOLAR-Instruct is a finetuned Model based on [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0)
- **Language(s):** English, German
- **License:** cc-by-nc-4.0
- **Contact:** [Website](https://vago-solutions.de/#Kontakt) [David Golchinfar](mailto:[email protected])
### Training Dataset:
SauerkrautLM-SOLAR-Instruct was trained on a mix of German data augmentation and translated data.
It was aligned through **DPO** with our **new German SauerkrautLM-DPO dataset**, which uses parts of the SFT SauerkrautLM dataset
as chosen answers and [Sauerkraut-7b-HerO](https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-HerO) as rejected answers, together with additional **translated parts of [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized)** (our dataset does not contain any TruthfulQA prompts - see the Data Contamination Test Results below) and **[argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo)**.
We found that a simple translation of training data can lead to unnatural German phrasings.
Data augmentation techniques were therefore used to ensure grammatical and syntactical correctness and more natural German wording in our training data.
This improved the German language skills of the model. Nevertheless, formulations that are not entirely correct may still occur.
### Data Contamination Test Results
Some models on the HuggingFace leaderboard had problems with benchmark data leaking into their training data.
We checked our SauerkrautLM-DPO dataset with a dedicated contamination test [1], using this model as the target model and upstage/SOLAR-10.7B-Instruct-v1.0 as the reference model.
The HuggingFace team used the same methods [2, 3].
Our results, with `result < 0.1, %:` being well below 0.9, indicate that our dataset is free from contamination.
*The data contamination test results for HellaSwag and Winogrande will be added once [1] supports them.*
| Dataset | ARC | MMLU | TruthfulQA | GSM8K |
|------------------------------|-------|-------|-------|-------|
| **SauerkrautLM-DPO**| result < 0.1, %: 0.0 |result < 0.1, %: 0.09 | result < 0.1, %: 0.13 | result < 0.1, %: 0.16 |
[1] https://github.com/swj0419/detect-pretrain-code-contamination
[2] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/474#657f2245365456e362412a06
[3] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/265#657b6debf81f6b44b8966230
### Prompt Template:
```
### User:
Hallo, wie geht es dir?
### Assistant:
Hallo! Es freut mich, dass du mit mir kommunizierst. Ich bin hier, um zu helfen und deine Anfragen zu erfรผllen. Du fragst, wie ich mich fรผhle. Als kรผnstliche Intelligenz habe ich keine eigentlichen Emotionen im Sinne eines Menschen, aber ich funktioniere optimal und bin bereit, Dienste anzubieten.
Wie geht es dir momentan? Kรถnnen wir zusammen etwas interessantes oder hilfreiches erledigen?
```
*Prompt example at temperature 0.5*
```
### User:
Hello, how are you?
### Assistant:
Hi there! I am an AI language model, so I don't have personal feelings or emotions in the traditional sense. However, I can assure you that my systems and processes are functioning well at this moment, allowing me to provide helpful responses for your queries.
How may I assist you today?
```
*Prompt example at temperature 0.5*
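For programmatic use, the template can be applied with a small helper; a minimal sketch (the function name is purely illustrative):
```python
def build_prompt(user_message: str) -> str:
    # Wrap a user message in the "### User:" / "### Assistant:" template shown above.
    return f"### User:\n{user_message}\n\n### Assistant:\n"

print(build_prompt("Hello, how are you?"))
```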
## Evaluation
| Metric | Value |
|-----------------------|---------------------------|
| Avg. | 74.21 |
| ARC (25-shot) | 70.82 |
| HellaSwag (10-shot) | 88.63 |
| MMLU (5-shot) | 66.2|
| TruthfulQA (0-shot) | 71.95 |
| Winogrande (5-shot) | 83.5 |
| GSM8K (5-shot) | 64.14 |
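The average row is simply the mean of the six benchmark scores:
```python
# ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K
scores = [70.82, 88.63, 66.2, 71.95, 83.5, 64.14]
print(round(sum(scores) / len(scores), 2))  # 74.21
```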
## Disclaimer
We must inform users that despite our best efforts in data cleansing, the possibility of uncensored content slipping through cannot be entirely ruled out.
However, we cannot guarantee consistently appropriate behavior. Therefore, if you encounter any issues or come across inappropriate content, we kindly request that you inform us through the contact information provided.
Additionally, it is essential to understand that the licensing of these models does not constitute legal advice. We are not held responsible for the actions of third parties who utilize our models.
## Contact
If you are interested in customized LLMs for business applications, please get in contact with us via our website or contact us at [Dr. Daryoush Vaziri](mailto:[email protected]). We are also grateful for your feedback and suggestions.
## Collaborations
We are also keenly seeking support and investment for our startup, VAGO solutions, where we continuously advance the development of robust language models designed to address a diverse range of purposes and requirements. If the prospect of collaboratively navigating future challenges excites you, we warmly invite you to reach out to us.
## Acknowledgement
Many thanks to [argilla](https://huggingface.co/datasets/argilla) and [Huggingface](https://huggingface.co) for providing such valuable datasets to the Open-Source community. And of course a big thanks to [upstage](https://huggingface.co/upstage) for providing the open source community with their latest technology!
|
yijisuk/segformer-b0-finetuned-segments-ic-chip-sample | yijisuk | 2023-12-23T10:22:16Z | 4 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"base_model:nvidia/mit-b0",
"base_model:finetune:nvidia/mit-b0",
"license:other",
"endpoints_compatible",
"region:us"
] | image-segmentation | 2023-12-23T10:02:27Z | ---
license: other
base_model: nvidia/mit-b0
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-ic-chip-sample
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-segments-ic-chip-sample
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the yijisuk/ic-chip-sample dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0772
- Mean Iou: 0.4863
- Mean Accuracy: 0.9725
- Overall Accuracy: 0.9725
- Accuracy Unlabeled: nan
- Accuracy Circuit: 0.9725
- Iou Unlabeled: 0.0
- Iou Circuit: 0.9725
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
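These settings roughly correspond to the following `TrainingArguments` (a reconstruction for illustration, not the original training script; `output_dir` is an assumption):
```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-ic-chip-sample",
    learning_rate=6e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```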
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
| 0.0885 | 1.0 | 20 | 0.0858 | 0.4907 | 0.9815 | 0.9815 | nan | 0.9815 | 0.0 | 0.9815 |
| 0.1764 | 2.0 | 40 | 0.0854 | 0.4734 | 0.9469 | 0.9469 | nan | 0.9469 | 0.0 | 0.9469 |
| 0.0569 | 3.0 | 60 | 0.0854 | 0.4702 | 0.9404 | 0.9404 | nan | 0.9404 | 0.0 | 0.9404 |
| 0.0959 | 4.0 | 80 | 0.0851 | 0.4893 | 0.9786 | 0.9786 | nan | 0.9786 | 0.0 | 0.9786 |
| 0.2969 | 5.0 | 100 | 0.0825 | 0.4863 | 0.9727 | 0.9727 | nan | 0.9727 | 0.0 | 0.9727 |
| 0.1979 | 6.0 | 120 | 0.0824 | 0.4873 | 0.9746 | 0.9746 | nan | 0.9746 | 0.0 | 0.9746 |
| 0.0906 | 7.0 | 140 | 0.0824 | 0.4740 | 0.9480 | 0.9480 | nan | 0.9480 | 0.0 | 0.9480 |
| 0.2879 | 8.0 | 160 | 0.0821 | 0.4882 | 0.9764 | 0.9764 | nan | 0.9764 | 0.0 | 0.9764 |
| 0.1366 | 9.0 | 180 | 0.0807 | 0.4833 | 0.9666 | 0.9666 | nan | 0.9666 | 0.0 | 0.9666 |
| 0.1664 | 10.0 | 200 | 0.0813 | 0.4860 | 0.9720 | 0.9720 | nan | 0.9720 | 0.0 | 0.9720 |
| 0.1521 | 11.0 | 220 | 0.0831 | 0.4830 | 0.9660 | 0.9660 | nan | 0.9660 | 0.0 | 0.9660 |
| 0.2004 | 12.0 | 240 | 0.0795 | 0.4825 | 0.9651 | 0.9651 | nan | 0.9651 | 0.0 | 0.9651 |
| 0.1547 | 13.0 | 260 | 0.0793 | 0.4812 | 0.9625 | 0.9625 | nan | 0.9625 | 0.0 | 0.9625 |
| 0.4191 | 14.0 | 280 | 0.0788 | 0.4830 | 0.9659 | 0.9659 | nan | 0.9659 | 0.0 | 0.9659 |
| 0.0431 | 15.0 | 300 | 0.0782 | 0.4815 | 0.9630 | 0.9630 | nan | 0.9630 | 0.0 | 0.9630 |
| 1.3911 | 16.0 | 320 | 0.0793 | 0.4820 | 0.9640 | 0.9640 | nan | 0.9640 | 0.0 | 0.9640 |
| 0.0217 | 17.0 | 340 | 0.0814 | 0.4836 | 0.9671 | 0.9671 | nan | 0.9671 | 0.0 | 0.9671 |
| 0.1116 | 18.0 | 360 | 0.0789 | 0.4839 | 0.9678 | 0.9678 | nan | 0.9678 | 0.0 | 0.9678 |
| 0.3295 | 19.0 | 380 | 0.0791 | 0.4763 | 0.9526 | 0.9526 | nan | 0.9526 | 0.0 | 0.9526 |
| 0.0327 | 20.0 | 400 | 0.0792 | 0.4829 | 0.9658 | 0.9658 | nan | 0.9658 | 0.0 | 0.9658 |
| 0.2542 | 21.0 | 420 | 0.0787 | 0.4861 | 0.9722 | 0.9722 | nan | 0.9722 | 0.0 | 0.9722 |
| 0.1587 | 22.0 | 440 | 0.0783 | 0.4772 | 0.9543 | 0.9543 | nan | 0.9543 | 0.0 | 0.9543 |
| 0.2721 | 23.0 | 460 | 0.0804 | 0.4913 | 0.9825 | 0.9825 | nan | 0.9825 | 0.0 | 0.9825 |
| 0.0505 | 24.0 | 480 | 0.0781 | 0.4827 | 0.9655 | 0.9655 | nan | 0.9655 | 0.0 | 0.9655 |
| 0.1417 | 25.0 | 500 | 0.0801 | 0.4834 | 0.9669 | 0.9669 | nan | 0.9669 | 0.0 | 0.9669 |
| 0.1371 | 26.0 | 520 | 0.0777 | 0.4838 | 0.9676 | 0.9676 | nan | 0.9676 | 0.0 | 0.9676 |
| 0.1282 | 27.0 | 540 | 0.0773 | 0.4807 | 0.9613 | 0.9613 | nan | 0.9613 | 0.0 | 0.9613 |
| 0.057 | 28.0 | 560 | 0.0772 | 0.4829 | 0.9657 | 0.9657 | nan | 0.9657 | 0.0 | 0.9657 |
| 0.2592 | 29.0 | 580 | 0.0807 | 0.4872 | 0.9744 | 0.9744 | nan | 0.9744 | 0.0 | 0.9744 |
| 0.1687 | 30.0 | 600 | 0.0794 | 0.4825 | 0.9649 | 0.9649 | nan | 0.9649 | 0.0 | 0.9649 |
| 0.499 | 31.0 | 620 | 0.0805 | 0.4853 | 0.9706 | 0.9706 | nan | 0.9706 | 0.0 | 0.9706 |
| 0.1584 | 32.0 | 640 | 0.0790 | 0.4845 | 0.9691 | 0.9691 | nan | 0.9691 | 0.0 | 0.9691 |
| 0.0689 | 33.0 | 660 | 0.0785 | 0.4845 | 0.9690 | 0.9690 | nan | 0.9690 | 0.0 | 0.9690 |
| 1.3764 | 34.0 | 680 | 0.0790 | 0.4848 | 0.9696 | 0.9696 | nan | 0.9696 | 0.0 | 0.9696 |
| 0.2597 | 35.0 | 700 | 0.0808 | 0.4875 | 0.9751 | 0.9751 | nan | 0.9751 | 0.0 | 0.9751 |
| 1.0757 | 36.0 | 720 | 0.0761 | 0.4841 | 0.9681 | 0.9681 | nan | 0.9681 | 0.0 | 0.9681 |
| 0.6112 | 37.0 | 740 | 0.0779 | 0.4825 | 0.9650 | 0.9650 | nan | 0.9650 | 0.0 | 0.9650 |
| 0.2899 | 38.0 | 760 | 0.0787 | 0.4796 | 0.9591 | 0.9591 | nan | 0.9591 | 0.0 | 0.9591 |
| 0.3402 | 39.0 | 780 | 0.0777 | 0.4838 | 0.9676 | 0.9676 | nan | 0.9676 | 0.0 | 0.9676 |
| 0.0183 | 40.0 | 800 | 0.0771 | 0.4829 | 0.9657 | 0.9657 | nan | 0.9657 | 0.0 | 0.9657 |
| 0.1407 | 41.0 | 820 | 0.0774 | 0.4809 | 0.9617 | 0.9617 | nan | 0.9617 | 0.0 | 0.9617 |
| 0.4045 | 42.0 | 840 | 0.0767 | 0.4819 | 0.9638 | 0.9638 | nan | 0.9638 | 0.0 | 0.9638 |
| 0.2159 | 43.0 | 860 | 0.0780 | 0.4850 | 0.9699 | 0.9699 | nan | 0.9699 | 0.0 | 0.9699 |
| 0.0541 | 44.0 | 880 | 0.0768 | 0.4812 | 0.9624 | 0.9624 | nan | 0.9624 | 0.0 | 0.9624 |
| 0.0638 | 45.0 | 900 | 0.0774 | 0.4863 | 0.9726 | 0.9726 | nan | 0.9726 | 0.0 | 0.9726 |
| 0.0409 | 46.0 | 920 | 0.0788 | 0.4875 | 0.9749 | 0.9749 | nan | 0.9749 | 0.0 | 0.9749 |
| 0.1662 | 47.0 | 940 | 0.0774 | 0.4871 | 0.9743 | 0.9743 | nan | 0.9743 | 0.0 | 0.9743 |
| 0.1636 | 48.0 | 960 | 0.0783 | 0.4860 | 0.9720 | 0.9720 | nan | 0.9720 | 0.0 | 0.9720 |
| 0.033 | 49.0 | 980 | 0.0791 | 0.4882 | 0.9764 | 0.9764 | nan | 0.9764 | 0.0 | 0.9764 |
| 0.171 | 50.0 | 1000 | 0.0772 | 0.4863 | 0.9725 | 0.9725 | nan | 0.9725 | 0.0 | 0.9725 |
### Framework versions
- Transformers 4.36.2
- Pytorch 1.11.0+cu115
- Datasets 2.15.0
- Tokenizers 0.15.0
|
TheBloke/firefly-mixtral-8x7b-GGUF | TheBloke | 2023-12-23T10:14:57Z | 225 | 10 | transformers | [
"transformers",
"gguf",
"mixtral",
"en",
"base_model:YeungNLP/firefly-mixtral-8x7b",
"base_model:quantized:YeungNLP/firefly-mixtral-8x7b",
"license:apache-2.0",
"region:us"
] | null | 2023-12-19T10:04:23Z | ---
base_model: YeungNLP/firefly-mixtral-8x7b
inference: false
language:
- en
license: apache-2.0
model_creator: YeungNLP
model_name: Firefly Mixtral 8X7B
model_type: mixtral
prompt_template: '{prompt}
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Firefly Mixtral 8X7B - GGUF
- Model creator: [YeungNLP](https://huggingface.co/YeungNLP)
- Original model: [Firefly Mixtral 8X7B](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b)
<!-- description start -->
## Description
This repo contains GGUF format model files for [YeungNLP's Firefly Mixtral 8X7B](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note: as of the time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF)
* [YeungNLP's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: None
```
{prompt}
```
<!-- prompt-template end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [firefly-mixtral-8x7b.Q2_K.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q2_K.gguf) | Q2_K | 2 | 15.64 GB| 18.14 GB | smallest, significant quality loss - not recommended for most purposes |
| [firefly-mixtral-8x7b.Q3_K_M.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q3_K_M.gguf) | Q3_K_M | 3 | 20.36 GB| 22.86 GB | very small, high quality loss |
| [firefly-mixtral-8x7b.Q4_0.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q4_0.gguf) | Q4_0 | 4 | 26.44 GB| 28.94 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [firefly-mixtral-8x7b.Q4_K_M.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q4_K_M.gguf) | Q4_K_M | 4 | 26.44 GB| 28.94 GB | medium, balanced quality - recommended |
| [firefly-mixtral-8x7b.Q5_0.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q5_0.gguf) | Q5_0 | 5 | 32.23 GB| 34.73 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [firefly-mixtral-8x7b.Q5_K_M.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q5_K_M.gguf) | Q5_K_M | 5 | 32.23 GB| 34.73 GB | large, very low quality loss - recommended |
| [firefly-mixtral-8x7b.Q6_K.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q6_K.gguf) | Q6_K | 6 | 38.38 GB| 40.88 GB | very large, extremely low quality loss |
| [firefly-mixtral-8x7b.Q8_0.gguf](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF/blob/main/firefly-mixtral-8x7b.Q8_0.gguf) | Q8_0 | 8 | 49.63 GB| 52.13 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
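As a rough sanity check, file size scales with bits per weight. A back-of-the-envelope sketch, assuming roughly 46.7B total parameters for Mixtral 8x7B and ignoring GGUF metadata and the mixed-type "_M" variants:
```python
# Rough GGUF size estimate: parameters * bits-per-weight / 8 bits per byte.
# The 46.7e9 parameter count is an assumption; "_M" variants mix quant types,
# so the real files above run slightly larger than these pure-type estimates.
params = 46.7e9
for name, bpw in [("Q2_K", 2.5625), ("Q3_K", 3.4375), ("Q4_K", 4.5),
                  ("Q5_K", 5.5), ("Q6_K", 6.5625)]:
    print(f"{name}: ~{params * bpw / 8 / 1e9:.1f} GB")
# Q4_K comes out around 26 GB, in line with the 26.44 GB Q4_K_M file listed above.
```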
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/firefly-mixtral-8x7b-GGUF and below it, a specific filename to download, such as: firefly-mixtral-8x7b.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/firefly-mixtral-8x7b-GGUF firefly-mixtral-8x7b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/firefly-mixtral-8x7b-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/firefly-mixtral-8x7b-GGUF firefly-mixtral-8x7b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m firefly-mixtral-8x7b.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "{prompt}"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 32768` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 โ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; e.g. for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./firefly-mixtral-8x7b.Q4_K_M.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"{prompt}", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./firefly-mixtral-8x7b.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
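As a quick orientation, a minimal LangChain sketch for this model (import paths differ between LangChain versions; `langchain_community` is assumed here):
```python
from langchain_community.llms import LlamaCpp

# Assumes the GGUF file was downloaded as shown earlier; set n_gpu_layers=0 for CPU-only use.
llm = LlamaCpp(
    model_path="./firefly-mixtral-8x7b.Q4_K_M.gguf",
    n_ctx=32768,
    n_gpu_layers=35,
    temperature=0.7,
)
print(llm.invoke("Tell me about AI"))
```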
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, ้ฟๆ, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjรคreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: YeungNLP's Firefly Mixtral 8X7B
This model is fine-tuned from "mistralai/Mixtral-8x7B-v0.1" with Firefly.
## Run the model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
model_name_or_path = 'YeungNLP/firefly-mixtral-8x7b'
max_new_tokens = 500
top_p = 0.9
temperature = 0.35
repetition_penalty = 1.0
model = AutoModelForCausalLM.from_pretrained(
model_name_or_path,
trust_remote_code=True,
low_cpu_mem_usage=True,
torch_dtype=torch.float16,
device_map='auto'
)
model = model.eval()
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
text = "Compose an engaging travel blog post about a recent trip to Hawaii, highlighting cultural experiences and must-see attractions."
inst_begin_tokens = tokenizer.encode('[INST]', add_special_tokens=False)
inst_end_tokens = tokenizer.encode('[/INST]', add_special_tokens=False)
human_tokens = tokenizer.encode(text, add_special_tokens=False)
input_ids = [tokenizer.bos_token_id] + inst_begin_tokens + human_tokens + inst_end_tokens
# input_ids = human_tokens
input_ids = torch.tensor([input_ids], dtype=torch.long).cuda()
with torch.no_grad():
    outputs = model.generate(
        input_ids=input_ids, max_new_tokens=max_new_tokens, do_sample=True,
        top_p=top_p, temperature=temperature, repetition_penalty=repetition_penalty,
        eos_token_id=tokenizer.eos_token_id
    )
outputs = outputs.tolist()[0][len(input_ids[0]):]
response = tokenizer.decode(outputs)
response = response.strip().replace(tokenizer.eos_token, "").strip()
print("Chatbot๏ผ{}".format(response))
```
<!-- original-model-card end -->
|
TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF | TheBloke | 2023-12-23T10:05:06Z | 148 | 2 | transformers | [
"transformers",
"gguf",
"solar",
"finetune",
"dpo",
"Instruct",
"augmentation",
"german",
"text-generation",
"en",
"de",
"dataset:argilla/distilabel-math-preference-dpo",
"base_model:VAGOsolutions/SauerkrautLM-SOLAR-Instruct",
"base_model:quantized:VAGOsolutions/SauerkrautLM-SOLAR-Instruct",
"license:cc-by-nc-4.0",
"region:us",
"conversational"
] | text-generation | 2023-12-23T09:56:28Z | ---
base_model: VAGOsolutions/SauerkrautLM-SOLAR-Instruct
datasets:
- argilla/distilabel-math-preference-dpo
inference: false
language:
- en
- de
library_name: transformers
license: cc-by-nc-4.0
model_creator: VAGO solutions
model_name: SauerkrautLM SOLAR Instruct
model_type: solar
pipeline_tag: text-generation
prompt_template: '### User:
{prompt}
### Assistant:
'
quantized_by: TheBloke
tags:
- finetune
- dpo
- Instruct
- augmentation
- german
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# SauerkrautLM SOLAR Instruct - GGUF
- Model creator: [VAGO solutions](https://huggingface.co/VAGOsolutions)
- Original model: [SauerkrautLM SOLAR Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)
<!-- description start -->
## Description
This repo contains GGUF format model files for [VAGO solutions's SauerkrautLM SOLAR Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note: as of the time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF)
* [VAGO solutions's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: User-Assistant-Newlines
```
### User:
{prompt}
### Assistant:
```
<!-- prompt-template end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weight. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This end up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [sauerkrautlm-solar-instruct.Q2_K.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q2_K.gguf) | Q2_K | 2 | 4.55 GB| 7.05 GB | smallest, significant quality loss - not recommended for most purposes |
| [sauerkrautlm-solar-instruct.Q3_K_S.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q3_K_S.gguf) | Q3_K_S | 3 | 4.66 GB| 7.16 GB | very small, high quality loss |
| [sauerkrautlm-solar-instruct.Q3_K_M.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q3_K_M.gguf) | Q3_K_M | 3 | 5.19 GB| 7.69 GB | very small, high quality loss |
| [sauerkrautlm-solar-instruct.Q3_K_L.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q3_K_L.gguf) | Q3_K_L | 3 | 5.65 GB| 8.15 GB | small, substantial quality loss |
| [sauerkrautlm-solar-instruct.Q4_0.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q4_0.gguf) | Q4_0 | 4 | 6.07 GB| 8.57 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [sauerkrautlm-solar-instruct.Q4_K_S.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q4_K_S.gguf) | Q4_K_S | 4 | 6.10 GB| 8.60 GB | small, greater quality loss |
| [sauerkrautlm-solar-instruct.Q4_K_M.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q4_K_M.gguf) | Q4_K_M | 4 | 6.46 GB| 8.96 GB | medium, balanced quality - recommended |
| [sauerkrautlm-solar-instruct.Q5_0.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q5_0.gguf) | Q5_0 | 5 | 7.40 GB| 9.90 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [sauerkrautlm-solar-instruct.Q5_K_S.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q5_K_S.gguf) | Q5_K_S | 5 | 7.40 GB| 9.90 GB | large, low quality loss - recommended |
| [sauerkrautlm-solar-instruct.Q5_K_M.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q5_K_M.gguf) | Q5_K_M | 5 | 7.60 GB| 10.10 GB | large, very low quality loss - recommended |
| [sauerkrautlm-solar-instruct.Q6_K.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q6_K.gguf) | Q6_K | 6 | 8.81 GB| 11.31 GB | very large, extremely low quality loss |
| [sauerkrautlm-solar-instruct.Q8_0.gguf](https://huggingface.co/TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF/blob/main/sauerkrautlm-solar-instruct.Q8_0.gguf) | Q8_0 | 8 | 11.40 GB| 13.90 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF and below it, a specific filename to download, such as: sauerkrautlm-solar-instruct.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF sauerkrautlm-solar-instruct.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
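If you would rather stay in Python than use the CLI, the same single-file download can be done with the `huggingface_hub` library (a minimal sketch using the repo and filename from the command above):
```python
from huggingface_hub import hf_hub_download

# Download one GGUF file into the current directory, mirroring the CLI call above.
hf_hub_download(
    repo_id="TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF",
    filename="sauerkrautlm-solar-instruct.Q4_K_M.gguf",
    local_dir=".",
    local_dir_use_symlinks=False,
)
```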
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/SauerkrautLM-SOLAR-Instruct-GGUF sauerkrautlm-solar-instruct.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m sauerkrautlm-solar-instruct.Q4_K_M.gguf --color -c 8192 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "### User:\n{prompt}\n\n### Assistant:"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 8192` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBlast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; eg for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./sauerkrautlm-solar-instruct.Q4_K_M.gguf", # Download the model file first
n_ctx=8192, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"### User:\n{prompt}\n\n### Assistant:", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./sauerkrautlm-solar-instruct.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain; a minimal llama-cpp-python sketch follows the links:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
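As a quick orientation, a minimal LangChain + llama-cpp-python sketch for this GGUF file might look like the following (the exact import path can vary between LangChain versions, so treat this as an illustration rather than a drop-in snippet):
```python
from langchain.llms import LlamaCpp  # newer releases: from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./sauerkrautlm-solar-instruct.Q4_K_M.gguf",  # downloaded as shown above
    n_gpu_layers=35,   # set to 0 if you have no GPU acceleration
    n_ctx=8192,
    temperature=0.7,
)

print(llm("### User:\nWrite a short poem about llamas.\n\n### Assistant:"))
```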
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, 阿明, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjäreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: VAGO solutions's SauerkrautLM SOLAR Instruct

## VAGO solutions SauerkrautLM-SOLAR-Instruct
Introducing **SauerkrautLM-SOLAR-Instruct** – our Sauerkraut version of the powerful [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0)!
Aligned with **DPO**
# Table of Contents
1. [Overview of all SauerkrautLM-SOLAR-Instruct models](#all-sauerkrautlm-solar-instruct-models)
2. [Model Details](#model-details)
- [Prompt template](#prompt-template)
- [Training Dataset](#training-dataset)
- [Data Contamination Test](#data-contamination-test-results)
3. [Evaluation](#evaluation)
4. [Disclaimer](#disclaimer)
5. [Contact](#contact)
6. [Collaborations](#collaborations)
7. [Acknowledgement](#acknowledgement)
## All SauerkrautLM-SOLAR-Instruct Models
| Model | HF | GPTQ | GGUF | AWQ |
|-------|-------|-------|-------|-------|
| SauerkrautLM-SOLAR-Instruct | [Link](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct/) | coming soon | coming soon | coming soon |
## Model Details
**SauerkrautLM-SOLAR-Instruct**
- **Model Type:** SauerkrautLM-SOLAR-Instruct is a finetuned Model based on [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0)
- **Language(s):** English, German
- **License:** cc-by-nc-4.0
- **Contact:** [Website](https://vago-solutions.de/#Kontakt) [David Golchinfar](mailto:[email protected])
### Training Dataset:
SauerkrautLM-SOLAR-Instruct was trained with a mix of German data augmentation and translated data.
It was aligned through **DPO** with our **new German SauerkrautLM-DPO dataset**, which uses parts of the SFT SauerkrautLM dataset
as chosen answers and responses from [Sauerkraut-7b-HerO](https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-HerO) as rejected answers. We additionally added **translated parts of [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized)** (our dataset does not contain any TruthfulQA prompts - see the Data Contamination Test Results) and **[argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo)**.
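For readers unfamiliar with DPO-style preference data, a single record in such a dataset has roughly the following shape (an illustrative sketch only, not an actual sample from the SauerkrautLM-DPO dataset):
```python
# Illustrative shape of one DPO preference pair; the field values are placeholders.
dpo_record = {
    "prompt": "<user instruction>",
    "chosen": "<preferred answer, taken from the SFT SauerkrautLM data>",
    "rejected": "<dispreferred answer, generated by Sauerkraut-7b-HerO>",
}
```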
We found that a simple translation of training data alone can lead to unnatural German phrasings.
Data augmentation techniques were therefore used to ensure grammatical and syntactical correctness and a more natural German wording in our training data.
This improved the model's German language skills. Nevertheless, certain formulations may occur that are not entirely correct.
### Data Contamination Test Results
Some models on the HuggingFace leaderboard had problems with wrong data getting mixed in.
We checked our SauerkrautLM-DPO dataset with a special test [1] on this model as target model and upstage/SOLAR-10.7B-Instruct-v1.0 as reference model.
The HuggingFace team used the same methods [2, 3].
Our results, with `result < 0.1, %:` values well below 0.9, indicate that our dataset is free from contamination.
*The data contamination test results for HellaSwag and Winogrande will be added once [1] supports them.*
| Dataset | ARC | MMLU | TruthfulQA | GSM8K |
|------------------------------|-------|-------|-------|-------|
| **SauerkrautLM-DPO**| result < 0.1, %: 0.0 |result < 0.1, %: 0.09 | result < 0.1, %: 0.13 | result < 0.1, %: 0.16 |
[1] https://github.com/swj0419/detect-pretrain-code-contamination
[2] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/474#657f2245365456e362412a06
[3] https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/265#657b6debf81f6b44b8966230
### Prompt Template:
```
### User:
Hallo, wie geht es dir?
### Assistant:
Hallo! Es freut mich, dass du mit mir kommunizierst. Ich bin hier, um zu helfen und deine Anfragen zu erfรผllen. Du fragst, wie ich mich fรผhle. Als kรผnstliche Intelligenz habe ich keine eigentlichen Emotionen im Sinne eines Menschen, aber ich funktioniere optimal und bin bereit, Dienste anzubieten.
Wie geht es dir momentan? Kรถnnen wir zusammen etwas interessantes oder hilfreiches erledigen?
```
*Prompt example at temperature 0.5*
```
### User:
Hello, how are you?
### Assistant:
Hi there! I am an AI language model, so I don't have personal feelings or emotions in the traditional sense. However, I can assure you that my systems and processes are functioning well at this moment, allowing me to provide helpful responses for your queries.
How may I assist you today?
```
*Prompt example at temperature 0.5*
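Programmatically, the template shown above can be assembled like this (a small sketch; stop-token handling depends on your inference stack):
```python
def build_prompt(user_message: str) -> str:
    # Matches the "### User:" / "### Assistant:" template shown above.
    return f"### User:\n{user_message}\n\n### Assistant:\n"

print(build_prompt("Hello, how are you?"))
```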
## Evaluation
| Metric | Value |
|-----------------------|---------------------------|
| Avg. | 74.21 |
| ARC (25-shot) | 70.82 |
| HellaSwag (10-shot) | 88.63 |
| MMLU (5-shot)         | 66.2 |
| TruthfulQA (0-shot) | 71.95 |
| Winogrande (5-shot) | 83.5 |
| GSM8K (5-shot) | 64.14 |
## Disclaimer
We must inform users that despite our best efforts in data cleansing, the possibility of uncensored content slipping through cannot be entirely ruled out.
However, we cannot guarantee consistently appropriate behavior. Therefore, if you encounter any issues or come across inappropriate content, we kindly request that you inform us through the contact information provided.
Additionally, it is essential to understand that the licensing of these models does not constitute legal advice. We are not held responsible for the actions of third parties who utilize our models.
## Contact
If you are interested in customized LLMs for business applications, please get in contact with us via our website or contact us at [Dr. Daryoush Vaziri](mailto:[email protected]). We are also grateful for your feedback and suggestions.
## Collaborations
We are also keenly seeking support and investment for our startup, VAGO solutions, where we continuously advance the development of robust language models designed to address a diverse range of purposes and requirements. If the prospect of collaboratively navigating future challenges excites you, we warmly invite you to reach out to us.
## Acknowledgement
Many thanks to [argilla](https://huggingface.co/datasets/argilla) and [Huggingface](https://huggingface.co) for providing such valuable datasets to the Open-Source community. And of course a big thanks to [upstage](https://huggingface.co/upstage) for providing the open source community with their latest technology!
<!-- original-model-card end -->
|
Ragzz258/Falcon7B-prompt-to-html | Ragzz258 | 2023-12-23T10:04:24Z | 1 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:vilsonrodrigues/falcon-7b-instruct-sharded",
"base_model:adapter:vilsonrodrigues/falcon-7b-instruct-sharded",
"region:us"
] | null | 2023-12-23T10:04:06Z | ---
library_name: peft
base_model: vilsonrodrigues/falcon-7b-instruct-sharded
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.2.dev0 |
UMCU/RobBERT_NegationDetection_32xTokenWindow | UMCU | 2023-12-23T10:03:26Z | 8 | 1 | transformers | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"token-classification",
"nl",
"arxiv:2209.00470",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2022-09-08T17:49:03Z | ---
language: nl
license: mit
---
# MedRoBERTa.nl finetuned for negation
## Description
This model is a finetuned version of RobBERT, a RoBERTa-based model pre-trained on the Dutch section of OSCAR. All code used for the creation of RobBERT can be found at https://github.com/iPieter/RobBERT. The publication associated with the negation detection task can be found at https://arxiv.org/abs/2209.00470. The code for finetuning the model can be found at https://github.com/umcu/negation-detection.
## Intended use
The model is finetuned for negation detection on Dutch clinical text. Since it is a domain-specific model trained on medical data, it is meant to be used on medical NLP tasks for Dutch. This particular model is trained on windows of at most 32 tokens surrounding the concept to be negated. Note that we also trained a biLSTM which can be incorporated in [MedCAT](https://github.com/CogStack/MedCAT).
## Minimal example
```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer\
    .from_pretrained("UMCU/MedRoBERTa.nl_NegationDetection")
model = AutoModelForTokenClassification\
    .from_pretrained("UMCU/MedRoBERTa.nl_NegationDetection")

some_text = "De patient was niet aanspreekbaar en hij zag er grauw uit. \
Hij heeft de inspanningstest echter goed doorstaan."

inputs = tokenizer(some_text, return_tensors='pt')
output = model(**inputs)
probas = torch.nn.functional.softmax(output.logits[0], dim=-1).detach().numpy()

# map the probabilities back to the input tokens
input_tokens = tokenizer.convert_ids_to_tokens(inputs['input_ids'][0])
target_map = {0: 'B-Negated', 1: 'B-NotNegated', 2: 'I-Negated', 3: 'I-NotNegated'}
results = [{'token': input_tokens[idx],
            'proba_negated': proba_arr[0] + proba_arr[2],
            'proba_not_negated': proba_arr[1] + proba_arr[3]
            }
           for idx, proba_arr in enumerate(probas)]
```
It is perhaps good to note that we assume the [Inside-Outside-Beginning](https://en.wikipedia.org/wiki/Inside%E2%80%93outside%E2%80%93beginning_(tagging)) format.
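Building on the minimal example above, the per-token probabilities can be collapsed into discrete IOB labels as follows (a sketch added for illustration; it is not part of the original example):
```python
# Pick the most likely of the four IOB classes for every token.
predicted_labels = [
    {"token": input_tokens[idx], "label": target_map[int(proba_arr.argmax())]}
    for idx, proba_arr in enumerate(probas)
]
for item in predicted_labels:
    print(item["token"], item["label"])
```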
## Data
The pre-trained model was trained on the Dutch section of OSCAR (about 39GB), and is described here: http://dx.doi.org/10.18653/v1/2020.findings-emnlp.292.
## Authors
RobBERT: Pieter Delobelle, Thomas Winters, Bettina Berendt,
Finetuning: Bram van Es, Sebastiaan Arends.
## Contact
If you are having problems with this model please add an issue on our git: https://github.com/umcu/negation-detection/issues
## Usage
If you use the model in your work, please use the following references:
(model) https://doi.org/10.5281/zenodo.6980076 and (paper) https://doi.org/10.1186/s12859-022-05130-x
## References
Paper: Pieter Delobelle, Thomas Winters, Bettina Berendt (2020), RobBERT: a Dutch RoBERTa-based Language Model, Findings of the Association for Computational Linguistics: EMNLP 2020
Paper: Bram van Es, Leon C. Reteig, Sander C. Tan, Marijn Schraagen, Myrthe M. Hemker, Sebastiaan R.S. Arends, Miguel A.R. Rios, Saskia Haitjema (2022): Negation detection in Dutch clinical texts: an evaluation of rule-based and machine learning methods, Arxiv
|
TheBloke/firefly-mixtral-8x7b-GPTQ | TheBloke | 2023-12-23T10:01:39Z | 24 | 3 | transformers | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"en",
"base_model:YeungNLP/firefly-mixtral-8x7b",
"base_model:quantized:YeungNLP/firefly-mixtral-8x7b",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"gptq",
"region:us"
] | text-generation | 2023-12-19T10:04:23Z | ---
base_model: YeungNLP/firefly-mixtral-8x7b
inference: false
language:
- en
license: apache-2.0
model_creator: YeungNLP
model_name: Firefly Mixtral 8X7B
model_type: mixtral
prompt_template: '[INST] {prompt} [/INST]
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Firefly Mixtral 8X7B - GPTQ
- Model creator: [YeungNLP](https://huggingface.co/YeungNLP)
- Original model: [Firefly Mixtral 8X7B](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b)
<!-- description start -->
# Description
This repo contains GPTQ model files for [YeungNLP's Firefly Mixtral 8X7B](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GGUF)
* [YeungNLP's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/YeungNLP/firefly-mixtral-8x7b)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Mistral
```
[INST] {prompt} [/INST]
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-compatible clients start -->
## Known compatible clients / servers
GPTQ models are currently supported on Linux (NVidia/AMD) and Windows (NVidia only). macOS users: please use GGUF models.
These GPTQ models are known to work in the following inference servers/webuis.
- [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [KoboldAI United](https://github.com/henk717/koboldai)
- [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui)
- [Hugging Face Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
This may not be a complete list; if you know of others, please let me know!
<!-- README_GPTQ.md-compatible clients end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama and Mistral models in 4-bit.
</details>
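For reference, these parameters are recorded per branch in a `quantize_config.json` file that is read automatically at load time. A typical set of values for the `main` branch, shown here as a Python dict with AutoGPTQ's field names, might look roughly like this (illustrative - check the file in each branch for the real values):
```python
# Illustrative quantize_config.json contents (values correspond to the main branch below).
quantize_config = {
    "bits": 4,            # Bits
    "group_size": -1,     # GS; "None" in the table is stored as -1
    "damp_percent": 0.1,  # Damp %
    "desc_act": True,     # Act Order
    "sym": True,
    "true_sequential": True,
}
```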
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/main) | 4 | None | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 23.81 GB | No | 4-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 24.70 GB | No | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 27.42 GB | No | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-3bit--1g-actorder_True](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/gptq-3bit--1g-actorder_True) | 3 | None | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 18.01 GB | No | 3-bit, with Act Order and no group size. Lowest possible VRAM requirements. May be lower quality than 3-bit 128g. |
| [gptq-3bit-128g-actorder_True](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/gptq-3bit-128g-actorder_True) | 3 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 18.85 GB | No | 3-bit, with group size 128g and act-order. Higher quality than 128g-False. |
| [gptq-3bit-32g-actorder_True](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/gptq-3bit-32g-actorder_True) | 3 | 32 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 21.43 GB | No | 3-bit, with group size 32g and act-order. Highest quality 3-bit option. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 47.04 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [VMware Open Instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer/) | 8192 | 48.10 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/firefly-mixtral-8x7b-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/firefly-mixtral-8x7b-GPTQ:gptq-4bit-128g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `firefly-mixtral-8x7b-GPTQ`:
```shell
mkdir firefly-mixtral-8x7b-GPTQ
huggingface-cli download TheBloke/firefly-mixtral-8x7b-GPTQ --local-dir firefly-mixtral-8x7b-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir firefly-mixtral-8x7b-GPTQ
huggingface-cli download TheBloke/firefly-mixtral-8x7b-GPTQ --revision gptq-4bit-128g-actorder_True --local-dir firefly-mixtral-8x7b-GPTQ --local-dir-use-symlinks False
```
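The same downloads can also be scripted from Python with the `huggingface_hub` library (a minimal sketch mirroring the two CLI calls above):
```python
from huggingface_hub import snapshot_download

# Download the main branch into a local folder.
snapshot_download(
    repo_id="TheBloke/firefly-mixtral-8x7b-GPTQ",
    local_dir="firefly-mixtral-8x7b-GPTQ",
    local_dir_use_symlinks=False,
)

# Or fetch a specific quantisation branch via the revision parameter.
snapshot_download(
    repo_id="TheBloke/firefly-mixtral-8x7b-GPTQ",
    revision="gptq-4bit-128g-actorder_True",
    local_dir="firefly-mixtral-8x7b-GPTQ",
    local_dir_use_symlinks=False,
)
```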
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows for interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder and it's harder to know where your disk space is being used, and to clear it up if/when you want to remove a downloaded model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir firefly-mixtral-8x7b-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/firefly-mixtral-8x7b-GPTQ --local-dir firefly-mixtral-8x7b-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-128g-actorder_True https://huggingface.co/TheBloke/firefly-mixtral-8x7b-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/firefly-mixtral-8x7b-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/firefly-mixtral-8x7b-GPTQ:gptq-4bit-128g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `firefly-mixtral-8x7b-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
- Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation** tab and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/firefly-mixtral-8x7b-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''[INST] {prompt} [/INST]
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## Python code example: inference from this GPTQ model
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install --upgrade transformers optimum
# If using PyTorch 2.1 + CUDA 12.x:
pip3 install --upgrade auto-gptq
# or, if using PyTorch 2.1 + CUDA 11.x:
pip3 install --upgrade auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/
```
If you are using PyTorch 2.0, you will need to install AutoGPTQ from source. Likewise if you have problems with the pre-built wheels, you should try building from source:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.5.1
pip3 install .
```
### Example Python code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/firefly-mixtral-8x7b-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-128g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Write a story about llamas"
system_message = "You are a story writing assistant"
prompt_template=f'''[INST] {prompt} [/INST]
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with Transformers. For non-Mistral models, AutoGPTQ can also be used directly.
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama architecture models (including Mistral, Yi, DeepSeek, SOLAR, etc) in 4-bit. Please see the Provided Files table above for per-file compatibility.
For a list of clients/servers, please see "Known compatible clients / servers", above.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Michael Levine, 阿明, Trailburnt, Nikolai Manek, John Detwiler, Randy H, Will Dee, Sebastain Graf, NimbleBox.ai, Eugene Pentland, Emad Mostaque, Ai Maven, Jim Angel, Jeff Scroggin, Michael Davis, Manuel Alberto Morcote, Stephen Murray, Robert, Justin Joy, Luke @flexchar, Brandon Frisco, Elijah Stavena, S_X, Dan Guido, Undi ., Komninos Chatzipapas, Shadi, theTransient, Lone Striker, Raven Klaugh, jjj, Cap'n Zoog, Michel-Marie MAUDET (LINAGORA), Matthew Berman, David, Fen Risland, Omer Bin Jawed, Luke Pendergrass, Kalila, OG, Erik Bjäreholt, Rooh Singh, Joseph William Delisle, Dan Lewis, TL, John Villwock, AzureBlack, Brad, Pedro Madruga, Caitlyn Gatomon, K, jinyuan sun, Mano Prime, Alex, Jeffrey Morgan, Alicia Loh, Illia Dulskyi, Chadd, transmissions 11, fincy, Rainer Wilmers, ReadyPlayerEmma, knownsqashed, Mandus, biorpg, Deo Leter, Brandon Phillips, SuperWojo, Sean Connelly, Iucharbius, Jack West, Harry Royden McLaughlin, Nicholas, terasurfer, Vitor Caleffi, Duane Dunston, Johann-Peter Hartmann, David Ziegler, Olakabola, Ken Nordquist, Trenton Dambrowitz, Tom X Nguyen, Vadim, Ajan Kanaga, Leonard Tan, Clay Pascal, Alexandros Triantafyllidis, JM33133, Xule, vamX, ya boyyy, subjectnull, Talal Aujan, Alps Aficionado, wassieverse, Ari Malik, James Bentley, Woland, Spencer Kim, Michael Dempsey, Fred von Graf, Elle, zynix, William Richards, Stanislav Ovsiannikov, Edmond Seymore, Jonathan Leane, Martin Kemka, usrbinkat, Enrico Ros
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: YeungNLP's Firefly Mixtral 8X7B
This model is finetuned from "mistralai/Mixtral-8x7B-v0.1" using Firefly.
## Run the model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
model_name_or_path = 'YeungNLP/firefly-mixtral-8x7b'
max_new_tokens = 500
top_p = 0.9
temperature = 0.35
repetition_penalty = 1.0
model = AutoModelForCausalLM.from_pretrained(
model_name_or_path,
trust_remote_code=True,
low_cpu_mem_usage=True,
torch_dtype=torch.float16,
device_map='auto'
)
model = model.eval()
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
text = "Compose an engaging travel blog post about a recent trip to Hawaii, highlighting cultural experiences and must-see attractions."
inst_begin_tokens = tokenizer.encode('[INST]', add_special_tokens=False)
inst_end_tokens = tokenizer.encode('[/INST]', add_special_tokens=False)
human_tokens = tokenizer.encode(text, add_special_tokens=False)
input_ids = [tokenizer.bos_token_id] + inst_begin_tokens + human_tokens + inst_end_tokens
# input_ids = human_tokens
input_ids = torch.tensor([input_ids], dtype=torch.long).cuda()
with torch.no_grad():
outputs = model.generate(
input_ids=input_ids, max_new_tokens=max_new_tokens, do_sample=True,
top_p=top_p, temperature=temperature, repetition_penalty=repetition_penalty,
eos_token_id=tokenizer.eos_token_id
)
outputs = outputs.tolist()[0][len(input_ids[0]):]
response = tokenizer.decode(outputs)
response = response.strip().replace(tokenizer.eos_token, "").strip()
print("Chatbot๏ผ{}".format(response))
```
|
halilozturkci/Llama-2-7b-chat-hf-fine-tuned-adapters | halilozturkci | 2023-12-23T09:52:38Z | 0 | 0 | peft | [
"peft",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"base_model:adapter:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | 2023-12-23T09:43:19Z | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.2.dev0 |
ICM1/ICM1 | ICM1 | 2023-12-23T09:51:57Z | 0 | 0 | null | [
"en",
"hi",
"bn",
"region:us"
] | null | 2023-12-23T09:48:46Z | ---
language:
- en
- hi
- bn
--- |
lspahija/ppo-LunarLander-v2 | lspahija | 2023-12-23T09:35:04Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-12-23T09:34:46Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 266.71 +/- 19.41
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
Nour17/speecht5_finetuned_Andrew_NG_small | Nour17 | 2023-12-23T09:27:02Z | 3 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"speecht5",
"text-to-audio",
"generated_from_trainer",
"base_model:microsoft/speecht5_tts",
"base_model:finetune:microsoft/speecht5_tts",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-to-audio | 2023-12-23T08:56:04Z | ---
license: mit
base_model: microsoft/speecht5_tts
tags:
- generated_from_trainer
model-index:
- name: speecht5_finetuned_Andrew_NG_small
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# speecht5_finetuned_Andrew_NG_small
This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4969
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 500
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.5266 | 35.4 | 500 | 0.4969 |
### Framework versions
- Transformers 4.35.0
- Pytorch 2.0.0
- Datasets 2.16.0
- Tokenizers 0.14.1
|
MaksKhramtsov/bert-base-cased-finetuned-wikitext2 | MaksKhramtsov | 2023-12-23T09:17:31Z | 3 | 0 | transformers | [
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2023-12-23T08:55:33Z | ---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: MaksKhramtsov/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# MaksKhramtsov/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9564
- Validation Loss: 6.9197
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4237 | 7.0224 | 0 |
| 6.9564 | 6.9197 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
urbija/llama-fine-tuned-i | urbija | 2023-12-23T09:14:29Z | 0 | 0 | peft | [
"peft",
"region:us"
] | null | 2023-12-23T09:14:21Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float32
### Framework versions
- PEFT 0.4.0
|
olga-mi-2002/bert-base-cased-finetuned-wikitext2 | olga-mi-2002 | 2023-12-23T08:54:50Z | 3 | 0 | transformers | [
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2023-12-23T08:32:43Z | ---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: olga-mi-2002/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# olga-mi-2002/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9575
- Validation Loss: 6.8966
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4333 | 7.0572 | 0 |
| 6.9575 | 6.8966 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
carles-undergrad-thesis/st-indobert-mmarco-inbatch | carles-undergrad-thesis | 2023-12-23T08:54:06Z | 3 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2023-12-23T08:53:16Z | ---
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# carles-undergrad-thesis/st-indobert-mmarco-inbatch
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('carles-undergrad-thesis/st-indobert-mmarco-inbatch')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
def cls_pooling(model_output, attention_mask):
return model_output[0][:,0]
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('carles-undergrad-thesis/st-indobert-mmarco-inbatch')
model = AutoModel.from_pretrained('carles-undergrad-thesis/st-indobert-mmarco-inbatch')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = cls_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=carles-undergrad-thesis/st-indobert-mmarco-inbatch)
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 16649 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 1.0, 'similarity_fct': 'dot_score'}
```
Parameters of the fit()-Method:
```
{
"epochs": 5,
"evaluation_steps": 1000000,
"evaluator": "sentence_transformers.evaluation.SequentialEvaluator.SequentialEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"correct_bias": false,
"eps": 1e-06,
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 8324,
"weight_decay": 0.01
}
```
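A minimal sketch of how these parameters map onto the sentence-transformers training API is shown below; the base checkpoint and the toy training pair are assumptions made for illustration, not details taken from this card:
```python
from torch.utils.data import DataLoader
from sentence_transformers import InputExample, SentenceTransformer, losses, util

model = SentenceTransformer("indobenchmark/indobert-base-p1")  # hypothetical base checkpoint

# Toy in-batch-negatives data: each InputExample holds a (query, relevant passage) pair.
train_examples = [InputExample(texts=["contoh kueri", "contoh paragraf yang relevan"])]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=32)

# Matches the loss configuration above: dot-product similarity with scale 1.0.
train_loss = losses.MultipleNegativesRankingLoss(model, scale=1.0, similarity_fct=util.dot_score)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=5,
    warmup_steps=8324,
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
    scheduler="WarmupLinear",
)
```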
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> |
MaksKhramtsov/gpt2-finetuned-wikitext2 | MaksKhramtsov | 2023-12-23T08:53:05Z | 4 | 0 | transformers | [
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-22T21:04:09Z | ---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: MaksKhramtsov/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# MaksKhramtsov/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4992
- Validation Loss: 6.3552
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3172 | 6.7732 | 0 |
| 6.4992 | 6.3552 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Deer8dog9/nm001 | Deer8dog9 | 2023-12-23T08:49:33Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:39:23Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: nm001
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# nm001
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0568
- Matthews Correlation: 0.5400
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.2181 | 1.0 | 535 | 0.5151 | 0.5129 |
| 0.1866 | 2.0 | 1070 | 0.6990 | 0.5327 |
| 0.1425 | 3.0 | 1605 | 0.9239 | 0.5117 |
| 0.103 | 4.0 | 2140 | 1.0568 | 0.5400 |
| 0.0666 | 5.0 | 2675 | 1.0856 | 0.5328 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
hitakura/distilbert-base-uncased-finetuned-emotion | hitakura | 2023-12-23T08:39:42Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T08:01:52Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.9275
- name: F1
type: f1
value: 0.9274091856141289
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2270
- Accuracy: 0.9275
- F1: 0.9274
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8872 | 1.0 | 250 | 0.3277 | 0.9085 | 0.9076 |
| 0.2674 | 2.0 | 500 | 0.2270 | 0.9275 | 0.9274 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2
- Datasets 2.15.0
- Tokenizers 0.15.0
|
GlebPS/bert-base-cased-finetuned-wikitext2 | GlebPS | 2023-12-23T08:38:50Z | 4 | 0 | transformers | [
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2023-12-23T08:16:48Z | ---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: GlebPS/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# GlebPS/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9635
- Validation Loss: 6.8748
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4374 | 7.0500 | 0 |
| 6.9635 | 6.8748 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
sdachen/path_to_saved_model | sdachen | 2023-12-23T08:31:57Z | 0 | 0 | diffusers | [
"diffusers",
"tensorboard",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"dreambooth",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:finetune:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | 2023-12-23T01:26:57Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: a photo of sks dog
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- dreambooth
inference: true
---
# DreamBooth - sdachen/path_to_saved_model
This is a dreambooth model derived from runwayml/stable-diffusion-v1-5. The weights were trained on a photo of sks dog using [DreamBooth](https://dreambooth.github.io/).
You can find some example images in the following.
DreamBooth for the text encoder was enabled: False.
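A minimal inference sketch with diffusers, using the instance prompt above (the scene and generation settings are illustrative):
```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "sdachen/path_to_saved_model", torch_dtype=torch.float16
).to("cuda")

# "sks dog" is the instance token this DreamBooth model was trained on.
image = pipe("a photo of sks dog in a bucket", num_inference_steps=50, guidance_scale=7.5).images[0]
image.save("sks-dog.png")
```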
|
Cloud1989/Taxi-v3-Cloud1989 | Cloud1989 | 2023-12-23T08:22:56Z | 0 | 0 | null | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | reinforcement-learning | 2023-12-23T08:22:53Z | ---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi-v3-Cloud1989
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.50 +/- 2.67
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gymnasium as gym  # older setups may use `import gym` instead

# `load_from_hub` is expected to be a user-defined helper that downloads the
# pickle file from the Hub and returns the saved Q-table dictionary.
model = load_from_hub(repo_id="Cloud1989/Taxi-v3-Cloud1989", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
Shriganesh/bert-finetuned-squad | Shriganesh | 2023-12-23T08:16:04Z | 3 | 0 | transformers | [
"transformers",
"tf",
"bert",
"question-answering",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | question-answering | 2023-12-23T07:25:27Z | ---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: Shriganesh/bert-finetuned-squad
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Shriganesh/bert-finetuned-squad
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.6861
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 1875, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
### Training results
| Train Loss | Epoch |
|:----------:|:-----:|
| 1.7625 | 0 |
| 0.9591 | 1 |
| 0.6861 | 2 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
polina164/bert-base-cased-finetuned-wikitext2 | polina164 | 2023-12-23T08:05:50Z | 1 | 0 | transformers | [
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2023-12-23T07:45:18Z | ---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: polina164/bert-base-cased-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# polina164/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.9651
- Validation Loss: 6.9192
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.4286 | 7.0417 | 0 |
| 6.9651 | 6.9192 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
erfanzar/LinguaMatic-1B-GGUF | erfanzar | 2023-12-23T08:05:47Z | 23 | 0 | null | [
"gguf",
"code",
"text-generation",
"en",
"fr",
"es",
"dataset:erfanzar/UltraChat-Mixin",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-21T20:09:33Z | ---
datasets:
- erfanzar/UltraChat-Mixin
language:
- en
- fr
- es
metrics:
- accuracy
pipeline_tag: text-generation
tags:
- code
---
# LinguaMatic
LinguaMatic is an advanced AI model designed to handle a wide range of Natural Language Processing (NLP) tasks. With its powerful capabilities, LinguaMatic can assist with tasks such as text classification, sentiment analysis, language translation, question answering, and much more.
## EasyDel
The model is fine-tuned using a custom version of UltraChat on a TPU-v4 pod with [EasyDel](https://github.com/erfanzar/EasyDeL)
## Prompting Method
LinguaMatic utilizes the OC prompting method to generate responses. This method enhances the model's ability to engage in meaningful conversations. The `prompt_model` function provided below demonstrates how this prompting method is implemented:
```python
from typing import List, Optional


def prompt_model(
message: str,
chat_history: Optional[List[str] | List[List[str]]] = None,
system_prompt: Optional[str] = None
):
if chat_history is None:
chat_history = []
system = f"<|system|>\n{system_prompt}</s>" if system_prompt is not None else ""
ua = ""
for user_input, response in chat_history:
ua += f"<|user|>\n{user_input}</s>\n" + f"<|assistant|>\n{response}</s>\n"
return system + ua + f"<|user|>\n{message}</s>\n<|assistant|>\n"
```
The `prompt_model` function takes a `message` as input, along with the `chat_history` and `system_prompt`. It generates a formatted text that includes the system prompt, user inputs, and the current message. This approach allows LinguaMatic to maintain context and provide more coherent and context-aware responses.
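For example, a short usage sketch (the conversation content below is made up purely for illustration and assumes `prompt_model` from the snippet above is in scope):
```python
# Illustrative only: build a prompt with one prior exchange in the history.
history = [("Hi, who are you?", "I am LinguaMatic, an AI assistant.")]
prompt = prompt_model(
    message="Can you translate 'good morning' into French?",
    chat_history=history,
    system_prompt="You are a helpful multilingual assistant.",
)
print(prompt)
# <|system|>
# You are a helpful multilingual assistant.</s><|user|>
# Hi, who are you?</s>
# <|assistant|>
# I am LinguaMatic, an AI assistant.</s>
# <|user|>
# Can you translate 'good morning' into French?</s>
# <|assistant|>
```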
## Contributing
We welcome contributions to enhance LinguaMatic's capabilities and improve its performance. If you encounter any issues or have suggestions for improvement, please feel free to submit a pull request or open an issue on [EasyDel](https://github.com/erfanzar/EasyDeL) GitHub repository.
|
bhuvana1/Khuze_512_resolution | bhuvana1 | 2023-12-23T08:01:50Z | 1 | 1 | diffusers | [
"diffusers",
"text-to-image",
"autotrain",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:finetune:stabilityai/stable-diffusion-xl-base-1.0",
"region:us"
] | text-to-image | 2023-12-22T07:56:51Z |
---
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: photo of a Khuzesiam person
tags:
- text-to-image
- diffusers
- autotrain
inference: true
---
# DreamBooth trained by AutoTrain
Text encoder was not trained.
|
ntc-ai/SDXL-LoRA-slider.mad-scientist | ntc-ai | 2023-12-23T07:42:57Z | 25 | 0 | diffusers | [
"diffusers",
"text-to-image",
"stable-diffusion-xl",
"lora",
"template:sd-lora",
"template:sdxl-lora",
"sdxl-sliders",
"ntcai.xyz-sliders",
"concept",
"en",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:mit",
"region:us"
] | text-to-image | 2023-12-23T07:42:53Z |
---
language:
- en
thumbnail: "images/evaluate/mad scientist.../mad scientist_17_3.0.png"
widget:
- text: mad scientist
output:
url: images/mad scientist_17_3.0.png
- text: mad scientist
output:
url: images/mad scientist_19_3.0.png
- text: mad scientist
output:
url: images/mad scientist_20_3.0.png
- text: mad scientist
output:
url: images/mad scientist_21_3.0.png
- text: mad scientist
output:
url: images/mad scientist_22_3.0.png
tags:
- text-to-image
- stable-diffusion-xl
- lora
- template:sd-lora
- template:sdxl-lora
- sdxl-sliders
- ntcai.xyz-sliders
- concept
- diffusers
license: "mit"
inference: false
instance_prompt: "mad scientist"
base_model: "stabilityai/stable-diffusion-xl-base-1.0"
---
# ntcai.xyz slider - mad scientist (SDXL LoRA)
| Strength: -3 | Strength: 0 | Strength: 3 |
| --- | --- | --- |
| <img src="images/mad scientist_17_-3.0.png" width=256 height=256 /> | <img src="images/mad scientist_17_0.0.png" width=256 height=256 /> | <img src="images/mad scientist_17_3.0.png" width=256 height=256 /> |
| <img src="images/mad scientist_19_-3.0.png" width=256 height=256 /> | <img src="images/mad scientist_19_0.0.png" width=256 height=256 /> | <img src="images/mad scientist_19_3.0.png" width=256 height=256 /> |
| <img src="images/mad scientist_20_-3.0.png" width=256 height=256 /> | <img src="images/mad scientist_20_0.0.png" width=256 height=256 /> | <img src="images/mad scientist_20_3.0.png" width=256 height=256 /> |
## Download
Weights for this model are available in Safetensors format.
## Trigger words
You can apply this LoRA with trigger words for additional effect:
```
mad scientist
```
## Use in diffusers
```python
from diffusers import StableDiffusionXLPipeline
from diffusers import EulerAncestralDiscreteScheduler
import torch
pipe = StableDiffusionXLPipeline.from_single_file("https://huggingface.co/martyn/sdxl-turbo-mario-merge-top-rated/blob/main/topRatedTurboxlLCM_v10.safetensors")
pipe.to("cuda")
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)
# Load the LoRA
pipe.load_lora_weights('ntc-ai/SDXL-LoRA-slider.mad-scientist', weight_name='mad scientist.safetensors', adapter_name="mad scientist")
# Activate the LoRA
pipe.set_adapters(["mad scientist"], adapter_weights=[2.0])
prompt = "medieval rich kingpin sitting in a tavern, mad scientist"
negative_prompt = "nsfw"
width = 512
height = 512
num_inference_steps = 10
guidance_scale = 2
image = pipe(prompt, negative_prompt=negative_prompt, width=width, height=height, guidance_scale=guidance_scale, num_inference_steps=num_inference_steps).images[0]
image.save('result.png')
```
## Support the Patreon
If you like this model please consider [joining our Patreon](https://www.patreon.com/NTCAI).
By joining our Patreon, you'll gain access to an ever-growing library of over 560 unique and diverse LoRAs, covering a wide range of styles and genres. You'll also receive early access to new models and updates, exclusive behind-the-scenes content, and the powerful LoRA slider creator, allowing you to craft your own custom LoRAs and experiment with endless possibilities.
Your support on Patreon will allow us to continue developing and refining new models.
## Other resources
- [CivitAI](https://civitai.com/user/ntc) - Follow ntc on Civit for even more LoRAs
- [ntcai.xyz](https://ntcai.xyz) - See ntcai.xyz to find more articles and LoRAs
|
polina164/gpt2-finetuned-wikitext2 | polina164 | 2023-12-23T07:38:38Z | 5 | 0 | transformers | [
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T07:17:56Z | ---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: polina164/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# polina164/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4867
- Validation Loss: 6.3421
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3108 | 6.7566 | 0 |
| 6.4867 | 6.3421 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
hfl/chinese-alpaca-2-lora-7b-16k | hfl | 2023-12-23T07:29:21Z | 6 | 1 | transformers | [
"transformers",
"llama",
"text-generation",
"zh",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-08-31T09:00:45Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-Alpaca-2-LoRA-7B-16K
**This is the LoRA model for Chinese-Alpaca-2-7B-16K (context size 16K), which should be merged with the original Llama-2-7b-hf model before inference or training.**
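A rough sketch of one way to merge this LoRA with the base model using PEFT is shown below. Note that this is only an illustration: the LoRA also comes with an extended Chinese tokenizer, and the project repository linked at the end of this card documents the recommended merging procedure.
```python
# Sketch only; prefer the project's own merging workflow for official results.
import torch
from peft import PeftModel
from transformers import AutoTokenizer, LlamaForCausalLM

base = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf", torch_dtype=torch.float16)

# Assumes the extended Chinese tokenizer is shipped alongside the LoRA weights.
tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-alpaca-2-lora-7b-16k")
base.resize_token_embeddings(len(tokenizer))

model = PeftModel.from_pretrained(base, "hfl/chinese-alpaca-2-lora-7b-16k")
model = model.merge_and_unload()  # bake the LoRA into a standalone full-weight model

model.save_pretrained("chinese-alpaca-2-7b-16k-merged")
tokenizer.save_pretrained("chinese-alpaca-2-7b-16k-merged")
```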
**Related models**
* Long context base models (16K)
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Long context Instruction/Chat models
* [Chinese-Alpaca-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b-16k)
* [Chinese-Alpaca-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b-16k)
* [Chinese-Alpaca-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k)
* [Chinese-Alpaca-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
hfl/chinese-alpaca-2-lora-7b | hfl | 2023-12-23T07:29:06Z | 0 | 16 | null | [
"zh",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-07-31T03:55:19Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-Alpaca-2-LoRA-7B
**This is the LoRA model for Chinese-Alpaca-2-7B, which should be merged with the original Llama-2-7b-hf model before inference or training.**
**Related models**
* Long context base models
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
hfl/chinese-alpaca-2-7b | hfl | 2023-12-23T07:28:12Z | 170 | 161 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"zh",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-07-31T03:53:55Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-Alpaca-2-7B
**This is the full Chinese-Alpaca-2-7B model, which can be loaded directly for inference and full-parameter training.**
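A minimal loading sketch with transformers is shown below; the prompt format here is only illustrative, so see the project repository linked at the end of this card for the recommended chat template and deployment options.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-alpaca-2-7b")
model = AutoModelForCausalLM.from_pretrained(
    "hfl/chinese-alpaca-2-7b", torch_dtype=torch.float16, device_map="auto"
)

# Illustrative prompt only; consult the project docs for the exact template.
inputs = tokenizer("[INST] 你好，请简单介绍一下你自己。 [/INST]", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```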
**Related models**
* Long context base models
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
hfl/chinese-alpaca-2-13b-16k | hfl | 2023-12-23T07:27:41Z | 1,487 | 29 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"zh",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-08-31T13:47:47Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-Alpaca-2-13B-16K
**This is the full Chinese-Alpaca-2-13B-16K model (context size 16K), which can be loaded directly for inference and full-parameter training.**
**Related models**
* Long context base models (16K)
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Long context Instruction/Chat models
* [Chinese-Alpaca-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b-16k)
* [Chinese-Alpaca-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b-16k)
* [Chinese-Alpaca-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k)
* [Chinese-Alpaca-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
hfl/chinese-alpaca-2-1.3b | hfl | 2023-12-23T07:27:31Z | 250 | 8 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"zh",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-10-08T08:46:07Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-Alpaca-2-1.3B
**This is the full Chinese-Alpaca-2-1.3B model, which can be loaded directly for inference and full-parameter training.**
**Related models**
* Long context base models (16K)
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Long context Instruction/Chat models
* [Chinese-Alpaca-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b-16k)
* [Chinese-Alpaca-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b-16k)
* [Chinese-Alpaca-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k)
* [Chinese-Alpaca-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
hfl/chinese-llama-2-lora-7b-16k | hfl | 2023-12-23T07:25:58Z | 8 | 1 | transformers | [
"transformers",
"llama",
"text-generation",
"zh",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-08-25T00:40:00Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-LLaMA-2-LoRA-7B-16K
**This is the LoRA model for Chinese-LLaMA-2-7B-16K (context size 16K), which should be merged with the original Llama-2-7b-hf model before inference or training.**
**Related models**
* Long context base models (16K)
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
hfl/chinese-llama-2-1.3b | hfl | 2023-12-23T07:25:50Z | 1,655 | 18 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"zh",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-10-08T08:28:54Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-LLaMA-2-1.3B
**This is the full Chinese-LLaMA-2-1.3B model, which can be loaded directly for inference and full-parameter training.**
**Related models**
* Long context base models (16K)
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Long context Instruction/Chat models
* [Chinese-Alpaca-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b-16k)
* [Chinese-Alpaca-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b-16k)
* [Chinese-Alpaca-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k)
* [Chinese-Alpaca-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
hfl/chinese-llama-2-13b | hfl | 2023-12-23T07:21:09Z | 1,464 | 34 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"zh",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-08-11T08:52:21Z | ---
license: apache-2.0
language:
- zh
- en
---
# Chinese-LLaMA-2-13B
**This is the full Chinese-LLaMA-2-13B model, which can be loaded directly for inference and full-parameter training.**
**Related models**
* Long context base models
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/hfl/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/hfl/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/hfl/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/hfl/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/hfl/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on the Llama-2, released by Meta, and it is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (foundation model) and Alpaca-2 (instruction-following model). These models have been expanded and optimized with Chinese vocabulary beyond the original Llama-2. We used large-scale Chinese data for incremental pre-training, which further improved the fundamental semantic understanding of the Chinese language, resulting in a significant performance improvement compared to the first-generation models. The relevant models support a 4K context and can be expanded up to 18K+ using the NTK method.
The main contents of this project include:
* New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' data
* Quickly deploy and experience the quantized LLMs on the CPU/GPU of a personal PC
* Support for LLaMA ecosystems like 🤗 Transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. |
LoneStriker/SAM-8.0bpw-h8-exl2 | LoneStriker | 2023-12-23T07:07:26Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T07:04:20Z | ---
license: apache-2.0
language:
- en
---
# Model Card
SAM (Small Agentic Model) is a 7B model that demonstrates impressive reasoning abilities despite its smaller size. SAM-7B has outperformed existing SoTA models on various reasoning benchmarks, including GSM8k and ARC-C.
For full details of this model please read our [release blog post](https://superagi.com/introducing-sam-small-agentic-model/).
# Key Contributions
- SAM-7B outperforms GPT 3.5, Orca, and several other 70B models on multiple reasoning benchmarks, including ARC-C and GSM8k.
- Interestingly, despite being trained on a 97% smaller dataset, SAM-7B surpasses Orca-13B on GSM8k.
- All responses in our fine-tuning dataset are generated by open-source models without any assistance from state-of-the-art models like GPT-3.5 or GPT-4.
## Training
- Trained by: SuperAGI Team
- Hardware: NVIDIA 6 x H100 SxM (80GB)
- Model used: Mistral 7B
- Duration of finetuning: 4 hours
- Number of epochs: 1
- Batch size: 16
- Learning Rate: 2e-5
- Warmup Ratio: 0.1
- Optimizer: AdamW
- Scheduler: Cosine
## Example Prompt
The template used to build a prompt for the Instruct model is defined as follows:
```
<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
```
Note that `<s>` and `</s>` are special tokens for beginning of string (BOS) and end of string (EOS) while [INST] and [/INST] are regular strings.
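As a purely string-level illustration of the layout above (in practice the tokenizer handles the `<s>`/`</s>` special tokens):
```python
def build_prompt(turns, new_instruction):
    """Assemble the [INST] ... [/INST] layout shown above.

    `turns` is a list of (instruction, model_answer) pairs from earlier turns;
    BOS/EOS special tokens are left to the tokenizer.
    """
    prompt = ""
    for instruction, answer in turns:
        prompt += f"[INST] {instruction} [/INST] {answer}</s> "
    return prompt + f"[INST] {new_instruction} [/INST]"

print(build_prompt([("Can elephants fly?", "No, elephants cannot fly.")], "Why not?"))
# [INST] Can elephants fly? [/INST] No, elephants cannot fly.</s> [INST] Why not? [/INST]
```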
## Evaluation
These benchmarks show that our model has improved reasoning compared to Orca 2-7B, Orca 2-13B, and GPT-3.5.
Despite its smaller size, SAM shows better multi-hop reasoning, as illustrated below:
<img src = "https://superagi.com/wp-content/uploads/2023/12/image-932.png" alt="Reasoning Benchmark Performance" width="700">
Note: Temperature=0.3 is suggested for optimal performance
## Run the model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "SuperAGI/SAM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
text = "Can elephants fly?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Limitations
SAM is a demonstration that better reasoning can be induced with a smaller amount of high-quality data generated using open-source LLMs.
The model is not suitable for conversations and simple Q&A; it performs better at task breakdown and reasoning.
It does not have any moderation mechanisms. Therefore, the model is not suitable for production usage as it doesn't have guardrails for toxicity, societal bias, and language limitations. We would love to collaborate with the community to build safer and better models.
## The SuperAGI AI Team
Anmol Gautam, Arkajit Datta, Rajat Chawla, Ayush Vatsal, Sukrit Chatterjee, Adarsh Jha, Abhijeet Sinha, Rakesh Krishna, Adarsh Deep, Ishaan Bhola, Mukunda NS, Nishant Gaurav. |
tcyuan1017/HW02 | tcyuan1017 | 2023-12-23T06:50:13Z | 1 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"base_model:adapter:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | 2023-12-23T06:13:52Z | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.2.dev0 |
LoneStriker/SAM-4.0bpw-h6-exl2 | LoneStriker | 2023-12-23T06:49:51Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T06:48:11Z | ---
license: apache-2.0
language:
- en
---
# Model Card
SAM (Small Agentic Model) is a 7B model that demonstrates impressive reasoning abilities despite its smaller size. SAM-7B has outperformed existing SoTA models on various reasoning benchmarks, including GSM8k and ARC-C.
For full details of this model please read our [release blog post](https://superagi.com/introducing-sam-small-agentic-model/).
# Key Contributions
- SAM-7B outperforms GPT 3.5, Orca, and several other 70B models on multiple reasoning benchmarks, including ARC-C and GSM8k.
- Interestingly, despite being trained on a 97% smaller dataset, SAM-7B surpasses Orca-13B on GSM8k.
- All responses in our fine-tuning dataset are generated by open-source models without any assistance from state-of-the-art models like GPT-3.5 or GPT-4.
## Training
- Trained by: SuperAGI Team
- Hardware: 6 x NVIDIA H100 SXM (80GB)
- Model used: Mistral 7B
- Duration of finetuning: 4 hours
- Number of epochs: 1
- Batch size: 16
- Learning Rate: 2e-5
- Warmup Ratio: 0.1
- Optimizer: AdamW
- Scheduler: Cosine
## Example Prompt
The template used to build a prompt for the Instruct model is defined as follows:
```
<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
```
Note that `<s>` and `</s>` are special tokens for beginning of string (BOS) and end of string (EOS) while [INST] and [/INST] are regular strings.
## Evaluation
These benchmarks show that our model has improved reasoning compared to Orca 2-7B, Orca 2-13B, and GPT-3.5.
Despite its smaller size, SAM shows better multi-hop reasoning, as shown below:
<img src = "https://superagi.com/wp-content/uploads/2023/12/image-932.png" alt="Reasoning Benchmark Performance" width="700">
Note: Temperature=0.3 is suggested for optimal performance.
## Run the model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "SuperAGI/SAM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
text = "Can elephants fly?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
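To follow the temperature suggestion above, sampling can be enabled explicitly; a minimal sketch reusing the `model` and `tokenizer` loaded above (the prompt text is only an example):
```python
# Generation with the suggested temperature; do_sample=True is required
# for the temperature setting to take effect.
prompt = "[INST] List the steps to debug a failing unit test. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.3)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```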
## Limitations
SAM is a demonstration that better reasoning can be induced using a smaller amount of high-quality data generated with open-source LLMs.
The model is not suitable for conversations and simple Q&A; it performs better at task breakdown and reasoning.
It does not have any moderation mechanisms. Therefore, the model is not suitable for production usage as it doesn't have guardrails for toxicity, societal bias, and language limitations. We would love to collaborate with the community to build safer and better models.
## The SuperAGI AI Team
Anmol Gautam, Arkajit Datta, Rajat Chawla, Ayush Vatsal, Sukrit Chatterjee, Adarsh Jha, Abhijeet Sinha, Rakesh Krishna, Adarsh Deep, Ishaan Bhola, Mukunda NS, Nishant Gaurav. |
lorenzreyes/ppo-Pyramids | lorenzreyes | 2023-12-23T06:49:48Z | 1 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"Pyramids",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Pyramids",
"region:us"
] | reinforcement-learning | 2023-12-23T06:49:45Z | ---
library_name: ml-agents
tags:
- Pyramids
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Pyramids
---
# **ppo** Agent playing **Pyramids**
This is a trained model of a **ppo** agent playing **Pyramids**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Step 1: Find your model_id: lorenzreyes/ppo-Pyramids
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Tomkao0214/learningllm | Tomkao0214 | 2023-12-23T06:20:20Z | 9 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:40:50Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: learningllm
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# learningllm
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0243
- Matthews Correlation: 0.4896
## Model description
More information needed
## Intended uses & limitations
More information needed
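A minimal inference sketch with the `transformers` pipeline (the input sentence is an arbitrary example, and the label names may appear as generic `LABEL_0`/`LABEL_1` unless they were renamed during fine-tuning):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint for single-sentence classification.
classifier = pipeline("text-classification", model="Tomkao0214/learningllm")

# Arbitrary example input; the score is the model's confidence for the predicted label.
print(classifier("The book was read by the student."))
```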
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.4503901752406154e-05
- train_batch_size: 4
- eval_batch_size: 16
- seed: 11
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5604 | 1.0 | 2138 | 0.5529 | 0.4471 |
| 0.4503 | 2.0 | 4276 | 0.8003 | 0.5166 |
| 0.2792 | 3.0 | 6414 | 1.0243 | 0.4896 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
dthseemsbttr/gpt2-finetuned-wikitext2 | dthseemsbttr | 2023-12-23T06:15:29Z | 5 | 0 | transformers | [
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"base_model:openai-community/gpt2",
"base_model:finetune:openai-community/gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-12-23T05:52:59Z | ---
license: mit
base_model: gpt2
tags:
- generated_from_keras_callback
model-index:
- name: dthseemsbttr/gpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# dthseemsbttr/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 6.4944
- Validation Loss: 6.3520
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
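A minimal text-generation sketch in TensorFlow (the prompt is an arbitrary example; greedy decoding is used for brevity):
```python
from transformers import AutoTokenizer, TFAutoModelForCausalLM

model_id = "dthseemsbttr/gpt2-finetuned-wikitext2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForCausalLM.from_pretrained(model_id)

# Arbitrary example prompt; continue it for up to 40 new tokens.
inputs = tokenizer("The history of the region", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```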
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3084 | 6.7593 | 0 |
| 6.4944 | 6.3520 | 1 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.0
- Tokenizers 0.15.0
|
Deneseramose/DrOZKellyClarksonWeightLoss | Deneseramose | 2023-12-23T06:01:36Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | null | 2023-12-23T06:01:27Z | ---
license: mit
---
- Official Website: Dr OZ Kelly Clarkson Weight Loss
- Product Name: Dr OZ Kelly Clarkson Weight Loss
- Benefits: Dr OZ Kelly Clarkson Weight Loss helps you to get rid of chronic pain & aches.
- Category: Weight Loss Supplement
- Rating: ⭐⭐⭐⭐⭐ (4.5/5.0)
- Side Effects: No Major Side Effects
- Availability: In Stock; Voted #1 Product in the USA
Beyond the scale, the focus should be on promoting overall health and embracing diversity in body shapes and sizes. Dr. Oz and Kelly Clarkson's collaboration could serve as a harmonic catalyst for broader conversations about celebrating diverse wellness journeys.
https://www.deccanherald.com/brandspot/featured/gino-chouinard-cbd-gummies-canada-reviews-serena-leafz-cbd-gummies-canada-side-effects-benefits-and-consumer-reports-2778034
https://www.mid-day.com/lifestyle/infotainment/article/serena-leafz-cbd-gummies-canada-honest-opinions-doctor-exposes-important--23317660
BLOGS:-
https://groups.google.com/a/chromium.org/g/telemetry/c/yHjCzjbFjA0
https://groups.google.com/a/chromium.org/g/telemetry/c/6t7y8Ofsp3U
https://groups.google.com/a/chromium.org/g/chromium-reviews/c/A-cIm3ArLRk
https://groups.google.com/g/dr-oz-kelly-clarkson-weight-loss/c/CY2CnLxCvrk
https://groups.google.com/g/dr-oz-kelly-clarkson-weight-loss/c/OVzZkg9zbf8
https://sites.google.com/view/drozkellyclarksonweightloss
https://sites.google.com/view/droz-kelly-clarkson-weightloss/home
https://dr-oz-kelly-clarkson-weight-loss.webflow.io/
https://dr-oz-kelly-clarkson-weight-loss-buy.webflow.io/
https://medium.com/@margaretdiazee/dr-oz-kelly-clarkson-weight-loss-top-rated-reviews-genuine-expense-cca86a37574e
https://medium.com/@margaretdiazee/dr-oz-kelly-clarkson-weight-loss-fraudulent-exposed-reviews-ingredients-where-to-buy-eb9f1880254b
https://dr-oz-kelly-clarkson-weight-loss.company.site/
https://gamma.app/public/Dr-OZ-Kelly-Clarkson-Weight-Loss-IS-FAKE-or-REAL-Read-About-100-N-t66pskwttrjnvbh
https://gamma.app/public/Dr-OZ-Kelly-Clarkson-Weight-Loss-REVIEWS-LEGIT-OR-FAKE-WHAT-DO-CU-b9e0r7pqbji0nec
https://groups.google.com/a/chromium.org/g/chromium-reviews/c/hHHZQHO-yek
https://groups.google.com/a/chromium.org/g/telemetry/c/qisAaK78TuM
FACEBOOK:-
https://www.facebook.com/DrOZKellyClarksonWeightLoss/
https://www.facebook.com/DrOzCBDGummies/
https://www.facebook.com/Dr.OZ.CBD.Gummies.Order/
https://www.facebook.com/DrOzCBDGummieReviews/
https://www.facebook.com/KellyClarksonKetoFusionGummies/
https://www.facebook.com/KellyClarksonKetoChewsGummies/
https://www.facebook.com/Kelly.Clarkson.Keto.Blast.Gummies.Official/
https://www.facebook.com/KellyClarksonWeightLossDrOZ/
https://www.facebook.com/kellyclarksonweightlossgummies/
https://www.facebook.com/KellyClarksonSlimFusionGummies/
https://www.facebook.com/KellyClarksonKetoFusionGummies/
https://www.facebook.com/KellyClarksonWeightLossDrOZ/
https://www.facebook.com/FirstChoiceKetoGummiesKellyClarkson/
https://www.facebook.com/KellyClarksonKetoChewsGummies/
https://www.facebook.com/SerenaLeafzCBDGummiesFrancaisCA/
https://www.facebook.com/Order.SerenaLeafzCBDGummies/
https://www.facebook.com/TrySerenaLeafzCBDGummiesEnFrancais/
https://www.facebook.com/GetSerenaLeafzCBDGummiesCA/
https://www.facebook.com/TrySerenaLeafzCBDGummiesCA/ |
Sayan1997/test | Sayan1997 | 2023-12-23T05:57:40Z | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:TinyPixel/Llama-2-7B-bf16-sharded",
"base_model:adapter:TinyPixel/Llama-2-7B-bf16-sharded",
"region:us"
] | null | 2023-12-23T05:57:32Z | ---
library_name: peft
base_model: TinyPixel/Llama-2-7B-bf16-sharded
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.2.dev0 |
halu1003/LLMClassWork1 | halu1003 | 2023-12-23T05:30:06Z | 10 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:35:45Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: LLMClassWork1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# LLMClassWork1
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5382
- Matthews Correlation: 0.4677
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.829056514710492e-05
- train_batch_size: 64
- eval_batch_size: 16
- seed: 21
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 134 | 0.5430 | 0.3489 |
| No log | 2.0 | 268 | 0.4852 | 0.4818 |
| No log | 3.0 | 402 | 0.5382 | 0.4677 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
HimashaJ96/Zephyer-7B-Finetune | HimashaJ96 | 2023-12-23T05:27:44Z | 3 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:HuggingFaceH4/zephyr-7b-beta",
"base_model:adapter:HuggingFaceH4/zephyr-7b-beta",
"region:us"
] | null | 2023-12-23T05:27:21Z | ---
library_name: peft
base_model: HuggingFaceH4/zephyr-7b-beta
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.2.dev0 |
codewithaman/vit-base-patch16-224-in21k-finetuned-brain-ich | codewithaman | 2023-12-23T05:10:50Z | 4 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_keras_callback",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2023-12-23T05:01:45Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_keras_callback
model-index:
- name: dwiedarioo/vit-base-patch16-224-in21k-brainmri
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# winwithaman/vit-base-patch16-224-in21k-finetuned-brain-ich
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on a brain hemorrhage dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.2848
- Train Accuracy: 0.9969
- Train Top-3-accuracy: 0.9992
- Validation Loss: 0.3786
- Validation Accuracy: 0.9590
- Validation Top-3-accuracy: 0.9892
- Epoch: 7
## Model description
More information needed
## Intended uses & limitations
More information needed
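A minimal image-classification sketch with the `transformers` pipeline (the image path is a placeholder for a local scan, and the label names depend on how the classes were configured during fine-tuning):
```python
from transformers import pipeline

# Classify a single image with the fine-tuned ViT checkpoint.
classifier = pipeline(
    "image-classification",
    model="codewithaman/vit-base-patch16-224-in21k-finetuned-brain-ich",
)

# "ct_slice.png" is a placeholder path to a local image file.
for prediction in classifier("ct_slice.png"):
    print(prediction["label"], round(prediction["score"], 4))
```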
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'inner_optimizer': {'module': 'transformers.optimization_tf', 'class_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 1230, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.8999999761581421, 'beta_2': 0.9990000128746033, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}, 'registered_name': 'AdamWeightDecay'}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000}
- training_precision: mixed_float16
### Training results
| Train Loss | Train Accuracy | Train Top-3-accuracy | Validation Loss | Validation Accuracy | Validation Top-3-accuracy | Epoch |
|:----------:|:--------------:|:--------------------:|:---------------:|:-------------------:|:-------------------------:|:-----:|
| 2.2199 | 0.4215 | 0.6564 | 1.8634 | 0.5702 | 0.8099 | 0 |
| 1.5448 | 0.6976 | 0.8797 | 1.3110 | 0.7603 | 0.9028 | 1 |
| 1.0494 | 0.8694 | 0.9519 | 0.9507 | 0.8855 | 0.9590 | 2 |
| 0.7408 | 0.9381 | 0.9824 | 0.7499 | 0.9114 | 0.9806 | 3 |
| 0.5428 | 0.9756 | 0.9939 | 0.5831 | 0.9460 | 0.9849 | 4 |
| 0.4169 | 0.9901 | 0.9977 | 0.4895 | 0.9525 | 0.9914 | 5 |
| 0.3371 | 0.9947 | 0.9977 | 0.4194 | 0.9611 | 0.9892 | 6 |
| 0.2848 | 0.9969 | 0.9992 | 0.3786 | 0.9590 | 0.9892 | 7 |
### Framework versions
- Transformers 4.35.0
- TensorFlow 2.14.0
- Datasets 2.14.6
- Tokenizers 0.14.1
|
coeuslearning/llama-2-7b-coesulearning | coeuslearning | 2023-12-23T05:03:40Z | 0 | 0 | peft | [
"peft",
"region:us"
] | null | 2023-12-02T09:15:28Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
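The same settings can be expressed as a `transformers.BitsAndBytesConfig` when reloading the base model; a minimal sketch (the `llm_int8_*` fields above are left at their defaults and therefore omitted):
```python
import torch
from transformers import BitsAndBytesConfig

# 4-bit NF4 quantization matching the values listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
# Pass this as `quantization_config=bnb_config` to `from_pretrained` on the base model.
```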
### Framework versions
- PEFT 0.4.0
|
LiamLi1991/HW01 | LiamLi1991 | 2023-12-23T04:59:00Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:35:44Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: HW01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# HW01
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7590
- Matthews Correlation: 0.5475
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5168 | 1.0 | 535 | 0.4544 | 0.4535 |
| 0.3414 | 2.0 | 1070 | 0.4683 | 0.5277 |
| 0.2331 | 3.0 | 1605 | 0.6640 | 0.5162 |
| 0.1657 | 4.0 | 2140 | 0.7590 | 0.5475 |
| 0.1236 | 5.0 | 2675 | 0.8733 | 0.5256 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
dcaustin33/llama_friends | dcaustin33 | 2023-12-23T04:51:14Z | 2 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-hf",
"base_model:adapter:meta-llama/Llama-2-7b-hf",
"region:us"
] | null | 2023-12-23T02:37:28Z | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.1 |
DaRkSpyro/JewelTheMacaw | DaRkSpyro | 2023-12-23T04:41:24Z | 0 | 0 | flair | [
"flair",
"music",
"en",
"dataset:HuggingFaceH4/no_robots",
"license:apache-2.0",
"region:us"
] | null | 2023-12-23T04:25:09Z | ---
license: apache-2.0
language:
- en
metrics:
- accuracy
tags:
- music
datasets:
- HuggingFaceH4/no_robots
library_name: flair
--- |
gfodor/Segmind-VegaRT-Fused | gfodor | 2023-12-23T04:36:35Z | 1 | 0 | diffusers | [
"diffusers",
"safetensors",
"lora",
"text-to-image",
"arxiv:2311.05556",
"base_model:segmind/Segmind-Vega",
"base_model:adapter:segmind/Segmind-Vega",
"license:apache-2.0",
"autotrain_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | 2023-12-22T00:05:34Z | ---
library_name: diffusers
base_model: segmind/Segmind-Vega
tags:
- lora
- text-to-image
license: apache-2.0
inference: false
---
# Segmind-VegaRT - Latent Consistency Model Segmind-Vega
# Fused model by gfodor
Try real-time inference here **[VegaRT demo ⚡](https://www.segmind.com/segmind-vega-rt)**
API for **[Segmind-VegaRT](https://www.segmind.com/models/segmind-vega-rt-v1/api)**
<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/62039c2d91d53938a643317d/WacXd5DqP5hx8iEGTPt16.mp4"></video>
Segmind-VegaRT is a distilled consistency adapter for [Segmind-Vega](https://huggingface.co/segmind/Segmind-Vega) that allows
the number of inference steps to be reduced to only **2 - 8 steps**.
Latent Consistency Model (LCM) LoRA was proposed in [LCM-LoRA: A universal Stable-Diffusion Acceleration Module](https://arxiv.org/abs/2311.05556)
by *Simian Luo, Yiqin Tan, Suraj Patil, Daniel Gu et al.*
# Image comparison (Segmind-VegaRT vs SDXL-Turbo)



# Speed comparison (Segmind-VegaRT vs SDXL-Turbo) on A100 80GB

| Model | Params / M |
|----------------------------------------------------------------------------|------------|
| [lcm-lora-sdv1-5](https://huggingface.co/latent-consistency/lcm-lora-sdv1-5) | 67.5 |
| [**Segmind-VegaRT**](https://huggingface.co/segmind/Segmind-VegaRT) | **119** |
| [lcm-lora-sdxl](https://huggingface.co/latent-consistency/lcm-lora-sdxl) | 197 |
## Usage
LCM-LoRA is supported in the 🤗 Hugging Face Diffusers library from version v0.23.0 onwards. To run the model, first
install the latest version of the Diffusers library as well as `peft`, `accelerate` and `transformers`.
```bash
pip install --upgrade pip
pip install --upgrade diffusers transformers accelerate peft
```
### Text-to-Image
Let's load the base model `segmind/Segmind-Vega` first. Next, the scheduler needs to be changed to [`LCMScheduler`](https://huggingface.co/docs/diffusers/v0.22.3/en/api/schedulers/lcm#diffusers.LCMScheduler) and we can reduce the number of inference steps to just 2 to 8 steps.
Please make sure to either disable `guidance_scale` or use values between 1.0 and 2.0.
```python
import torch
from diffusers import LCMScheduler, AutoPipelineForText2Image
model_id = "segmind/Segmind-Vega"
adapter_id = "segmind/Segmind-VegaRT"
pipe = AutoPipelineForText2Image.from_pretrained(model_id, torch_dtype=torch.float16, variant="fp16")
pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)
pipe.to("cuda")
# load and fuse lcm lora
pipe.load_lora_weights(adapter_id)
pipe.fuse_lora()
prompt = "Self-portrait oil painting, a beautiful cyborg with golden hair, 8k"
# disable guidance_scale by passing 0
image = pipe(prompt=prompt, num_inference_steps=4, guidance_scale=0).images[0]
```
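Alternatively, guidance can be left enabled with a low value in the 1.0-2.0 range mentioned above; a minimal sketch reusing `pipe` and `prompt` from the snippet above:
```python
# Low, non-zero guidance within the suggested 1.0-2.0 range.
image = pipe(prompt=prompt, num_inference_steps=8, guidance_scale=1.5).images[0]
image.save("vega_rt_sample.png")
```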
|
Maxx0/mistral_instruct_generation | Maxx0 | 2023-12-23T04:21:07Z | 1 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:generator",
"base_model:mistralai/Mistral-7B-Instruct-v0.1",
"base_model:adapter:mistralai/Mistral-7B-Instruct-v0.1",
"license:apache-2.0",
"region:us"
] | null | 2023-12-23T04:20:58Z | ---
license: apache-2.0
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
datasets:
- generator
base_model: mistralai/Mistral-7B-Instruct-v0.1
model-index:
- name: mistral_instruct_generation
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mistral_instruct_generation
This model is a fine-tuned version of [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0138
## Model description
More information needed
## Intended uses & limitations
More information needed
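A minimal sketch for loading this LoRA adapter on top of the base model with PEFT (assumes access to the base checkpoint and enough GPU memory; the prompt is only an example):
```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mistral-7B-Instruct-v0.1"
adapter_id = "Maxx0/mistral_instruct_generation"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16, device_map="auto")

# Attach the LoRA adapter from this repository to the base model.
model = PeftModel.from_pretrained(base, adapter_id)

inputs = tokenizer("[INST] Summarize your training objective. [/INST]", return_tensors="pt").to(base.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```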
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_steps: 0.03
- training_steps: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.3071 | 20.0 | 20 | 0.0966 |
| 0.0239 | 40.0 | 40 | 0.0214 |
| 0.0192 | 60.0 | 60 | 0.0189 |
| 0.0179 | 80.0 | 80 | 0.0173 |
| 0.0149 | 100.0 | 100 | 0.0138 |
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0 |
adandu/dreambooth_output | adandu | 2023-12-23T04:02:14Z | 0 | 0 | diffusers | [
"diffusers",
"tensorboard",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"dreambooth",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:finetune:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | 2023-12-23T02:03:01Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: a photo of AESARNAV person
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- dreambooth
inference: true
---
# DreamBooth - adandu/dreambooth_output
This is a dreambooth model derived from runwayml/stable-diffusion-v1-5. The weights were trained on a photo of AESARNAV person using [DreamBooth](https://dreambooth.github.io/).
You can find some example images in the following.
DreamBooth for the text encoder was enabled: True.
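A minimal text-to-image sketch with Diffusers, reusing the instance prompt above (assumes a CUDA GPU; the output filename is arbitrary):
```python
import torch
from diffusers import StableDiffusionPipeline

# Load the DreamBooth-trained weights from this repository.
pipe = StableDiffusionPipeline.from_pretrained(
    "adandu/dreambooth_output", torch_dtype=torch.float16
)
pipe.to("cuda")

image = pipe("a photo of AESARNAV person", num_inference_steps=50, guidance_scale=7.5).images[0]
image.save("aesarnav.png")
```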
|
Gummybear05/whisper-small-ko-E30_Y_freq_speed | Gummybear05 | 2023-12-23T03:59:40Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"generated_from_trainer",
"hi",
"dataset:aihub_elder",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-12-23T01:53:20Z | ---
language:
- hi
license: apache-2.0
base_model: openai/whisper-small
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- aihub_elder
model-index:
- name: whisper-small-ko-E30_Y_freq_speed
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-ko-E30_Y_freq_speed
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the aihub Y dialogue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1876
- Cer: 5.2573
## Model description
More information needed
## Intended uses & limitations
More information needed
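A minimal transcription sketch with the `transformers` pipeline (the audio path is a placeholder for a local Korean speech file; long recordings may need chunking):
```python
from transformers import pipeline

# Automatic speech recognition with the fine-tuned checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="Gummybear05/whisper-small-ko-E30_Y_freq_speed",
)

# "sample_ko.wav" is a placeholder path to a local audio file.
print(asr("sample_ko.wav")["text"])
```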
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 2
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.4514 | 0.13 | 100 | 0.2782 | 6.3910 |
| 0.2636 | 0.26 | 200 | 0.2298 | 6.1913 |
| 0.2355 | 0.39 | 300 | 0.2313 | 6.5789 |
| 0.2075 | 0.52 | 400 | 0.2121 | 6.1149 |
| 0.1899 | 0.64 | 500 | 0.2107 | 5.9622 |
| 0.1746 | 0.77 | 600 | 0.2040 | 5.8212 |
| 0.1791 | 0.9 | 700 | 0.1974 | 5.6685 |
| 0.0826 | 1.03 | 800 | 0.1924 | 5.4335 |
| 0.0725 | 1.16 | 900 | 0.1959 | 5.4570 |
| 0.072 | 1.29 | 1000 | 0.1942 | 5.2749 |
| 0.0658 | 1.42 | 1100 | 0.1935 | 5.4746 |
| 0.0639 | 1.55 | 1200 | 0.1894 | 5.2867 |
| 0.0658 | 1.68 | 1300 | 0.1891 | 5.3043 |
| 0.0606 | 1.81 | 1400 | 0.1876 | 5.1985 |
| 0.0648 | 1.93 | 1500 | 0.1876 | 5.2573 |
### Framework versions
- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
ljvmiranda921/xx_isl_sigtyp_trf | ljvmiranda921 | 2023-12-23T03:51:12Z | 1 | 0 | spacy | [
"spacy",
"token-classification",
"multilingual",
"model-index",
"region:us"
] | token-classification | 2023-11-30T06:14:10Z | ---
tags:
- spacy
- token-classification
language:
- multilingual
model-index:
- name: xx_isl_sigtyp_trf
results:
- task:
name: TAG
type: token-classification
metrics:
- name: TAG (XPOS) Accuracy
type: accuracy
value: 0.8484209631
- task:
name: POS
type: token-classification
metrics:
- name: POS (UPOS) Accuracy
type: accuracy
value: 0.9628502448
- task:
name: MORPH
type: token-classification
metrics:
- name: Morph (UFeats) Accuracy
type: accuracy
value: 0.9012080149
- task:
name: LEMMA
type: token-classification
metrics:
- name: Lemma Accuracy
type: accuracy
value: 0.9486362207
- task:
name: UNLABELED_DEPENDENCIES
type: token-classification
metrics:
- name: Unlabeled Attachment Score (UAS)
type: f_score
value: 0.8288867214
- task:
name: LABELED_DEPENDENCIES
type: token-classification
metrics:
- name: Labeled Attachment Score (LAS)
type: f_score
value: 0.7770595885
- task:
name: SENTS
type: token-classification
metrics:
- name: Sentences F-Score
type: f_score
value: 0.9772685943
---
| Feature | Description |
| --- | --- |
| **Name** | `xx_isl_sigtyp_trf` |
| **Version** | `0.1.0` |
| **spaCy** | `>=3.6.1,<3.7.0` |
| **Default Pipeline** | `transformer`, `parser`, `trainable_lemmatizer`, `tagger`, `morphologizer` |
| **Components** | `transformer`, `parser`, `trainable_lemmatizer`, `tagger`, `morphologizer` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | n/a |
| **License** | n/a |
| **Author** | [n/a]() |
### Label Scheme
<details>
<summary>View label scheme (7120 labels for 3 components)</summary>
| Component | Labels |
| --- | --- |
| **`parser`** | `ROOT`, `acl`, `acl:relcl`, `advcl`, `advmod`, `amod`, `appos`, `aux`, `case`, `cc`, `ccomp`, `compound:prt`, `conj`, `cop`, `dep`, `det`, `discourse`, `expl`, `fixed`, `flat:foreign`, `flat:name`, `iobj`, `mark`, `nmod`, `nmod:poss`, `nsubj`, `nummod`, `obj`, `obl`, `parataxis`, `punct`, `vocative`, `xcomp` |
| **`tagger`** | `"`, `"__Case=Acc\|Gender=Neut\|Number=Sing`, `"__Case=Gen\|Number=Sing\|Person=1\|PronType=Prs`, `"__Case=Gen\|Number=Sing\|Person=2\|PronType=Prs`, `"__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `"__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `"__NumType=Card`, `"__NumType=Frac`, `"__VerbForm=Sup\|Voice=Mid`, `,`, `.`, `:`, `;`, `ADJ`, `ADJ-A`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Acc\|Degree=Pos`, `ADJ-A__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-A__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `ADJ-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `ADJ-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ADJ-A__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `ADJ-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `ADJ-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ADJ-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ADJ-A__Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJ-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `ADJ-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `ADJ-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, 
`ADJ-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ADJ-A__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJ-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `ADJ-A__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Dem`, `ADJ-A__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-A__Case=Gen\|Gender=Fem\|Number=Plur\|NumType=Card`, `ADJ-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJ-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-A__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-A__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-A__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-A__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, 
`ADJ-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ADJ-A__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJ-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `ADJ-A__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `ADJ-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ADJ-A__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJ-A__Degree=Cmp`, `ADJ-A__Degree=Sup`, `ADJ-A__Foreign=Yes`, `ADJ-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-A__NumType=Card`, `ADJ-A__VerbForm=Inf\|Voice=Act`, `ADJ-A__VerbForm=Sup\|Voice=Act`, `ADJ-D`, `ADJ-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `ADJ-D__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `ADJ-D__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ADJ-D__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `ADJ-D__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, 
`ADJ-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Dat\|Degree=Pos`, `ADJ-D__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-D__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-D__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Dem`, `ADJ-D__Case=Dat\|Gender=Masc\|Number=Plur\|NumType=Card`, `ADJ-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Dem`, `ADJ-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ADJ-D__Case=Dat\|Gender=Neut\|Number=Plur\|NumType=Card`, `ADJ-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Dem`, `ADJ-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ADJ-D__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-D__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-D__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-D__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-D__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-D__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-D__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `ADJ-D__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-D__NumType=Card`, `ADJ-D__VerbForm=Inf\|Voice=Act`, `ADJ-G`, `ADJ-G__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, 
`ADJ-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ADJ-G__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-G__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJ-G__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-G__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Gen\|Degree=Pos`, `ADJ-G__Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-G__Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-G__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-G__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ADJ-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `ADJ-G__Case=Gen\|Gender=Masc\|Number=Sing\|NumType=Card`, `ADJ-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Dem`, 
`ADJ-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ADJ-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ADJ-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ADJ-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-G__Degree=Cmp`, `ADJ-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-G__NumType=Card`, `ADJ-G__VerbForm=Inf\|Voice=Act`, `ADJ-N`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `ADJ-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Prs`, `ADJ-N__Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `ADJ-N__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Card`, `ADJ-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ADJ-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ADJ-N__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, 
`ADJ-N__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `ADJ-N__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ADJ-N__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `ADJ-N__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `ADJ-N__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, 
`ADJ-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Definite=Ind\|Number=Sing`, `ADJ-N__Case=Nom\|Degree=Pos`, `ADJ-N__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ-N__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ-N__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJ-N__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `ADJ-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `ADJ-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ADJ-N__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ADJ-N__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJ-N__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card`, `ADJ-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `ADJ-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADJ-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ADJ-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ADJ-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJ-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `ADJ-N__Degree=Cmp`, `ADJ-N__Degree=Sup`, `ADJ-N__Foreign=Yes`, `ADJ-N__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `ADJ-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `ADJ-N__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJ-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJ-N__NumType=Card`, `ADJ-N__NumType=Frac`, `ADJ-N__VerbForm=Inf\|Voice=Act`, `ADJ-N__VerbForm=Inf\|Voice=Mid`, `ADJ-N__VerbForm=Part\|Voice=Act`, `ADJ-N__VerbForm=Sup\|Voice=Act`, `ADJ-N__VerbForm=Sup\|Voice=Mid`, `ADJP-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJP__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJR`, `ADJR-A`, `ADJR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADJR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, 
`ADJR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Ind\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJR-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJR-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJR-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADJR-A__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJR-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADJR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJR-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJR-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJR-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-A__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJR-A__Degree=Cmp`, `ADJR-A__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJR-A__VerbForm=Inf\|Voice=Act`, `ADJR-D`, `ADJR-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, 
`ADJR-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJR-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJR-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJR-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJR-D__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-D__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-D__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADJR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-D__Degree=Cmp`, `ADJR-G`, `ADJR-G__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-G__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-G__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJR-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADJR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJR-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJR-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJR-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJR-G__Degree=Cmp`, `ADJR-N`, `ADJR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADJR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJR-N__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJR-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, 
`ADJR-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-N__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJR-N__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJR-N__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJR-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADJR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJR-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJR-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJR-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJR-N__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJR-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADJR-N__Degree=Cmp`, `ADJR-N__Foreign=Yes`, `ADJR-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJR-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJR-N__VerbForm=Inf\|Voice=Act`, `ADJR__Degree=Cmp`, `ADJS`, `ADJS-A`, `ADJS-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur`, `ADJS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJS-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJS-A__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, 
`ADJS-A__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-A__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-A__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-A__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJS-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur`, `ADJS-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-A__Degree=Sup`, `ADJS-A__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADJS-A__VerbForm=Inf\|Voice=Act`, `ADJS-D`, `ADJS-D__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-D__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJS-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-D__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADJS-D__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-D__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-D__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-G__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-G__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-G__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur`, 
`ADJS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJS-G__Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJS-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJS-G__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-N`, `ADJS-N__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-N__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-N__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJS-N__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-N__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADJS-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADJS-N__Case=Nom\|Definite=Ind\|Number=Sing`, `ADJS-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `ADJS-N__Degree=Cmp`, `ADJS-N__Degree=Sup`, `ADJS-N__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `ADJS-N__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `ADJS-N__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJS-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJS-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `ADJS-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADJS-N__VerbForm=Inf\|Voice=Act`, 
`ADJS-N__VerbForm=Inf\|Voice=Mid`, `ADJS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADJS__Degree=Sup`, `ADJ__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADJ__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADJ__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADJ__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADJ__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADJ__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADJ__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADJ__Degree=Cmp`, `ADJ__Degree=Pos`, `ADV`, `ADV-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADVP`, `ADVR`, `ADVR-1`, `ADVR__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADVR__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADVR__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADVR__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADVR__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADVR__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADVR__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADVR__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADVR__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADVR__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADVR__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADVR__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADVR__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADVR__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADVR__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADVR__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADVR__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADVR__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADVR__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADVR__Case=Gen\|Gender=Masc\|Number=Sing\|NumType=Card`, `ADVR__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADVR__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `ADVR__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADVR__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `ADVR__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADVR__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADVR__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADVR__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADVR__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADVR__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADVR__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADVR__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `ADVR__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `ADVR__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `ADVR__Degree=Cmp`, `ADVR__Degree=Sup`, `ADVR__Foreign=Yes`, `ADVR__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADVR__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADVR__VerbForm=Inf\|Voice=Mid`, `ADVR__VerbForm=Sup\|Voice=Act`, `ADVS`, `ADVS__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADVS__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADVS__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, 
`ADVS__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADVS__Case=Acc\|Definite=Ind\|Number=Sing`, `ADVS__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADVS__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADVS__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADVS__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADVS__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADVS__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADVS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADVS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADVS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADVS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADVS__Case=Nom\|Definite=Ind\|Number=Sing`, `ADVS__Degree=Cmp`, `ADVS__Degree=Sup`, `ADVS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `ADVS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `ADVS__VerbForm=Inf\|Voice=Mid`, `ADVS__VerbForm=Sup\|Voice=Mid`, `ADV__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ADV__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADV__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADV__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADV__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADV__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADV__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADV__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADV__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADV__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADV__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADV__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADV__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADV__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADV__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `ADV__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `ADV__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `ADV__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ADV__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Prs`, `ADV__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `ADV__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ADV__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ADV__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADV__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing`, `ADV__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADV__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADV__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `ADV__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADV__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, 
`ADV__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADV__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADV__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADV__Case=Dat\|Definite=Ind\|Number=Sing`, `ADV__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADV__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Prs`, `ADV__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `ADV__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ADV__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADV__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `ADV__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADV__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `ADV__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADV__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADV__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADV__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADV__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADV__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ADV__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ADV__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `ADV__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `ADV__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `ADV__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ADV__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ADV__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `ADV__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `ADV__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ADV__Case=Nom\|Definite=Ind\|Number=Sing`, `ADV__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ADV__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ADV__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ADV__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `ADV__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADV__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `ADV__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ADV__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ADV__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ADV__Degree=Cmp`, `ADV__Degree=Pos`, `ADV__Degree=Sup`, `ADV__Foreign=Yes`, `ADV__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADV__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`ADV__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `ADV__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADV__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `ADV__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADV__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ADV__VerbForm=Inf\|Voice=Act`, `ADV__VerbForm=Sup\|Voice=Act`, `ALSO`, `ALSO__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `ALSO__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ALSO__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ALSO__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ALSO__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `ALSO__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `ALSO__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ALSO__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ALSO__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ALSO__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ALSO__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `ALSO__Foreign=Yes`, `ALSO__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `ALSO__VerbForm=Sup\|Voice=Act`, `BAG__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `BAG__VerbForm=Part\|Voice=Act`, `BE`, `BEDI`, `BEDI__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `BEDI__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `BEDI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `BEDI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `BEDI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `BEDI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `BEDI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `BEDI__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `BEDI__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Prs`, `BEDI__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `BEDI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `BEDI__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Dem`, `BEDI__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `BEDI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `BEDI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `BEDI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `BEDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `BEDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `BEDI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `BEDI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `BEDI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `BEDI__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `BEDI__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `BEDI__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `BEDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `BEDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, 
`BEDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `BEDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEDI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDI__VerbForm=Inf\|Voice=Act`, `BEDS__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `BEDS__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `BEDS__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `BEDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `BEI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `BEI__Degree=Cmp`, `BEI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__Mood=Imp\|VerbForm=Inf`, `BEI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEI__VerbForm=Sup\|Voice=Act`, `BEN`, `BEN__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `BEN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `BEN__VerbForm=Sup\|Voice=Act`, `BEPI`, `BEPI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `BEPI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `BEPI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `BEPI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `BEPI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `BEPI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `BEPI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `BEPI__Mood=Ind\|Tense=Pres`, `BEPI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS`, `BEPS__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `BEPS__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `BEPS__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, 
`BEPS__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `BEPS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `BEPS__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `BEPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `BEPS__VerbForm=Sup\|Voice=Act`, `BE__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `BE__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `BE__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `BE__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `BE__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `BE__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `BE__VerbForm=Inf\|Voice=Act`, `BE__VerbForm=Sup\|Voice=Act`, `C`, `CONJ`, `CONJ-1`, `CONJ-1__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `CONJ-1__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `CONJ-1__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `CONJ-1__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `CONJ-1__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `CONJ-1__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `CONJ-1__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `CONJ-1__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `CONJ-2`, `CONJ-2__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `CONJ-2__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `CONJ-3`, `CONJ-3__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `CONJ-3__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `CONJ-4`, `CONJ-4__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `CONJ-4__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `CONJ-5__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `CONJ-6__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `CONJ-7`, `CONJ-8`, `CONJ-9`, `CONJ__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `CONJ__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `CONJ__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `CONJ__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `CONJ__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `CONJ__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `CONJ__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing`, `CONJ__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `CONJ__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `CONJ__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `CONJ__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `CONJ__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `CONJ__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `CONJ__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `CONJ__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `CONJ__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `CONJ__Foreign=Yes`, 
`CONJ__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `CONJ__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `C__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `C__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `C__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `C__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `C__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `C__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `C__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `C__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `D-A`, `D-A__Case=Acc`, `D-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `D-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur`, `D-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `D-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `D-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `D-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `D-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `D-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `D-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `D-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `D-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `D-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `D-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `D-A__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `D-A__Case=Acc\|Gender=Fem\|Number=Plur`, `D-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Prs`, `D-A__Case=Acc\|Gender=Fem\|Number=Sing`, `D-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `D-A__Case=Acc\|Gender=Masc\|Number=Plur`, `D-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `D-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `D-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `D-A__Case=Acc\|Gender=Masc\|Number=Sing`, `D-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-A__Case=Acc\|Gender=Neut\|Number=Plur`, `D-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `D-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `D-A__Case=Acc\|Gender=Neut\|Number=Sing`, `D-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `D-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `D-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `D-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `D-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `D-A__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `D-A__Case=Dat\|Gender=Neut\|Number=Sing`, `D-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `D-A__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `D-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `D-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `D-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `D-A__Case=Gen\|Gender=Masc\|Number=Plur`, `D-A__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `D-A__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `D-A__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-A__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Dem`, `D-A__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `D-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `D-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `D-A__Case=Nom\|Gender=Fem\|Number=Plur`, 
`D-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Prs`, `D-A__Case=Nom\|Gender=Fem\|Number=Sing`, `D-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-A__Case=Nom\|Gender=Masc\|Number=Sing`, `D-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-A__Case=Nom\|Gender=Neut\|Number=Plur`, `D-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `D-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Prs`, `D-A__Case=Nom\|Gender=Neut\|Number=Sing`, `D-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `D-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `D-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `D-A__Degree=Sup`, `D-A__Foreign=Yes`, `D-A__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `D-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `D-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `D-A__VerbForm=Inf\|Voice=Act`, `D-D`, `D-D__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Prs`, `D-D__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-D__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `D-D__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `D-D__Case=Dat`, `D-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `D-D__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing`, `D-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `D-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `D-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `D-D__Case=Dat\|Gender=Fem\|Number=Plur`, `D-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Prs`, `D-D__Case=Dat\|Gender=Fem\|Number=Sing`, `D-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-D__Case=Dat\|Gender=Masc\|Number=Plur`, `D-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Dem`, `D-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Prs`, `D-D__Case=Dat\|Gender=Masc\|Number=Sing`, `D-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-D__Case=Dat\|Gender=Neut\|Number=Plur`, `D-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Dem`, `D-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Prs`, `D-D__Case=Dat\|Gender=Neut\|Number=Sing`, `D-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Dem`, `D-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `D-D__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `D-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `D-D__Case=Gen\|Number=Plur\|Person=1\|PronType=Prs`, `D-D__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-D__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-D__Foreign=Yes`, `D-G`, `D-G__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-G__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-G__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `D-G__Case=Gen`, `D-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `D-G__Case=Gen\|Gender=Fem\|Number=Plur`, `D-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Prs`, `D-G__Case=Gen\|Gender=Fem\|Number=Sing`, `D-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-G__Case=Gen\|Gender=Masc\|Number=Plur`, `D-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `D-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `D-G__Case=Gen\|Gender=Masc\|Number=Sing`, `D-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-G__Case=Gen\|Gender=Neut\|Number=Plur`, `D-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Dem`, 
`D-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Prs`, `D-G__Case=Gen\|Gender=Neut\|Number=Sing`, `D-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Dem`, `D-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `D-G__Case=Nom\|Gender=Fem\|Number=Plur`, `D-G__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-G__Degree=Cmp`, `D-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `D-G__VerbForm=Inf\|Voice=Act`, `D-N`, `D-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `D-N__Case=Acc\|Gender=Fem\|Number=Plur`, `D-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Prs`, `D-N__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-N__Case=Acc\|Gender=Masc\|Number=Sing`, `D-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `D-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `D-N__Case=Acc\|Gender=Neut\|Number=Sing`, `D-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `D-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `D-N__Case=Dat\|Gender=Neut\|Number=Sing`, `D-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `D-N__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-N__Case=Nom`, `D-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `D-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `D-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `D-N__Case=Nom\|Gender=Fem\|Number=Plur`, `D-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `D-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Prs`, `D-N__Case=Nom\|Gender=Fem\|Number=Sing`, `D-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `D-N__Case=Nom\|Gender=Masc\|Number=Plur`, `D-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `D-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Prs`, `D-N__Case=Nom\|Gender=Masc\|Number=Sing`, `D-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `D-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `D-N__Case=Nom\|Gender=Neut\|Number=Plur`, `D-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `D-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Prs`, `D-N__Case=Nom\|Gender=Neut\|Number=Sing`, `D-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `D-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `D-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `D-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `D-N__Foreign=Yes`, `D-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `D-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `D-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `D-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `D-N__VerbForm=Inf\|Voice=Act`, `DAG__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DAG__VerbForm=Part\|Voice=Act`, `DAN`, `DAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `DAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `DAN-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `DAN-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `DAN-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `DAN-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `DAN-A__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DAN-A__VerbForm=Sup\|Voice=Act`, `DAN-D`, `DAN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `DAN-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, 
`DAN-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `DAN-D__Foreign=Yes`, `DAN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `DAN__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `DAN__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `DAN__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DAN__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `DAN__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `DAN__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `DAN__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `DAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `DAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `DAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `DAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `DAN__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `DAN__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `DAN__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `DAN__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DAN__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `DAN__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DAN__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `DAN__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DAN__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `DAN__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DAN__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `DAN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DAN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DAN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DAN__Tense=Past\|VerbForm=Part`, `DAN__VerbForm=Sup\|Voice=Act`, `DODI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DODI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DODI__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DODI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DODI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `DODI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `DODI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `DODI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DODI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `DODI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `DODI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `DODS__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Prs`, `DODS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`DODS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DODS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DODS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DOG__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DOI`, `DOI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DOI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `DOI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DOI__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DOI__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DOI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DOI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOI__VerbForm=Inf\|Voice=Act`, `DOI__VerbForm=Sup\|Voice=Act`, `DON__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `DON__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DON__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DON__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `DON__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `DON__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DON__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DON__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DON__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DON__VerbForm=Sup\|Voice=Act`, `DON__VerbForm=Sup\|Voice=Mid`, `DOPI__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `DOPI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `DOPI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `DOPI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `DOPI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `DOPI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `DOPI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `DOPI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `DOPI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `DOPI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `DOPI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `DOPI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `DOPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, 
`DOPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `DOPI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `DOPI__VerbForm=Inf\|Voice=Act`, `DOPI__VerbForm=Inf\|Voice=Mid`, `DOPS__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `DOPS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `DOPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DOPS__VerbForm=Inf\|Voice=Act`, `DO__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DO__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `DO__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `DO__VerbForm=Inf\|Voice=Act`, `DO__VerbForm=Inf\|Voice=Mid`, `DO__VerbForm=Sup\|Voice=Mid`, `ES__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ES__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `ES__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `ES__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `FOREIGN__Foreign=Yes`, `FP`, `FP-1`, `FP-A`, `FP-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `FP-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `FP-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-A__Case=Acc\|Gender=Fem\|Number=Sing\|NumType=Card`, `FP-A__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Card`, `FP-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FP-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-A__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `FP-D`, `FP-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `FP-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FP-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `FP-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-D__Case=Dat\|Gender=Masc\|Number=Sing\|NumType=Card`, `FP-D__Case=Dat\|Gender=Neut\|Number=Sing\|NumType=Card`, `FP-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-G__Case=Gen\|Gender=Masc\|Number=Sing\|NumType=Card`, `FP-N`, `FP-N-6__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FP-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `FP-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, 
`FP-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FP-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `FP-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `FP-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FP-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `FP-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP-N__Case=Nom\|Gender=Fem\|Number=Plur\|NumType=Card`, `FP-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `FP-N__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `FP-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `FP__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FP__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `FP__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `FP__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `FP__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FP__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `FP__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FP__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `FW`, `FW-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `FW__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `FW__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `FW__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `FW__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `FW__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `FW__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `FW__Case=Acc\|Definite=Ind\|Number=Sing`, `FW__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur`, `FW__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `FW__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FW__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, `FW__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `FW__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `FW__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FW__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `FW__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `FW__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `FW__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `FW__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `FW__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `FW__Case=Dat\|Definite=Ind\|Number=Sing`, `FW__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `FW__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur`, `FW__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `FW__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `FW__Case=Gen\|Definite=Def\|Gender=Neut\|Number=Sing`, `FW__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FW__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `FW__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `FW__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `FW__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `FW__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `FW__Case=Gen\|Definite=Ind\|Number=Sing`, `FW__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `FW__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FW__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `FW__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `FW__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, 
`FW__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `FW__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FW__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `FW__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `FW__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `FW__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `FW__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `FW__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `FW__Case=Nom\|Definite=Ind\|Number=Sing`, `FW__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `FW__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `FW__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `FW__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `FW__Foreign=Yes`, `FW__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `FW__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `FW__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `FW__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `FW__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `FW__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `FW__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `FW__NumType=Card`, `FW__VerbForm=Inf\|Voice=Act`, `G__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `HAG__VerbForm=Part\|Voice=Act`, `HAN`, `HAN__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `HAN__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `HAN__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `HAN__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `HAN__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `HAN__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `HAN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `HAN__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HAN__VerbForm=Sup\|Voice=Act`, `HV`, `HVDI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `HVDI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `HVDI__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `HVDI__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `HVDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `HVDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `HVDI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVDI__VerbForm=Inf\|Voice=Act`, `HVDS__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `HVDS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVDS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`HVDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVDS__VerbForm=Inf\|Voice=Act`, `HVI`, `HVI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `HVI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `HVI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVI__VerbForm=Sup\|Voice=Act`, `HVN__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `HVN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `HVN__VerbForm=Sup\|Voice=Act`, `HVPI`, `HVPI__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `HVPI__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `HVPI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `HVPI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `HVPI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `HVPI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `HVPI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `HVPI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `HVPI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `HVPI__Case=Nom\|Definite=Ind\|Number=Sing`, `HVPI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `HVPI__Foreign=Yes`, `HVPI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `HVPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPI__VerbForm=Inf\|Voice=Act`, `HVPI__VerbForm=Sup\|Voice=Act`, `HVPS`, `HVPS__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `HVPS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `HVPS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, 
`HVPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HVPS__VerbForm=Inf\|Voice=Act`, `HV__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `HV__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `HV__VerbForm=Inf\|Voice=Act`, `HV__VerbForm=Inf\|Voice=Mid`, `INTJ`, `INTJ__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `INTJ__Case=Nom\|Definite=Ind\|Number=Sing`, `INTJ__Foreign=Yes`, `INTJ__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `INTJ__VerbForm=Sup\|Voice=Act`, `IP-INF__VerbForm=Inf\|Voice=Act`, `LB`, `M-D`, `MAG`, `MAG__VerbForm=Part\|Voice=Act`, `MD`, `MDDI`, `MDDI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `MDDI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `MDDI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `MDDI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDDI__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `MDDI__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `MDDI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `MDDI__Mood=Ind\|Tense=Past`, `MDDI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDI__VerbForm=Inf\|Voice=Act`, `MDDI__VerbForm=Sup\|Voice=Act`, `MDDS`, `MDDS__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `MDDS__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `MDDS__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDDS__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDDS__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `MDDS__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `MDDS__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `MDDS__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDDS__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDDS__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDDS__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `MDDS__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `MDDS__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `MDDS__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `MDDS__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, 
`MDDS__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDDS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDDS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `MDDS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `MDDS__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `MDDS__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `MDDS__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `MDDS__Foreign=Yes`, `MDDS__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `MDDS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDDS__Mood=Sub\|Tense=Past`, `MDDS__VerbForm=Inf\|Voice=Act`, `MDDS__VerbForm=Sup\|Voice=Act`, `MDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `MDN__VerbForm=Sup\|Voice=Act`, `MDPI`, `MDPI__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `MDPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `MDPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `MDPI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `MDPI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDPI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `MDPI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `MDPI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `MDPI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `MDPI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDPI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `MDPI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDPI__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `MDPI__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `MDPI__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDPI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `MDPI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `MDPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `MDPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, 
`MDPI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDPI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDPI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `MDPI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `MDPI__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `MDPI__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `MDPI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `MDPI__Foreign=Yes`, `MDPI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `MDPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `MDPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `MDPI__Mood=Ind\|Tense=Pres`, `MDPI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPI__VerbForm=Inf\|Voice=Act`, `MDPI__VerbForm=Sup\|Voice=Act`, `MDPS`, `MDPS__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `MDPS__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `MDPS__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDPS__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `MDPS__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDPS__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `MDPS__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDPS__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `MDPS__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `MDPS__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MDPS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `MDPS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MDPS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `MDPS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `MDPS__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `MDPS__Foreign=Yes`, `MDPS__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, 
`MDPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MDPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MDPS__VerbForm=Inf\|Voice=Act`, `MD__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MD__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MD__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MD__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MD__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `MD__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `MD__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MD__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `MD__Foreign=Yes`, `MD__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MD__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MD__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MD__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MD__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MD__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MD__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `MD__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `MD__VerbForm=Inf\|Voice=Act`, `MD__VerbForm=Part\|Voice=Act`, `MS-N__Degree=Sup`, `N`, `N-A`, `N-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `N-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `N-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `N-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `N-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `N-A__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur`, `N-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `N-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, 
`N-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-A__Case=Acc\|Definite=Ind\|Number=Sing`, `N-A__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `N-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `N-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Prs`, `N-A__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `N-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Prs`, `N-A__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Card`, `N-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `N-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-A__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-A__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-A__Case=Dat\|Definite=Ind\|Number=Sing`, `N-A__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Prs`, `N-A__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `N-A__Case=Dat\|Gender=Neut\|Number=Plur\|NumType=Card`, `N-A__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `N-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-A__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Plur`, `N-A__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-A__Case=Gen\|Definite=Ind\|Number=Sing`, `N-A__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `N-A__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `N-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `N-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-A__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-A__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-A__Case=Nom\|Definite=Ind\|Number=Sing`, 
`N-A__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `N-A__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `N-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `N-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `N-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `N-A__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `N-A__Foreign=Yes`, `N-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-A__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-A__NumType=Card`, `N-A__VerbForm=Inf\|Voice=Act`, `N-A__VerbForm=Inf\|Voice=Mid`, `N-A__VerbForm=Sup\|Voice=Act`, `N-D`, `N-D__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-D__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-D__Case=Acc\|Definite=Ind\|Number=Sing`, `N-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-D__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-D__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-D__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Plur`, `N-D__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-D__Case=Dat\|Definite=Ind\|Number=Sing`, `N-D__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-D__Case=Dat\|Gender=Fem\|Number=Plur\|NumType=Card`, `N-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Prs`, `N-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `N-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `N-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `N-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `N-D__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `N-D__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `N-D__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-D__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-D__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing`, 
`N-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-D__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-D__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-D__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-D__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-D__Case=Gen\|Definite=Ind\|Number=Sing`, `N-D__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Prs`, `N-D__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `N-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `N-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-D__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-D__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-D__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-D__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `N-D__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `N-D__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `N-D__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `N-D__Degree=Cmp`, `N-D__Foreign=Yes`, `N-D__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-D__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-D__VerbForm=Inf\|Voice=Act`, `N-D__VerbForm=Inf\|Voice=Mid`, `N-D__VerbForm=Part\|Voice=Act`, `N-D__VerbForm=Sup\|Voice=Act`, `N-G`, `N-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-G__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur`, `N-G__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-G__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-G__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, 
`N-G__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-G__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-G__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-G__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-G__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `N-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-G__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `N-G__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-G__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-G__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-G__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-G__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-G__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur`, `N-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-G__Case=Gen\|Definite=Def\|Gender=Neut\|Number=Plur`, `N-G__Case=Gen\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-G__Case=Gen\|Definite=Ind\|Number=Sing`, `N-G__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Prs`, `N-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `N-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Prs`, `N-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `N-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `N-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-G__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, 
`N-G__Case=Nom\|Definite=Ind\|Number=Sing`, `N-G__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-G__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-G__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-G__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `N-G__Foreign=Yes`, `N-G__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-G__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-G__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-G__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-G__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-G__VerbForm=Inf\|Voice=Act`, `N-N`, `N-N__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-N__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `N-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `N-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `N-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-N__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `N-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `N-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-N__Case=Gen\|Definite=Ind\|Number=Sing`, `N-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `N-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `N-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-N__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `N-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur`, `N-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `N-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `N-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `N-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, 
`N-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N-N__Case=Nom\|Definite=Ind\|Number=Sing`, `N-N__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `N-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `N-N__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `N-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Prs`, `N-N__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `N-N__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `N-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `N-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `N-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `N-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `N-N__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `N-N__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `N-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `N-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `N-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `N-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `N-N__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `N-N__Case=Nom\|Number=Plur\|Person=2\|PronType=Prs`, `N-N__Case=Nom\|Number=Sing\|Person=1\|PronType=Prs`, `N-N__Case=Nom\|Number=Sing\|Person=2\|PronType=Prs`, `N-N__Foreign=Yes`, `N-N__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `N-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `N-N__NumType=Frac`, `N-N__VerbForm=Inf\|Voice=Act`, `N-N__VerbForm=Part\|Voice=Act`, `N-N__VerbForm=Sup\|Voice=Act`, `NEG`, `NEG-1`, `NEG-2`, `NEG-3__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NEG__Foreign=Yes`, `NP-NPR__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NP-SBJ-1`, `NPR-1__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-A`, `NPR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `NPR-A__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `NPR-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NPR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, 
`NPR-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Acc\|Definite=Ind\|Number=Sing`, `NPR-A__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `NPR-A__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Card`, `NPR-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Dat\|Definite=Ind\|Number=Sing`, `NPR-A__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NPR-A__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `NPR-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Gen\|Definite=Ind\|Number=Sing`, `NPR-A__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `NPR-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-A__Case=Nom\|Definite=Ind\|Number=Sing`, `NPR-A__Case=Nom\|Number=Sing\|Person=1\|PronType=Prs`, `NPR-A__Foreign=Yes`, `NPR-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NPR-A__NumType=Ord`, `NPR-A__VerbForm=Inf\|Voice=Act`, `NPR-A__VerbForm=Sup\|Voice=Act`, `NPR-D`, `NPR-D__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `NPR-D__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NPR-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Acc\|Definite=Ind\|Number=Sing`, `NPR-D__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `NPR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, 
`NPR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NPR-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Dat\|Definite=Ind\|Number=Sing`, `NPR-D__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `NPR-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-D__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-D__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Gen\|Definite=Ind\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-D__Case=Nom\|Definite=Ind\|Number=Sing`, `NPR-D__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `NPR-D__Foreign=Yes`, `NPR-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NPR-D__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NPR-D__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR-G`, `NPR-G__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `NPR-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Acc\|Definite=Ind\|Number=Sing`, `NPR-G__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NPR-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Dat\|Definite=Ind\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, 
`NPR-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Plur`, `NPR-G__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur`, `NPR-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Def\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Gen\|Definite=Ind\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NPR-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NPR-G__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-G__Case=Nom\|Definite=Ind\|Number=Sing`, `NPR-G__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-G__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-G__Case=Nom\|Gender=Masc\|Number=Sing`, `NPR-G__Foreign=Yes`, `NPR-G__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NPR-G__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR-G__VerbForm=Inf\|Voice=Act`, `NPR-N`, `NPR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Acc\|Definite=Ind\|Number=Sing`, `NPR-N__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Acc\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Dat\|Definite=Ind\|Number=Sing`, `NPR-N__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `NPR-N__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur`, 
`NPR-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Gen\|Definite=Ind\|Number=Sing`, `NPR-N__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Prs`, `NPR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NPR-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `NPR-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NPR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPR-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPR-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPR-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPR-N__Case=Nom\|Definite=Ind\|Number=Sing`, `NPR-N__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPR-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPR-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `NPR-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `NPR-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Prs`, `NPR-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `NPR-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `NPR-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `NPR-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `NPR-N__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `NPR-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `NPR-N__Case=Nom\|Number=Plur\|Person=2\|PronType=Prs`, `NPR-N__Case=Nom\|Number=Sing\|Person=1\|PronType=Prs`, `NPR-N__Case=Nom\|Number=Sing\|Person=2\|PronType=Prs`, `NPR-N__Foreign=Yes`, `NPR-N__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NPR-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR-N__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR-N__NumType=Card`, `NPR-N__VerbForm=Inf\|Voice=Act`, `NPR-N__VerbForm=Sup\|Voice=Act`, `NPR-S__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR-V__Foreign=Yes`, `NPRO-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-A`, `NPRS-A__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur`, `NPRS-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur`, `NPRS-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, 
`NPRS-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NPRS-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPRS-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPRS-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPRS-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPRS-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPRS-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPRS-A__Case=Nom\|Definite=Ind\|Number=Sing`, `NPRS-A__Foreign=Yes`, `NPRS-D`, `NPRS-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-D__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Plur`, `NPRS-D__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Plur`, `NPRS-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPRS-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPRS-D__Case=Dat\|Definite=Ind\|Number=Sing`, `NPRS-D__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-D__Foreign=Yes`, `NPRS-G`, `NPRS-G__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPRS-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NPRS-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NPRS-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPRS-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur`, `NPRS-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing`, `NPRS-G__Case=Gen\|Definite=Def\|Gender=Neut\|Number=Plur`, `NPRS-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPRS-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPRS-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPRS-G__Case=Gen\|Definite=Ind\|Number=Sing`, `NPRS-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-G__Foreign=Yes`, `NPRS-N`, `NPRS-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPRS-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPRS-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NPRS-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-N__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Plur`, 
`NPRS-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur`, `NPRS-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `NPRS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NPRS-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NPRS-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NPRS-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPRS-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NPRS-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NPRS-N__Foreign=Yes`, `NPRS-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NPR__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NPR__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-A`, `NS-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `NS-A__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur`, `NS-A__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `NS-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur`, `NS-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `NS-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `NS-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `NS-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NS-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NS-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NS-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-A__Case=Acc\|Definite=Ind\|Number=Plur`, `NS-A__Case=Acc\|Definite=Ind\|Number=Sing`, `NS-A__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-A__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `NS-A__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `NS-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `NS-A__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Card`, `NS-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NS-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-A__Case=Dat\|Definite=Ind\|Number=Sing`, `NS-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-A__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Plur`, `NS-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-A__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `NS-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Plur`, `NS-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `NS-A__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur`, `NS-A__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `NS-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, 
`NS-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-A__Case=Nom\|Definite=Ind\|Number=Sing`, `NS-A__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `NS-A__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `NS-A__Degree=Cmp`, `NS-A__Foreign=Yes`, `NS-A__NumType=Ord`, `NS-A__VerbForm=Inf\|Voice=Act`, `NS-D`, `NS-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-D__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Plur`, `NS-D__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Plur`, `NS-D__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, `NS-D__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Plur`, `NS-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NS-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NS-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NS-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-D__Case=Dat\|Definite=Ind\|Number=Sing`, `NS-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `NS-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `NS-D__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-D__Foreign=Yes`, `NS-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NS-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NS-D__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NS-D__VerbForm=Sup\|Voice=Act`, `NS-G`, `NS-G__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `NS-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-G__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-G__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `NS-G__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `NS-G__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur`, `NS-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NS-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, 
`NS-G__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-G__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-G__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NS-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-G__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Plur`, `NS-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur`, `NS-G__Case=Gen\|Definite=Def\|Gender=Neut\|Number=Plur`, `NS-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NS-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-G__Case=Gen\|Definite=Ind\|Number=Plur`, `NS-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `NS-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Prs`, `NS-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `NS-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `NS-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-G__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `NS-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-G__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NS-G__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-G__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-G__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NS-G__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card`, `NS-G__Foreign=Yes`, `NS-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NS-G__VerbForm=Inf\|Voice=Act`, `NS-N`, `NS-N__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur`, `NS-N__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `NS-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-N__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `NS-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-N__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `NS-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-N__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Plur`, `NS-N__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `NS-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur`, `NS-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `NS-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, 
`NS-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `NS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NS-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS-N__Case=Nom\|Definite=Ind\|Number=Plur`, `NS-N__Case=Nom\|Definite=Ind\|Number=Sing`, `NS-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NS-N__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `NS-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Prs`, `NS-N__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `NS-N__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card`, `NS-N__Case=Nom\|Number=Plur\|Person=2\|PronType=Prs`, `NS-N__Foreign=Yes`, `NS-N__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NS-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NS-N__NumType=Ord`, `NS__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NS__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NS__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur`, `NUM`, `NUM-1__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `NUM-A`, `NUM-A__Case=Acc`, `NUM-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NUM-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NUM-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NUM-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NUM-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NUM-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NUM-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-A__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-A__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Card`, `NUM-A__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-A__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Card`, `NUM-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `NUM-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Dat\|Definite=Ind\|Number=Sing`, `NUM-A__Case=Dat\|Gender=Fem\|Number=Sing\|NumType=Card`, `NUM-A__Case=Dat\|Gender=Masc\|Number=Plur\|NumType=Card`, 
`NUM-A__Case=Dat\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NUM-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-A__Case=Gen\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-A__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-A__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `NUM-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NUM-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NUM-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NUM-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-A__Case=Nom\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-A__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-A__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `NUM-A__Foreign=Yes`, `NUM-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NUM-A__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NUM-A__NumType=Card`, `NUM-A__NumType=Ord`, `NUM-D`, `NUM-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NUM-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NUM-D__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-D__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-D__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-D__Case=Dat`, `NUM-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NUM-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NUM-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NUM-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NUM-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NUM-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `NUM-D__Case=Dat\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-D__Case=Dat\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-D__Case=Dat\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-D__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-D__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `NUM-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NUM-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-D__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-D__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `NUM-D__Foreign=Yes`, `NUM-D__NumType=Card`, `NUM-G`, `NUM-G__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-G__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-G__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Card`, `NUM-G__Case=Dat\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-G__Case=Gen`, `NUM-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NUM-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, 
`NUM-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-G__Case=Gen\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-G__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `NUM-G__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `NUM-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-G__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-G__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-G__Foreign=Yes`, `NUM-G__NumType=Card`, `NUM-N`, `NUM-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NUM-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-N__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-N__Case=Acc\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-N__Case=Acc\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-N__Case=Dat\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-N__Case=Dat\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-N__Case=Gen\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-N__Case=Gen\|Gender=Masc\|Number=Sing\|NumType=Card`, `NUM-N__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-N__Case=Nom`, `NUM-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `NUM-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NUM-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `NUM-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `NUM-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `NUM-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `NUM-N__Case=Nom\|Definite=Ind\|Number=Sing`, `NUM-N__Case=Nom\|Gender=Fem\|Number=Plur\|NumType=Card`, `NUM-N__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Int`, `NUM-N__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM-N__Case=Nom\|Gender=Neut\|Number=Sing\|NumType=Card`, `NUM-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `NUM-N__Foreign=Yes`, `NUM-N__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NUM-N__NumType=Card`, `NUM-N__NumType=Ord`, 
`NUM__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `NUM__Case=Gen\|Gender=Neut\|Number=Plur\|NumType=Card`, `NUM__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `NUM__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `NUM__Case=Nom\|Definite=Ind\|Number=Sing`, `NUM__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ONE-A`, `ONE-A__Case=Acc`, `ONE-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ONE-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ONE-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-A__Case=Acc\|Gender=Fem\|Number=Sing\|NumType=Card`, `ONE-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ONE-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `ONE-A__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Card`, `ONE-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `ONE-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ONE-A__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Card`, `ONE-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ONE-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-A__Case=Dat\|Gender=Fem\|Number=Sing\|NumType=Card`, `ONE-A__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `ONE-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ONE-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ONE-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-A__Case=Nom\|Gender=Fem\|Number=Plur\|NumType=Card`, `ONE-A__Case=Nom\|Gender=Fem\|Number=Sing\|NumType=Card`, `ONE-A__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `ONE-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ONE-A__Case=Nom\|Gender=Neut\|Number=Sing\|NumType=Card`, `ONE-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ONE-A__NumType=Card`, `ONE-D`, `ONE-D__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `ONE-D__Case=Dat`, `ONE-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ONE-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `ONE-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ONE-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ONE-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ONE-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, 
`ONE-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `ONE-D__Case=Dat\|Gender=Fem\|Number=Sing\|NumType=Card`, `ONE-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ONE-D__Case=Dat\|Gender=Masc\|Number=Sing\|NumType=Card`, `ONE-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ONE-D__Case=Dat\|Gender=Neut\|Number=Sing\|NumType=Card`, `ONE-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ONE-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `ONE-D__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ONE-D__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ONE-G`, `ONE-G__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ONE-G__Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing`, `ONE-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ONE-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `ONE-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `ONE-G__Case=Gen\|Gender=Fem\|Number=Sing\|NumType=Card`, `ONE-G__Case=Gen\|Gender=Masc\|Number=Sing\|NumType=Card`, `ONE-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ONE-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `ONE-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ONE-N`, `ONE-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-N__Case=Acc\|Gender=Masc\|Number=Sing\|NumType=Card`, `ONE-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ONE-N__Case=Acc\|Gender=Neut\|Number=Sing\|NumType=Card`, `ONE-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ONE-N__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Ind`, `ONE-N__Case=Nom`, `ONE-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ONE-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `ONE-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `ONE-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `ONE-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `ONE-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `ONE-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `ONE-N__Case=Nom\|Gender=Fem\|Number=Sing\|NumType=Card`, `ONE-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `ONE-N__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `ONE-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `ONE-N__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card`, `ONE-N__Case=Nom\|Gender=Neut\|Number=Sing\|NumType=Card`, `ONE-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `ONE-N__NumType=Card`, `ONES-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHER-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `OTHER-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `OTHER-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHER-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHER-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHER-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `OTHER-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHER-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `OTHER-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `OTHER-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHER-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `OTHER-D`, `OTHER-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, 
`OTHER-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `OTHER-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `OTHER-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `OTHER-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHER-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHER-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHER-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `OTHER-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHER-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `OTHER-D__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `OTHER-G`, `OTHER-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHER-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHER-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `OTHER-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHER-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `OTHER-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `OTHER-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHER-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `OTHER-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `OTHER-N__Case=Nom\|Definite=Ind\|Number=Sing`, `OTHER-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHER-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHER-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `OTHER-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHER-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `OTHER-WPRO__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `OTHERS-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `OTHERS-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `OTHERS-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHERS-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHERS-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `OTHERS-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHERS-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHERS-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHERS-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHERS-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHERS-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHERS-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHERS-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `OTHERS-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHERS-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `OTHERS-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHERS-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHERS-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHERS-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHERS-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHERS-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `OTHERS-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `OTHERS-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `OTHERS-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `OTHER__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `P`, `POR-A`, `POR-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `POS-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-A`, `PRO-A__Case=Acc`, `PRO-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `PRO-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, 
`PRO-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `PRO-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `PRO-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `PRO-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `PRO-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `PRO-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-A__Case=Acc\|Definite=Ind\|Number=Sing`, `PRO-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `PRO-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `PRO-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `PRO-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `PRO-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `PRO-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `PRO-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `PRO-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `PRO-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `PRO-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Acc\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-A__Case=Acc\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-A__Case=Acc\|Number=Sing\|Person=1\|PronType=Prs`, `PRO-A__Case=Acc\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `PRO-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-A__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Dat\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-A__Case=Dat\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-A__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-A__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `PRO-A__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `PRO-A__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Gen\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-A__Case=Gen\|Number=Sing\|Person=1\|PronType=Prs`, `PRO-A__Case=Gen\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Plur`, `PRO-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `PRO-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, 
`PRO-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `PRO-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `PRO-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-A__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `PRO-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `PRO-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `PRO-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `PRO-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Plur\|NumType=Card`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-A__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `PRO-A__Foreign=Yes`, `PRO-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-A__VerbForm=Inf\|Voice=Act`, `PRO-D`, `PRO-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `PRO-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `PRO-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-D__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-D__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-D__Case=Acc\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-D__Case=Acc\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-D__Case=Dat`, `PRO-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `PRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `PRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `PRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `PRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `PRO-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `PRO-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Dem`, `PRO-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `PRO-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Dem`, `PRO-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `PRO-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Dem`, `PRO-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `PRO-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Prs`, 
`PRO-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Dem`, `PRO-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `PRO-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Dem`, `PRO-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-D__Case=Dat\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-D__Case=Dat\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-D__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `PRO-D__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-D__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `PRO-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Dem`, `PRO-D__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Ind`, `PRO-D__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-D__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `PRO-D__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `PRO-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-D__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `PRO-D__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `PRO-D__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `PRO-D__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `PRO-D__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-D__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-D__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-D__Case=Nom\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-D__Foreign=Yes`, `PRO-D__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `PRO-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-D__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `PRO-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-D__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-D__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-G`, `PRO-G__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `PRO-G__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `PRO-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-G__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-G__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `PRO-G__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `PRO-G__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-G__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-G__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `PRO-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-G__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-G__Case=Dat\|Definite=Ind\|Number=Sing`, `PRO-G__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Dem`, `PRO-G__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `PRO-G__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, 
`PRO-G__Case=Dat\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-G__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `PRO-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `PRO-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Dem`, `PRO-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Ind`, `PRO-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Dem`, `PRO-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `PRO-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `PRO-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-G__Case=Gen\|Gender=Masc\|Number=Sing`, `PRO-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Dem`, `PRO-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `PRO-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Dem`, `PRO-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-G__Case=Gen\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Int`, `PRO-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-G__Case=Gen\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-G__Case=Gen\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-G__Case=Gen\|Number=Sing\|Person=1\|PronType=Prs`, `PRO-G__Case=Gen\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `PRO-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `PRO-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-G__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `PRO-G__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-G__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-G__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-G__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-G__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-G__Degree=Cmp`, `PRO-G__Foreign=Yes`, `PRO-G__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `PRO-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-G__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-G__VerbForm=Inf\|Voice=Act`, `PRO-N`, `PRO-N-YYY__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `PRO-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `PRO-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `PRO-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, 
`PRO-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `PRO-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `PRO-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `PRO-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `PRO-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `PRO-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Dat\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-N__Case=Dat\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-N__Case=Dat\|Number=Sing\|Person=1\|PronType=Prs`, `PRO-N__Case=Dat\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `PRO-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-N__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Gen\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-N__Case=Gen\|Number=Sing\|Person=1\|PronType=Prs`, `PRO-N__Case=Gen\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-N__Case=Nom`, `PRO-N__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `PRO-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `PRO-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `PRO-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `PRO-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `PRO-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `PRO-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `PRO-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `PRO-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `PRO-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `PRO-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `PRO-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `PRO-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `PRO-N__Case=Nom\|Definite=Ind\|Number=Sing`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Int`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Sing`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `PRO-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `PRO-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `PRO-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `PRO-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `PRO-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `PRO-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `PRO-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Prs`, `PRO-N__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, 
`PRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `PRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `PRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `PRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `PRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `PRO-N__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `PRO-N__Case=Nom\|Number=Plur\|Person=2\|PronType=Prs`, `PRO-N__Case=Nom\|Number=Sing\|Person=1\|PronType=Prs`, `PRO-N__Case=Nom\|Number=Sing\|Person=2\|PronType=Prs`, `PRO-N__Foreign=Yes`, `PRO-N__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-N__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `PRO-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `PRO-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `PRO-TTT-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `PRO__Case=Nom\|Number=Sing\|Person=2\|PronType=Prs`, `P__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `P__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `P__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `P__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `P__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `P__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `P__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `P__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `P__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `P__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `P__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `P__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `P__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `P__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `P__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `P__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `P__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `P__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `P__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `P__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `P__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `P__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `P__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `P__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `P__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `P__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `P__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `P__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `P__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Dem`, `P__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `P__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `P__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `P__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `P__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `P__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `P__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `P__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `P__Case=Nom\|Definite=Ind\|Number=Sing`, `P__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `P__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `P__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `P__Degree=Cmp`, `P__Degree=Sup`, `P__Foreign=Yes`, 
`P__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `P__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `P__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `P__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `P__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `P__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `P__VerbForm=Inf\|Voice=Act`, `P__VerbForm=Sup\|Voice=Act`, `Q`, `Q-A`, `Q-A__Case=Acc`, `Q-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `Q-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `Q-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-A__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `Q-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `Q-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `Q-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `Q-A__Case=Acc\|Definite=Ind\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `Q-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `Q-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `Q-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `Q-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `Q-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `Q-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Int`, `Q-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Prs`, `Q-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `Q-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-A__Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing`, `Q-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-A__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-A__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-A__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Int`, `Q-A__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `Q-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-A__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-A__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-A__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, 
`Q-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `Q-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-A__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-A__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-A__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `Q-A__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `Q-A__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `Q-A__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `Q-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `Q-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `Q-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `Q-A__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `Q-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Int`, `Q-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `Q-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-A__Degree=Cmp`, `Q-A__Degree=Sup`, `Q-A__Foreign=Yes`, `Q-A__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-A__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-A__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Q-A__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-A__NumType=Card`, `Q-A__VerbForm=Inf\|Voice=Act`, `Q-D`, `Q-D__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-D__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-D__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-D__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-D__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-D__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-D__Case=Dat`, `Q-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `Q-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-D__Case=Dat\|Definite=Ind\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, 
`Q-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `Q-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `Q-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-D__Case=Dat\|Gender=Masc\|Number=Sing\|NumType=Card`, `Q-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `Q-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Dem`, `Q-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `Q-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Int`, `Q-D__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-D__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-D__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-D__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `Q-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-D__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-D__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-D__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `Q-D__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-D__Degree=Sup`, `Q-D__Foreign=Yes`, `Q-D__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-D__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Q-D__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-G`, `Q-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-G__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-G__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-G__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-G__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-G__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `Q-G__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-G__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-G__Case=Gen`, `Q-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `Q-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, 
`Q-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `Q-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Int`, `Q-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `Q-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Int`, `Q-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-G__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `Q-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-G__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-G__Foreign=Yes`, `Q-G__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-G__VerbForm=Inf\|Voice=Act`, `Q-N`, `Q-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-N__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `Q-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-N__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `Q-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-N__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `Q-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-N__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-N__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-N__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `Q-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `Q-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-N__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `Q-N__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-N__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-N__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-N__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-N__Case=Dat\|Gender=Masc\|Number=Plur\|NumType=Card`, `Q-N__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-N__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-N__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-N__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-N__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-N__Case=Nom`, `Q-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `Q-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `Q-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `Q-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `Q-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-N__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `Q-N__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, 
`Q-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `Q-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `Q-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `Q-N__Case=Nom\|Definite=Ind\|Number=Sing`, `Q-N__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `Q-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `Q-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `Q-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `Q-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Int`, `Q-N__Case=Nom\|Gender=Masc\|Number=Plur`, `Q-N__Case=Nom\|Gender=Masc\|Number=Plur\|NumType=Card`, `Q-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `Q-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Int`, `Q-N__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `Q-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `Q-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `Q-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `Q-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Int`, `Q-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `Q-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `Q-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `Q-N__Degree=Cmp`, `Q-N__Degree=Sup`, `Q-N__Foreign=Yes`, `Q-N__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-N__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Q-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-N__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Q-N__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Q-N__NumType=Card`, `Q-N__VerbForm=Inf\|Voice=Act`, `Q-N__VerbForm=Part\|Voice=Act`, `Q-N__VerbForm=Sup\|Voice=Act`, `QR`, `QR-A`, `QR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `QR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `QR-A__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-A__Case=Acc\|Definite=Ind\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `QR-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, 
`QR-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `QR-A__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `QR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-A__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `QR-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `QR-A__Degree=Cmp`, `QR-A__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `QR-A__VerbForm=Inf\|Voice=Mid`, `QR-D`, `QR-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-D__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `QR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `QR-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-D__Case=Dat\|Definite=Ind\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `QR-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `QR-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `QR-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `QR-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `QR-D__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `QR-D__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `QR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-D__Degree=Cmp`, `QR-D__Foreign=Yes`, `QR-G__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-G__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-G__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `QR-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `QR-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-N`, `QR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `QR-N__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-N__Case=Acc\|Definite=Ind\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-N__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-N__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-N__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-N__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-N__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Prs`, `QR-N__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-N__Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, 
`QR-N__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Prs`, `QR-N__Case=Nom`, `QR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `QR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `QR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `QR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `QR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `QR-N__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QR-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `QR-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `QR-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `QR-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `QR-N__Degree=Cmp`, `QR-N__Foreign=Yes`, `QR-N__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `QR-N__VerbForm=Inf\|Voice=Act`, `QR__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR__Case=Acc\|Definite=Ind\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `QR__Degree=Cmp`, `QS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur`, `QS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-A__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `QS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `QS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `QS-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-A__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-A__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `QS-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-A__Degree=Sup`, `QS-D__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-D__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-D__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `QS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `QS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `QS-D__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-D__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-D__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-D__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `QS-G__Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-G__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, 
`QS-G__Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-G__Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `QS-G__Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `QS-G__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-G__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-N`, `QS-N__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-N__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `QS-N__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `QS-N__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-N__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur`, `QS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-N__Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `QS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `QS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `QS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `QS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `QS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `QS-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `QS-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `QS-N__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `QS-N__Degree=Sup`, `QS__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `QS__Degree=Sup`, `Q__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `Q__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `Q__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `Q__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `Q__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `Q__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `RAN__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `RDDI`, `RDDI__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `RDDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDS__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `RDDS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `RDDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDI`, `RDI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `RDN__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `RDN__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `RDN__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `RDN__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `RDN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, 
`RDN__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `RDN__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `RDN__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `RDN__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `RDN__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `RDN__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `RDN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `RDN__VerbForm=Sup\|Voice=Act`, `RDPI__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `RDPI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `RDPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `RDPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `RDPI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `RDPI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `RDPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPI__VerbForm=Inf\|Voice=Act`, `RDPS__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `RDPS__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `RDPS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `RDPS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `RDPS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RDPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RD__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `RD__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RD__VerbForm=Inf\|Voice=Act`, `REP`, `RP`, `RP-2`, `RP-3`, `RPO-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Dem`, `RPX`, `RPX-3`, `RPX__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `RPX__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `RPX__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RP__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `RP__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `RP__Degree=Cmp`, `RP__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `RX`, `SUCH-A`, `SUCH-A__Case=Acc`, `SUCH-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `SUCH-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `SUCH-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `SUCH-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `SUCH-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, 
`SUCH-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `SUCH-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Dem`, `SUCH-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Dem`, `SUCH-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `SUCH-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `SUCH-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `SUCH-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `SUCH-A__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur`, `SUCH-A__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Ind`, `SUCH-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `SUCH-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-A__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `SUCH-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `SUCH-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `SUCH-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `SUCH-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `SUCH-D__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `SUCH-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `SUCH-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `SUCH-D__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `SUCH-D__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-D__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH-D__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Dem`, `SUCH-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Dem`, `SUCH-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Dem`, `SUCH-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Dem`, `SUCH-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Dem`, `SUCH-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Dem`, `SUCH-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `SUCH-D__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `SUCH-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `SUCH-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `SUCH-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH-G__Case=Gen\|Gender=Fem\|Number=Plur\|PronType=Dem`, `SUCH-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Dem`, `SUCH-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Dem`, `SUCH-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Dem`, `SUCH-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Dem`, `SUCH-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Dem`, `SUCH-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `SUCH-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `SUCH-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `SUCH-N__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH-N__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Prs`, `SUCH-N__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `SUCH-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `SUCH-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `SUCH-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, 
`SUCH-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH-N__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `SUCH-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Dem`, `SUCH-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `SUCH-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `SUCH-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `SUCH-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `SUCH-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `SUCH__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `SUCH__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `SUCH__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `SUCH__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `SUCH__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `SUCH__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `SUCH__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `SUCH__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `SUCH__Foreign=Yes`, `TO`, `VAG`, `VAG-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAG-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VAG-A__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAG-A__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG-A__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG-A__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-A__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-A__Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAG-A__Case=Acc\|Tense=Pres\|VerbForm=Part`, `VAG-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-A__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-A__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-A__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VAG-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAG-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-A__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG-A__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-A__VerbForm=Part\|Voice=Act`, `VAG-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAG-D__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-D__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG-D__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-D__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAG-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAG-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAG-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VAG-D__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAG-D__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing`, 
`VAG-D__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG-D__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-D__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAG-D__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAG-D__VerbForm=Part\|Voice=Act`, `VAG-G__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-G__Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing`, `VAG-G__Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG-G__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG-G__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-G__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAG-G__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-G__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-G__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAG-G__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG-N__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-N__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG-N__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG-N__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAG__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VAG__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAG__Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `VAG__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAG__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VAG__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAG__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG__Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAG__Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG__Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG__Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAG__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAG__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VAG__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAG__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAG__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAG__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAG__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAG__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAG__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAG__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAG__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VAG__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAG__VerbForm=Part\|Voice=Act`, `VAN`, `VAN-A`, 
`VAN-A-4__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAN-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAN-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `VAN-A__Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Acc\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAN-A__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-A__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAN-A__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAN-A__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `VAN-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN-A__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN-A__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VAN-A__VerbForm=Inf\|Voice=Act`, `VAN-A__VerbForm=Sup\|Voice=Act`, `VAN-D`, `VAN-D__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN-D__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN-D__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing`, 
`VAN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAN-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAN-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAN-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VAN-D__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAN-D__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `VAN-D__Case=Dat\|Tense=Past\|VerbForm=Part`, `VAN-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur`, `VAN-D__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing`, `VAN-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN-D__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN-D__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN-D__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN-D__VerbForm=Part\|Voice=Act`, `VAN-G`, `VAN-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN-G__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAN-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAN-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAN-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VAN-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN-N__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `VAN__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `VAN__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `VAN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAN__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAN__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VAN__Case=Acc\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, 
`VAN__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN__Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN__Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAN__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAN__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAN__Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAN__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VAN__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `VAN__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `VAN__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `VAN__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `VAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VAN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VAN__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VAN__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VAN__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VAN__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VAN__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VAN__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VAN__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VAN__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `VAN__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VAN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VAN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `VAN__Foreign=Yes`, `VAN__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VAN__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VAN__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VAN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VAN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VAN__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`VAN__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VAN__Tense=Past\|VerbForm=Part`, `VAN__VerbForm=Inf\|Voice=Act`, `VAN__VerbForm=Inf\|Voice=Mid`, `VAN__VerbForm=Sup\|Voice=Act`, `VAN__VerbForm=Sup\|Voice=Mid`, `VB`, `VB-3__VerbForm=Inf\|Voice=Act`, `VBDI`, `VBDI__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBDI__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `VBDI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBDI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBDI__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `VBDI__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `VBDI__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBDI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBDI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBDI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBDI__Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDI__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBDI__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `VBDI__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `VBDI__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VBDI__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBDI__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBDI__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBDI__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Plur`, `VBDI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBDI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBDI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBDI__Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur`, `VBDI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBDI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBDI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBDI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBDI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBDI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBDI__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBDI__Case=Dat\|Gender=Fem\|Number=Plur\|PronType=Ind`, `VBDI__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `VBDI__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `VBDI__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDI__Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing`, `VBDI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBDI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBDI__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBDI__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBDI__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `VBDI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBDI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBDI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBDI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBDI__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `VBDI__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, 
`VBDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `VBDI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBDI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBDI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBDI__Case=Nom\|Definite=Ind\|Number=Sing`, `VBDI__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDI__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `VBDI__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `VBDI__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBDI__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBDI__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `VBDI__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBDI__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `VBDI__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBDI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBDI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `VBDI__Degree=Sup`, `VBDI__Foreign=Yes`, `VBDI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, 
`VBDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Ind\|Tense=Past`, `VBDI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDI__VerbForm=Inf\|Voice=Act`, `VBDI__VerbForm=Inf\|Voice=Mid`, `VBDI__VerbForm=Sup\|Voice=Act`, `VBDI__VerbForm=Sup\|Voice=Mid`, `VBDP__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS`, `VBDS__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDS__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBDS__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBDS__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBDS__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBDS__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBDS__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBDS__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBDS__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBDS__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBDS__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBDS__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBDS__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBDS__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBDS__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `VBDS__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBDS__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBDS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `VBDS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur`, `VBDS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `VBDS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBDS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBDS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBDS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBDS__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBDS__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `VBDS__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBDS__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `VBDS__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, 
`VBDS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBDS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBDS__NumType=Card`, `VBDS__VerbForm=Inf\|Voice=Act`, `VBDS__VerbForm=Sup\|Voice=Act`, `VBI`, `VBI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBI__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBI__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, 
`VBI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `VBI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBI__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBI__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBI__Foreign=Yes`, `VBI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBI__VerbForm=Inf\|Voice=Act`, `VBI__VerbForm=Inf\|Voice=Mid`, `VBI__VerbForm=Sup\|Voice=Act`, `VBN`, `VBN-A__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBN-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBN-A__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBN-A__Case=Acc\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBN-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBN-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBN-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBN-D__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBN-G__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBN__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `VBN__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `VBN__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `VBN__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `VBN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VBN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBN__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBN__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `VBN__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBN__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBN__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBN__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, 
`VBN__Case=Acc\|Gender=Fem\|Number=Plur\|NumType=Card`, `VBN__Case=Acc\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBN__Case=Acc\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBN__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBN__Case=Gen\|Definite=Def\|Gender=Neut\|Number=Plur`, `VBN__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBN__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `VBN__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur`, `VBN__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `VBN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VBN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBN__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBN__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBN__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBN__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBN__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBN__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBN__Case=Nom\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBN__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBN__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBN__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBN__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `VBN__Degree=Sup`, `VBN__Foreign=Yes`, `VBN__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBN__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBN__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBN__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBN__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBN__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBN__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBN__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBN__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBN__Tense=Past\|VerbForm=Part`, `VBN__VerbForm=Inf\|Voice=Act`, `VBN__VerbForm=Inf\|Voice=Mid`, `VBN__VerbForm=Sup\|Voice=Act`, `VBN__VerbForm=Sup\|Voice=Mid`, `VBPI`, `VBPI__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBPI__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPI__Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing`, `VBPI__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `VBPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VBPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBPI__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPI__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, 
`VBPI__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBPI__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBPI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBPI__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBPI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBPI__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBPI__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `VBPI__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Prs`, `VBPI__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBPI__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBPI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBPI__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBPI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBPI__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBPI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBPI__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBPI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBPI__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBPI__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Prs`, `VBPI__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Prs`, `VBPI__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Dem`, `VBPI__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `VBPI__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Prs`, `VBPI__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `VBPI__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBPI__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBPI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBPI__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBPI__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBPI__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBPI__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBPI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VBPI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBPI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBPI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBPI__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPI__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBPI__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBPI__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBPI__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBPI__Case=Nom\|Definite=Ind\|Number=Sing`, `VBPI__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VBPI__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBPI__Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur`, 
`VBPI__Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPI__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Ind`, `VBPI__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Int`, `VBPI__Case=Nom\|Gender=Fem\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBPI__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `VBPI__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBPI__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `VBPI__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Prs`, `VBPI__Case=Nom\|Gender=Masc\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBPI__Case=Nom\|Gender=Neut\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VBPI__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `VBPI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VBPI__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `VBPI__Degree=Cmp`, `VBPI__Degree=Sup`, `VBPI__Foreign=Yes`, `VBPI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Ind\|Tense=Pres`, `VBPI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`VBPI__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPI__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPI__NumType=Card`, `VBPI__VerbForm=Inf\|Voice=Act`, `VBPI__VerbForm=Inf\|Voice=Mid`, `VBPI__VerbForm=Sup\|Voice=Act`, `VBPI__VerbForm=Sup\|Voice=Mid`, `VBPS`, `VBPS__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPS__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBPS__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBPS__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBPS__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBPS__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBPS__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VBPS__Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing`, `VBPS__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VBPS__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBPS__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBPS__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBPS__Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur`, `VBPS__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Prs`, `VBPS__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur`, `VBPS__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VBPS__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPS__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing`, `VBPS__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBPS__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VBPS__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing`, `VBPS__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VBPS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VBPS__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VBPS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VBPS__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `VBPS__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VBPS__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `VBPS__Foreign=Yes`, `VBPS__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, 
`VBPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Sub\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VBPS__Mood=Sub\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VBPS__VerbForm=Inf\|Voice=Act`, `VBPS__VerbForm=Inf\|Voice=Mid`, `VBPS__VerbForm=Sup\|Voice=Act`, `VBPS__VerbForm=Sup\|Voice=Mid`, `VB__Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `VB__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VB__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VB__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur`, `VB__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VB__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `VB__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VB__Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `VB__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VB__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VB__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VB__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Prs`, `VB__Case=Acc\|Gender=Fem\|Number=Sing\|VerbForm=Part\|Voice=Act`, `VB__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Dem`, `VB__Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing`, `VB__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VB__Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VB__Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `VB__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `VB__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `VB__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `VB__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `VB__Case=Gen\|Gender=Masc\|Number=Plur\|NumType=Card`, `VB__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing`, `VB__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `VB__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VB__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `VB__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing`, `VB__Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing`, `VB__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `VB__Case=Nom\|Definite=Ind\|Number=Sing`, `VB__Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing`, `VB__Case=Nom\|Gender=Masc\|Number=Plur\|VerbForm=Part\|Voice=Act`, `VB__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Dem`, `VB__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Mid`, `VB__Degree=Sup`, `VB__Foreign=Yes`, 
`VB__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VB__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `VB__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VB__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VB__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VB__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VB__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VB__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `VB__NumType=Card`, `VB__VerbForm=Inf`, `VB__VerbForm=Inf\|Voice=Act`, `VB__VerbForm=Inf\|Voice=Mid`, `VB__VerbForm=Sup\|Voice=Act`, `VB__VerbForm=Sup\|Voice=Mid`, `VDPI__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VDPI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `VPDI__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `WADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WADJ-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `WADJ-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Dem`, `WADV`, `WADV-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WADV-D`, `WADV-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WADVP-1`, `WADVP-10__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `WADV__Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur`, `WADV__Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing`, `WADV__Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing`, `WADV__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur`, `WADV__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `WADV__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WADV__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `WADV__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WADV__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WADV__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WADV__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WADV__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WADV__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `WADV__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WADV__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WADV__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Dem`, `WADV__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `WADV__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur`, `WADV__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Int`, `WADV__Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing`, `WADV__Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing`, `WADV__Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur`, `WADV__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `WADV__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WADV__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WADV__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WADV__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WADV__Case=Nom\|Gender=Masc\|Number=Sing\|NumType=Card`, `WADV__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WADV__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Dem`, `WADV__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WADV__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WADV__Degree=Cmp`, `WADV__Mood=Imp\|Number=Plur\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WADV__Mood=Ind\|Number=Plur\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`WADV__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WADV__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WD-A`, `WD-A__Case=Acc`, `WD-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `WD-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `WD-A__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WD-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `WD-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur`, `WD-A__Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WD-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WD-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Int`, `WD-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `WD-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WD-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Int`, `WD-A__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WD-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `WD-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WD-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WD-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `WD-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WD-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WD-A__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Int`, `WD-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WD-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WD-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `WD-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `WD-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `WD-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WD-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WD-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, `WD-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WD-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WD-D__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `WD-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WD-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Int`, `WD-D__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WD-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WD-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Int`, `WD-D__Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WD-D__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WD-D__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WD-G`, `WD-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `WD-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur`, `WD-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WD-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WD-G__Case=Gen\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WD-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WD-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Int`, `WD-N`, `WD-N__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Int`, `WD-N__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Prs`, `WD-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WD-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WD-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WD-N__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WD-N__Case=Nom`, `WD-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur`, `WD-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `WD-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, `WD-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, 
`WD-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WD-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Int`, `WD-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Dem`, `WD-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WD-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Int`, `WD-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `WD-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Int`, `WD-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WD-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `WD-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WD-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Int`, `WD-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WD-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WDD__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WN-D__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WNP-2__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO`, `WPRO-1`, `WPRO-1__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-2`, `WPRO-A`, `WPRO-A__Case=Acc`, `WPRO-A__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `WPRO-A__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WPRO-A__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WPRO-A__Case=Acc\|Gender=Fem\|Number=Plur\|PronType=Ind`, `WPRO-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WPRO-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Int`, `WPRO-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `WPRO-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Int`, `WPRO-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Int`, `WPRO-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-A__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WPRO-A__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WPRO-A__Case=Dat\|Definite=Ind\|Number=Sing`, `WPRO-A__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WPRO-A__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO-A__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WPRO-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Int`, `WPRO-A__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WPRO-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-A__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `WPRO-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO-A__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-A__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-A__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-A__Mood=Ind\|Number=Sing\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `WPRO-A__Mood=Ind\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-A__VerbForm=Inf\|Voice=Act`, `WPRO-D`, `WPRO-D__Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO-D__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Dem`, `WPRO-D__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur`, `WPRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WPRO-D__Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `WPRO-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `WPRO-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur`, `WPRO-D__Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WPRO-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur`, 
`WPRO-D__Case=Dat\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WPRO-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Prs`, `WPRO-D__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-D__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `WPRO-D__Case=Nom\|Number=Plur\|Person=1\|PronType=Prs`, `WPRO-D__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-D__Mood=Ind\|Number=Plur\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `WPRO-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `WPRO-D__Mood=Sub\|Number=Plur\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-D__NumType=Card`, `WPRO-G`, `WPRO-G__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `WPRO-G__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Int`, `WPRO-G__Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing`, `WPRO-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur`, `WPRO-G__Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WPRO-G__Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WPRO-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WPRO-G__Case=Gen\|Gender=Masc\|Number=Plur\|PronType=Int`, `WPRO-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO-G__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-G__Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing`, `WPRO-G__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO-G__Mood=Imp\|Number=Sing\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-G__Mood=Ind\|Number=Plur\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-G__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-N`, `WPRO-N-1__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-N-3`, `WPRO-N__Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing`, `WPRO-N__Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `WPRO-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur`, `WPRO-N__Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WPRO-N__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WPRO-N__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Ind`, `WPRO-N__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Int`, `WPRO-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Prs`, `WPRO-N__Case=Acc\|Gender=Neut\|Number=Plur\|PronType=Dem`, `WPRO-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Dat\|Gender=Masc\|Number=Plur\|PronType=Ind`, `WPRO-N__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Dat\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WPRO-N__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO-N__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Gen\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing`, `WPRO-N__Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur`, 
`WPRO-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WPRO-N__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WPRO-N__Case=Nom\|Definite=Ind\|Number=Sing`, `WPRO-N__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Int`, `WPRO-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Dem`, `WPRO-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Ind`, `WPRO-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Int`, `WPRO-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WPRO-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Int`, `WPRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Prs`, `WPRO-N__Foreign=Yes`, `WPRO-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `WPRO-N__Mood=Ind\|Number=Sing\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WPRO-N__VerbForm=Sup\|Voice=Act`, `WPRO__Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WPRO__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WPRO__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WPRO__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WPRO__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `WPRO__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WPRO__Mood=Ind\|Number=Sing\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `WQ`, `WQ-A`, `WQ-A__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WQ-A__Case=Acc\|Gender=Masc\|Number=Plur\|PronType=Dem`, `WQ-A__Case=Acc\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WQ-A__Case=Acc\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WQ-A__Case=Nom\|Gender=Fem\|Number=Plur\|PronType=Int`, `WQ-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur`, `WQ-D__Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing`, `WQ-D__Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WQ-D__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WQ-D__Case=Dat\|Gender=Neut\|Number=Sing\|PronType=Int`, `WQ-D__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `WQ-G__Case=Gen\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WQ-G__Case=Gen\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WQ-N`, `WQ-N__Case=Acc\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WQ-N__Case=Dat\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WQ-N__Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing`, `WQ-N__Case=Nom\|Gender=Fem\|Number=Sing\|PronType=Ind`, `WQ-N__Case=Nom\|Gender=Masc\|Number=Plur\|PronType=Int`, `WQ-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Ind`, `WQ-N__Case=Nom\|Gender=Masc\|Number=Sing\|PronType=Int`, `WQ-N__Case=Nom\|Gender=Neut\|Number=Plur\|PronType=Ind`, `WQ-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WQ-N__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Int`, `WQ__Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing`, `WQ__Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing`, `WQ__Case=Nom\|Gender=Neut\|Number=Sing\|PronType=Ind`, `WQ__Case=Nom\|Gender=Neut\|Number=Sing\|VerbForm=Part\|Voice=Act`, `WQ__Mood=Ind\|Number=Sing\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act` |
| **`morphologizer`** | `POS=CCONJ`, `POS=ADP`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=NOUN`, `POS=ADV`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Prs`, `POS=VERB\|VerbForm=Inf\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Prs`, `POS=SCONJ`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADV`, `POS=PUNCT`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=DET`, `Foreign=Yes\|POS=X`, `Case=Dat\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Degree=Cmp\|POS=ADV`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `POS=VERB\|VerbForm=Inf\|Voice=Mid`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, 
`Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=NOUN`, `POS=NOUN`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PROPN`, `POS=PRON`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `POS=DET`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NOUN`, `POS=VERB\|VerbForm=Sup\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=NOUN`, `POS=AUX\|VerbForm=Inf\|Voice=Act`, `POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Dem`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `POS=PART`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Number=Sing\|POS=PROPN`, `Case=Nom\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=AUX`, `Case=Acc\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=NOUN`, `NumType=Card\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Ind`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `POS=AUX\|VerbForm=Sup\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Ind`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Prs`, 
`Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=DET`, `POS=VERB`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `NumType=Card\|POS=NUM`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PROPN`, `POS=X`, `Case=Gen\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=AUX\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Gen\|Definite=Ind\|Number=Sing\|POS=X`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Prs`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Mood=Imp\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=VERB`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, 
`Case=Dat\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `POS=VERB\|VerbForm=Sup\|Voice=Mid`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=DET`, `NumType=Ord\|POS=NUM`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Gen\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Number=Sing\|POS=NOUN`, `Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Prs`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=DET`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PROPN`, 
`Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Degree=Sup\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Prs`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Nom\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Mood=Imp\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Degree=Cmp\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=VERB\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=DET\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Nom\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Foreign=Yes\|POS=PROPN`, `Foreign=Yes\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADP`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=X`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Foreign=Yes\|POS=NOUN`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Dem`, `NumType=Card\|POS=ADJ`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Ind`, `POS=INTJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Int`, 
`Mood=Sub\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Dem`, `Mood=Ind\|Number=Sing\|POS=NOUN\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Prs`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Prs`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Int`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Mid`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Ind`, `Foreign=Yes\|POS=PRON`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Dat\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=NUM`, `Mood=Sub\|Number=Sing\|POS=ADJ\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADV`, 
`Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=NOUN\|VerbForm=Part\|Voice=Act`, `Mood=Imp\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Dem`, `Case=Acc\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=X`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Dem`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Number=Plur\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `POS=AUX`, `Mood=Sub\|Number=Sing\|POS=NOUN\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Number=Sing\|POS=PRON\|Person=1\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADV`, `POS=NOUN\|VerbForm=Inf\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Gen\|Number=Sing\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Mood=Ind\|Number=Plur\|POS=ADV\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `POS=PRON\|VerbForm=Inf\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=NUM`, 
`Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Int`, `Degree=Cmp\|POS=ADJ`, `Mood=Ind\|Number=Sing\|POS=PROPN\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=ADV\|PronType=Dem`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=DET`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=VERB`, `POS=NUM`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PROPN\|PronType=Prs`, `Case=Gen\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Ind\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=PRON\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=X`, `Case=Dat\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=PART`, `Mood=Ind\|Number=Plur\|POS=ADJ\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=SCONJ\|PronType=Int`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Ind`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=X\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Foreign=Yes\|POS=VERB`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Mood=Ind\|Number=Sing\|POS=DET\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Foreign=Yes\|POS=NUM`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PROPN`, `POS=NOUN\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET`, `Degree=Sup\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Mood=Ind\|Number=Sing\|POS=X\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=SCONJ`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PRON`, `Mood=Imp\|Number=Sing\|POS=PRON\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Acc\|Number=Plur\|POS=PRON\|Person=2\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADV\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Ind`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=X`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADP`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=CCONJ\|PronType=Ind`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, 
`Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADV`, `POS=PROPN`, `POS=VERB\|Tense=Past\|VerbForm=Part`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADP`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Nom\|Number=Plur\|POS=ADP\|Person=1\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=SCONJ\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=DET`, `Foreign=Yes\|POS=AUX`, `Mood=Ind\|Number=Sing\|POS=NOUN\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=VERB`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Gen\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=NUM`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=DET\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=AUX`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=SCONJ`, `Mood=Ind\|Number=Sing\|POS=ADP\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=AUX\|PronType=Dem`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, 
`Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Nom\|POS=PRON`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=ADP\|PronType=Dem`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=NOUN\|PronType=Ind`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Int`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=AUX`, `Mood=Ind\|Number=Plur\|POS=ADV\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Mood=Ind\|Number=Plur\|POS=PRON\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `POS=ADJ\|VerbForm=Inf\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=CCONJ`, `Mood=Sub\|Number=Sing\|POS=NOUN\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Foreign=Yes\|POS=DET`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=DET`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=X`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Mood=Imp\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Dem`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=NOUN\|PronType=Prs`, 
`Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=AUX`, `Mood=Sub\|Number=Sing\|POS=DET\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Int`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Gen\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Dat\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADP`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=NOUN\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Dem`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADP\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Ind`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Mood=Sub\|Number=Sing\|POS=PROPN\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=NUM\|PronType=Ind`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, 
`Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=VERB\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=SCONJ\|PronType=Prs`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=NOUN\|VerbForm=Part\|Voice=Act`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Plur\|POS=ADJ\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=DET`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|POS=PRON`, `Case=Gen\|Degree=Pos\|POS=ADJ`, `Case=Acc\|POS=NUM`, `Case=Acc\|POS=DET`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=VERB\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=X`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=CCONJ`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=VERB`, `Mood=Ind\|Number=Sing\|POS=ADP\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PROPN`, 
`Case=Nom\|Definite=Ind\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Number=Plur\|POS=NOUN`, `Case=Gen\|POS=DET`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=X`, `Case=Acc\|Definite=Ind\|Number=Sing\|POS=PROPN`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=NUM`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `POS=DET\|VerbForm=Inf\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=VERB`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Number=Sing\|POS=NOUN\|Person=1\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=AUX\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=AUX`, `Mood=Sub\|Number=Sing\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Ind`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Mood=Ind\|Number=Plur\|POS=PRON\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=DET`, `Degree=Sup\|POS=DET`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Degree=Sup\|POS=VERB`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=NUM\|PronType=Ind`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NUM`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=AUX`, `Case=Dat\|Degree=Pos\|POS=ADJ`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Dat\|Number=Sing\|POS=NOUN\|Person=1\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, 
`Case=Acc\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Degree=Cmp\|POS=ADP`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NUM\|PronType=Ind`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|PronType=Dem`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PROPN\|PronType=Dem`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Ind`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Dem`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Number=Sing\|POS=ADV\|Person=2\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NUM`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PRON`, `POS=DET\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=ADV\|PronType=Dem`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=AUX`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Mood=Ind\|Number=Plur\|POS=PRON\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `POS=NOUN\|VerbForm=Sup\|Voice=Act`, `Mood=Ind\|Number=Plur\|POS=X\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Number=Sing\|POS=X`, `Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADV`, 
`Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=AUX`, `Mood=Sub\|Number=Plur\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=PRON`, `POS=ADV\|VerbForm=Sup\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Int`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=PRON\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=SCONJ\|PronType=Prs`, `Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Int`, `Mood=Sub\|Number=Plur\|POS=NOUN\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Int`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Case=Dat\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=X`, `Mood=Imp\|Number=Plur\|POS=ADV\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Plur\|POS=NOUN\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=CCONJ`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=X`, 
`Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=CCONJ\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=AUX\|VerbForm=Part\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=SCONJ\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Degree=Cmp\|POS=NOUN`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Mood=Imp\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Mood=Imp\|Number=Sing\|POS=ADP\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=NOUN\|PronType=Ind`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Mood=Imp\|Number=Plur\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Dat\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=ADP\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PROPN\|PronType=Dem`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=NOUN\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=DET`, `Mood=Imp\|POS=AUX\|VerbForm=Inf`, `Mood=Ind\|POS=AUX\|Tense=Pres`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=DET`, 
`Mood=Sub\|Number=Sing\|POS=ADV\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `NumType=Card\|POS=X`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=AUX\|VerbForm=Part\|Voice=Act`, `Mood=Sub\|Number=Sing\|POS=ADP\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=X`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=AUX\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADJ\|VerbForm=Part\|Voice=Mid`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|PronType=Ind`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Ind`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|PronType=Ind`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADP\|PronType=Ind`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=VERB`, `POS=AUX\|VerbForm=Part\|Voice=Act`, `POS=PROPN\|VerbForm=Inf\|Voice=Act`, `POS=ADV\|VerbForm=Inf\|Voice=Mid`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=NUM`, `POS=ADV\|VerbForm=Inf\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=CCONJ`, 
`Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=AUX\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=SCONJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=NOUN\|PronType=Ind`, `Case=Nom\|Number=Sing\|POS=PROPN\|Person=2\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=INTJ\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=CCONJ\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=DET`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=NOUN\|PronType=Prs`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Ind`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADV\|VerbForm=Part\|Voice=Mid`, `Case=Gen\|Definite=Ind\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PROPN\|PronType=Dem`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=X`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=X`, `Case=Dat\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADP`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Acc\|Definite=Ind\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=CCONJ`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=VERB`, `NumType=Ord\|POS=NOUN`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADJ\|PronType=Ind`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, 
`Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Mood=Sub\|Number=Sing\|POS=ADJ\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `NumType=Card\|POS=NOUN`, `Case=Nom\|Number=Plur\|POS=NOUN\|Person=1\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADP`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=AUX`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=ADV`, `NumType=Card\|POS=VERB`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=VERB\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=DET`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Mood=Ind\|Number=Sing\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Mood=Imp\|Number=Sing\|POS=DET\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `POS=PRON\|VerbForm=Sup\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADP`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|PronType=Ind`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=PRON\|PronType=Int`, `Mood=Sub\|Number=Plur\|POS=DET\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=ADV\|PronType=Ind`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=X`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=X\|PronType=Ind`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Nom\|Number=Sing\|POS=PROPN\|Person=1\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Int`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Dat\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Prs`, 
`Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET\|PronType=Prs`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=CCONJ`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=X`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADV`, `Mood=Ind\|Number=Plur\|POS=AUX\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=ADP\|PronType=Ind`, `Mood=Sub\|Number=Sing\|POS=NOUN\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Number=Sing\|POS=NOUN\|Person=2\|PronType=Prs`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Foreign=Yes\|POS=ADJ`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PUNCT`, `Case=Nom\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=DET`, `POS=ADJ\|VerbForm=Sup\|Voice=Act`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=CCONJ`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=VERB`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=PRON\|PronType=Dem`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADP`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=ADV`, `NumType=Frac\|POS=NOUN`, `Mood=Sub\|Number=Sing\|POS=X\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=AUX`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=AUX\|PronType=Ind`, `POS=NOUN\|VerbForm=Inf\|Voice=Mid`, `Mood=Sub\|Number=Plur\|POS=NOUN\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=ADJ\|PronType=Prs`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `NumType=Card\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=CCONJ`, `Mood=Ind\|Number=Plur\|POS=ADV\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Mood=Ind\|Number=Sing\|POS=NOUN\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, 
`Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADJ\|PronType=Dem`, `Mood=Ind\|Number=Sing\|POS=ADP\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|POS=VERB\|Tense=Pres`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=DET\|PronType=Int`, `Case=Dat\|POS=DET`, `Degree=Pos\|POS=ADV`, `Mood=Sub\|Number=Plur\|POS=NOUN\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=PUNCT`, `Case=Nom\|Number=Plur\|POS=PROPN\|Person=2\|PronType=Prs`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Ind`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=ADP\|PronType=Dem`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADP\|PronType=Dem`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=PRON`, `POS=X\|VerbForm=Inf\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADP`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PROPN\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=NUM\|PronType=Ind`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Dat\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=NOUN`, 
`Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Mood=Ind\|Number=Plur\|POS=ADJ\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Foreign=Yes\|POS=ADP`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Int`, `POS=AUX\|VerbForm=Inf\|Voice=Mid`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=NUM\|PronType=Ind`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=PROPN\|PronType=Ind`, `Mood=Ind\|Number=Plur\|POS=DET\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=NOUN\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Ind`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=AUX\|PronType=Prs`, `Case=Dat\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=PART`, `POS=PUNCT\|VerbForm=Sup\|Voice=Mid`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=CCONJ\|PronType=Ind`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=ADV\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Case=Gen\|Number=Plur\|POS=DET\|Person=1\|PronType=Prs`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=NUM`, `Mood=Ind\|Number=Sing\|POS=PRON\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `POS=VERB\|VerbForm=Inf`, `Case=Acc\|Degree=Pos\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Nom\|POS=NUM`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|PronType=Int`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `POS=ADJ\|VerbForm=Sup\|Voice=Mid`, `Mood=Ind\|Number=Plur\|POS=NOUN\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=PROPN`, 
`Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Mood=Sub\|Number=Sing\|POS=DET\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Mood=Ind\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PROPN\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADP`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=NUM`, `Mood=Imp\|Number=Plur\|POS=PUNCT\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=ADV\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Ind\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Degree=Sup\|POS=ADP`, `Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Dat\|Number=Sing\|POS=NOUN\|Person=2\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=AUX`, `Case=Dat\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=PROPN`, `NumType=Frac\|POS=ADJ`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=DET\|PronType=Int`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Prs`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=NUM\|PronType=Ind`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=AUX\|PronType=Prs`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=ADP\|PronType=Ind`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, 
`Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=ADP\|PronType=Dem`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADP`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Ind`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Acc\|Definite=Ind\|Number=Sing\|POS=X`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=X`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=DET`, `Mood=Imp\|Number=Sing\|POS=ADJ\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=PRON\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=CCONJ`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=CCONJ\|PronType=Ind`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Mood=Ind\|Number=Plur\|POS=ADJ\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=PRON`, `Mood=Ind\|Number=Sing\|POS=SCONJ\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Ind`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=AUX\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=SCONJ\|PronType=Int`, `Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Mood=Sub\|Number=Plur\|POS=DET\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Number=Sing\|POS=PROPN\|Person=1\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=PRON\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|PronType=Int`, `Mood=Ind\|Number=Sing\|POS=AUX\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=VERB\|PronType=Dem`, 
`Mood=Sub\|Number=Plur\|POS=ADJ\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|POS=PRON`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Mood=Sub\|Number=Sing\|POS=PROPN\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=PRON\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=PRON\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Mood=Ind\|Number=Sing\|POS=NUM\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|POS=AUX\|Tense=Past`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADP`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PART\|PronType=Ind`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Ind`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=VERB`, `POS=DET\|VerbForm=Sup\|Voice=Act`, `Degree=Cmp\|POS=VERB`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=X`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=VERB\|PronType=Dem`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADJ`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Acc\|Definite=Ind\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADV\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Dat\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Int`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Acc\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=DET\|VerbForm=Part\|Voice=Mid`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|PronType=Prs`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, 
`Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Dat\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=X`, `Mood=Sub\|Number=Sing\|POS=PROPN\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Mood=Ind\|Number=Plur\|POS=ADJ\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Gen\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADP`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=PROPN\|PronType=Ind`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=CCONJ\|PronType=Ind`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=ADV`, `POS=ADP\|VerbForm=Sup\|Voice=Act`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=PROPN\|PronType=Prs`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=ADV\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Dat\|POS=VERB\|Tense=Past\|VerbForm=Part`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=NOUN\|PronType=Ind`, `Mood=Ind\|Number=Plur\|POS=NOUN\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Int`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=NOUN`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Number=Sing\|POS=ADV`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=VERB\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET`, `POS=INTJ\|VerbForm=Sup\|Voice=Act`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=NOUN\|PronType=Prs`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Mood=Ind\|Number=Sing\|POS=NOUN\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Mood=Imp\|Number=Sing\|POS=NOUN\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `POS=ADV\|VerbForm=Sup\|Voice=Mid`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NUM`, 
`Case=Nom\|Definite=Ind\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=VERB`, `Mood=Ind\|Number=Sing\|POS=PRON\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=PROPN\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Sub\|Number=Sing\|POS=ADV\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=ADP`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=PROPN\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NUM`, `NumType=Card\|POS=PRON`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=ADV\|PronType=Prs`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=ADV\|VerbForm=Part\|Voice=Act`, `POS=ADJ\|VerbForm=Part\|Voice=Act`, `Mood=Imp\|POS=VERB\|VerbForm=Inf`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Nom\|Degree=Pos\|POS=ADJ`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=NOUN`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=ADJ`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=ADP\|PronType=Prs`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NOUN\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Ind`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADP`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADV\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Number=Sing\|POS=PRON`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=CCONJ\|VerbForm=Part\|Voice=Act`, `Degree=Pos\|POS=ADJ`, `Case=Acc\|POS=VERB\|Tense=Pres\|VerbForm=Part`, `Mood=Ind\|POS=VERB\|Tense=Past`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PROPN\|PronType=Prs`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Dem`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=CCONJ`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PROPN\|PronType=Dem`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=PRON\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADP\|PronType=Prs`, `Case=Gen\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|PronType=Dem`, `Case=Acc\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `NumType=Ord\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=X`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=NOUN\|PronType=Dem`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=NOUN\|VerbForm=Part\|Voice=Act`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, 
`Case=Gen\|Gender=Fem\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADP`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=DET`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Gen\|Gender=Fem\|Number=Plur\|POS=NOUN\|PronType=Prs`, `Case=Acc\|Gender=Masc\|Number=Sing\|POS=VERB\|PronType=Dem`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Gen\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=VERB`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=ADV\|PronType=Ind`, `Mood=Sub\|Number=Sing\|POS=ADV\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `NumType=Card\|POS=PUNCT`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=X`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=SCONJ\|PronType=Int`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=PROPN\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=NOUN\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADV\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Foreign=Yes\|POS=INTJ`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=VERB\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=X`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=ADV`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=ADJ`, `Mood=Ind\|Number=Plur\|POS=X\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PRON`, `Mood=Ind\|Number=Plur\|POS=PROPN\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=NOUN\|PronType=Int`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=ADJ\|PronType=Prs`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=DET`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Dat\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Prs`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=CCONJ`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NUM`, 
`Case=Nom\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=PRON`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=SCONJ\|PronType=Int`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADV\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=SCONJ\|PronType=Ind`, `Case=Dat\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Dat\|Number=Sing\|POS=DET\|Person=2\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=PRON`, `POS=ADP\|VerbForm=Inf\|Voice=Act`, `Case=Gen\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=ADJ`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NUM`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=ADV\|PronType=Prs`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=DET\|PronType=Prs`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=ADP\|PronType=Prs`, `Mood=Ind\|Number=Sing\|POS=ADV\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=VERB`, `Mood=Sub\|Number=Sing\|POS=VERB\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Mood=Ind\|Number=Plur\|POS=ADP\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=SCONJ\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=AUX\|PronType=Prs`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=VERB\|PronType=Ind`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Int`, `Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Mood=Sub\|Number=Sing\|POS=X\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=AUX`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=CCONJ\|PronType=Ind`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=PROPN\|PronType=Prs`, `Case=Nom\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=ADV`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Int`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Degree=Cmp\|POS=PRON`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Gen\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=SCONJ`, `Case=Nom\|Number=Plur\|POS=NOUN\|Person=2\|PronType=Prs`, `Mood=Sub\|Number=Plur\|POS=NUM\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=CCONJ\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, 
`Case=Acc\|Gender=Fem\|Number=Plur\|POS=ADJ\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Acc\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=X`, `Case=Nom\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=X`, `Mood=Sub\|Number=Plur\|POS=PRON\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=VERB`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=NOUN\|PronType=Dem`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=NOUN\|PronType=Prs`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=AUX`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Gen\|Number=Sing\|POS=PUNCT\|Person=2\|PronType=Prs`, `Case=Nom\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Dat\|Definite=Ind\|Number=Sing\|POS=NUM`, `Degree=Sup\|POS=X`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=INTJ`, `Case=Gen\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=AUX`, `Mood=Imp\|Number=Plur\|POS=NOUN\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Plur\|POS=CCONJ`, `Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Gen\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Gen\|Number=Sing\|POS=PUNCT\|Person=1\|PronType=Prs`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Int`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Gen\|Gender=Fem\|Number=Sing\|POS=DET\|PronType=Int`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=PRON\|VerbForm=Part\|Voice=Act`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Case=Gen\|Gender=Fem\|NumType=Card\|Number=Plur\|POS=ADJ`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=X`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NUM`, `Case=Gen\|Definite=Ind\|Number=Plur\|POS=NOUN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=X`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=VERB\|PronType=Ind`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=NUM`, `Case=Nom\|Gender=Masc\|Number=Sing\|POS=ADV\|PronType=Prs`, `Case=Acc\|Definite=Ind\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=CCONJ`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=VERB`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Dat\|Definite=Def\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Nom\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=ADP`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADP`, 
`Case=Nom\|Definite=Ind\|Number=Sing\|POS=INTJ`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=ADV`, `Mood=Sub\|Number=Sing\|POS=PRON\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Mood=Sub\|Number=Sing\|POS=DET\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=ADV\|PronType=Dem`, `Case=Dat\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=NUM`, `Case=Nom\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Gen\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=X`, `Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=DET\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=ADP`, `Case=Nom\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PRON`, `Mood=Ind\|Number=Plur\|POS=DET\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=NUM\|PronType=Int`, `Case=Gen\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Mood=Sub\|Number=Sing\|POS=ADJ\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `NumType=Frac\|POS=PUNCT`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Dat\|Gender=Masc\|Number=Sing\|POS=ADV\|PronType=Dem`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=NOUN`, `POS=DET\|VerbForm=Inf\|Voice=Mid`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=DET`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=ADJ`, `Case=Dat\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=X`, `Mood=Ind\|Number=Sing\|POS=PRON\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `POS=ADJ\|VerbForm=Inf\|Voice=Mid`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=PART`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Dem`, `Mood=Ind\|Number=Sing\|POS=PRON\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=VERB`, `Case=Gen\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Plur\|POS=SCONJ`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=ADJ\|PronType=Ind`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=VERB\|PronType=Dem`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=X`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADV`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=NOUN\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Plur\|POS=X\|PronType=Prs`, `Case=Nom\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=PROPN`, `Case=Dat\|Gender=Neut\|Number=Plur\|POS=VERB\|PronType=Prs`, `Case=Nom\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=AUX`, `Case=Gen\|Definite=Ind\|Gender=Masc\|Number=Sing\|POS=ADP`, `Case=Dat\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET`, 
`Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=SCONJ`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=ADV\|PronType=Prs`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=NOUN`, `Mood=Imp\|Number=Sing\|POS=DET\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=SCONJ\|PronType=Int`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=ADV\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=ADJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADV`, `Case=Gen\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=DET`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=NOUN`, `Mood=Ind\|POS=ADV\|Tense=Past`, `Case=Acc\|Definite=Ind\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=ADV\|PronType=Ind`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=DET`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=NOUN\|PronType=Prs`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Mood=Ind\|Number=Plur\|POS=DET\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `POS=PROPN\|VerbForm=Sup\|Voice=Act`, `Case=Dat\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=SCONJ`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=DET`, `Mood=Ind\|Number=Sing\|POS=PROPN\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=AUX`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADV\|PronType=Dem`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=SCONJ\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADP`, `Case=Nom\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=PRON`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Mood=Imp\|Number=Sing\|POS=SCONJ\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Gender=Fem\|Number=Plur\|POS=ADV\|PronType=Ind`, `Mood=Sub\|Number=Plur\|POS=ADV\|Person=3\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Plur\|POS=AUX\|PronType=Dem`, `Case=Dat\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Gender=Neut\|Number=Plur\|POS=ADV`, `Mood=Sub\|Number=Plur\|POS=ADV\|Person=3\|Tense=Past\|VerbForm=Fin\|Voice=Mid`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=NUM`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=DET`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=ADV\|PronType=Ind`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=NOUN\|PronType=Prs`, 
`Case=Acc\|Gender=Masc\|Number=Sing\|POS=NOUN\|PronType=Dem`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Sing\|POS=PRON`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADV\|PronType=Ind`, `Mood=Sub\|Number=Plur\|POS=VERB\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Dat\|Definite=Def\|Gender=Fem\|Number=Plur\|POS=VERB`, `Case=Gen\|Definite=Ind\|Gender=Fem\|Number=Sing\|POS=ADP`, `Case=Gen\|Gender=Neut\|Number=Plur\|POS=NOUN\|PronType=Ind`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Nom\|Definite=Ind\|Number=Sing\|POS=ADP`, `Case=Gen\|Gender=Neut\|Number=Sing\|POS=ADJ\|PronType=Prs`, `Case=Gen\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Dat\|Gender=Neut\|Number=Sing\|POS=NOUN\|PronType=Ind`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Gen\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=VERB`, `Case=Nom\|Gender=Masc\|NumType=Card\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Masc\|Number=Plur\|POS=ADJ\|PronType=Dem`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=NOUN\|PronType=Ind`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Neut\|Number=Plur\|POS=ADV`, `Case=Acc\|Gender=Neut\|NumType=Card\|Number=Plur\|POS=ADV`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=SCONJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=AUX\|PronType=Dem`, `Case=Dat\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=ADJ`, `Case=Gen\|POS=NUM`, `NumType=Card\|POS=ADV`, `Case=Acc\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADP`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=VERB\|PronType=Ind`, `Case=Acc\|Definite=Def\|Degree=Cmp\|Gender=Masc\|Number=Sing\|POS=AUX`, `Case=Acc\|Definite=Ind\|Number=Plur\|POS=NOUN`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=VERB`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=PROPN`, `Case=Acc\|Definite=Def\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=NOUN`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=SCONJ\|PronType=Ind`, `Case=Acc\|Definite=Def\|Degree=Sup\|Gender=Masc\|Number=Plur\|POS=PROPN`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=NOUN\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Neut\|NumType=Card\|Number=Sing\|POS=NUM`, `Case=Nom\|Definite=Def\|Degree=Cmp\|Gender=Neut\|Number=Sing\|POS=PROPN`, `Case=Dat\|POS=NUM`, `Case=Nom\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=ADP`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Fem\|Number=Plur\|POS=ADJ`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=ADJ\|VerbForm=Part\|Voice=Act`, `Case=Dat\|Degree=Pos\|Gender=Masc\|Number=Sing\|POS=NOUN`, `Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=SCONJ`, `Foreign=Yes\|POS=CCONJ`, `Case=Nom\|Gender=Neut\|Number=Sing\|POS=AUX\|PronType=Ind`, `Mood=Ind\|Number=Sing\|POS=ADJ\|Person=2\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Case=Acc\|Definite=Def\|Gender=Neut\|Number=Plur\|POS=DET`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=ADP`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=VERB\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=PROPN\|PronType=Prs`, `Case=Acc\|Gender=Neut\|Number=Plur\|POS=ADP\|PronType=Ind`, `Case=Acc\|Gender=Neut\|Number=Sing\|POS=VERB\|PronType=Int`, `Case=Dat\|Gender=Fem\|Number=Plur\|POS=X\|PronType=Ind`, `Case=Acc\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=X`, `Case=Nom\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=PROPN`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=AUX`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=VERB\|PronType=Dem`, 
`Case=Dat\|Degree=Pos\|Gender=Neut\|Number=Plur\|POS=PRON`, `Case=Nom\|Gender=Neut\|Number=Plur\|POS=ADV\|PronType=Ind`, `Case=Nom\|Gender=Masc\|Number=Plur\|POS=NOUN\|PronType=Ind`, `Mood=Sub\|Number=Sing\|POS=ADV\|Person=1\|Tense=Past\|VerbForm=Fin\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=DET\|Person=1\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Nom\|Gender=Fem\|Number=Sing\|POS=PROPN\|VerbForm=Part\|Voice=Act`, `Mood=Ind\|Number=Sing\|POS=ADP\|Person=2\|Tense=Pres\|VerbForm=Fin\|Voice=Act`, `Case=Gen\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=ADJ`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=PRON\|VerbForm=Part\|Voice=Act`, `Case=Nom\|Gender=Fem\|NumType=Card\|Number=Sing\|POS=ADV`, `Case=Dat\|Definite=Ind\|Degree=Pos\|Gender=Fem\|Number=Plur\|POS=ADP`, `Case=Acc\|Definite=Ind\|Degree=Sup\|Gender=Masc\|Number=Sing\|POS=ADV`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Neut\|Number=Sing\|POS=NUM`, `Case=Gen\|Definite=Ind\|Degree=Pos\|Gender=Masc\|Number=Plur\|POS=PRON`, `Case=Acc\|Definite=Def\|Gender=Masc\|Number=Sing\|POS=PRON`, `Case=Dat\|Gender=Masc\|Number=Plur\|POS=AUX\|PronType=Prs`, `Mood=Sub\|POS=AUX\|Tense=Past`, `Case=Acc\|Gender=Fem\|Number=Sing\|POS=ADJ\|PronType=Ind`, `Case=Gen\|Definite=Def\|Degree=Cmp\|Gender=Fem\|Number=Sing\|POS=PRON`, `Case=Acc\|Definite=Def\|Gender=Fem\|Number=Sing\|POS=X`, `Case=Dat\|Definite=Def\|Degree=Pos\|Gender=Fem\|Number=Sing\|POS=VERB` |
</details>
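The strings above are the raw `POS` + morphological feature bundles in the pipeline's label inventory. As a minimal sketch of inspecting them on a loaded pipeline — the package name below is a placeholder, not this model's actual package name:

```python
import spacy

# Placeholder name; install and load the actual spaCy package for this pipeline.
nlp = spacy.load("xx_pipeline_placeholder")

# The morphologizer's labels correspond to the feature-bundle strings listed above.
morphologizer = nlp.get_pipe("morphologizer")
print(len(morphologizer.labels))  # size of the label inventory
print(morphologizer.labels[:3])   # first few feature-bundle labels, as listed above
```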
### Accuracy
| Type | Score |
| --- | --- |
| `DEP_UAS` | 82.89 |
| `DEP_LAS` | 77.71 |
| `SENTS_P` | 96.97 |
| `SENTS_R` | 98.50 |
| `SENTS_F` | 97.73 |
| `LEMMA_ACC` | 94.86 |
| `TAG_ACC` | 84.84 |
| `POS_ACC` | 96.29 |
| `MORPH_ACC` | 90.12 |
| `TRANSFORMER_LOSS` | 2803740.93 |
| `PARSER_LOSS` | 534940.15 |
| `TRAINABLE_LEMMATIZER_LOSS` | 294717.33 |
| `TAGGER_LOSS` | 890478.23 |
| `MORPHOLOGIZER_LOSS` | 426176.75 | |
CocoyGames9/JBrown | CocoyGames9 | 2023-12-23T03:50:37Z | 0 | 0 | null | [
"license:other",
"region:us"
] | null | 2023-12-23T03:47:15Z | ---
license: other
license_name: icescream4
license_link: LICENSE
---
|
AmitMidday/Brain-Tumor-detection | AmitMidday | 2023-12-23T03:45:40Z | 0 | 0 | keras | [
"keras",
"biology",
"image-classification",
"en",
"license:apache-2.0",
"region:us"
] | image-classification | 2023-12-23T03:37:40Z | ---
license: apache-2.0
library_name: keras
pipeline_tag: image-classification
tags:
- biology
language:
- en
metrics:
- accuracy
--- |
liyoo/IntegratedModel_PairClassification | liyoo | 2023-12-23T03:33:19Z | 0 | 0 | null | [
"code",
"text-classification",
"zh",
"region:us"
] | text-classification | 2023-12-23T03:30:08Z | ---
language:
- zh
pipeline_tag: text-classification
tags:
- code
--- |
pingstudio07/distilbert-base-uncased-finetuned-cola | pingstudio07 | 2023-12-23T03:16:39Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-12-23T01:37:58Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-cola
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7613
- Matthews Correlation: 0.5218
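The Matthews correlation reported above is the standard binary MCC (1.0 = perfect agreement, 0.0 = chance level). As an illustration only — the card's actual evaluation code is not shown — it can be computed from label/prediction pairs with scikit-learn:

```python
from sklearn.metrics import matthews_corrcoef

# Toy labels/predictions for illustration; not the actual evaluation set.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]
print(matthews_corrcoef(y_true, y_pred))  # ~0.71 on this toy example
```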
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
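These values roughly map onto `transformers.TrainingArguments`. A minimal sketch of that configuration, assuming the standard `Trainer` setup (`output_dir` is a placeholder and the `Trainer` call itself is not reproduced in this card):

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-cola",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```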
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5174 | 1.0 | 535 | 0.4562 | 0.4364 |
| 0.3426 | 2.0 | 1070 | 0.4706 | 0.5147 |
| 0.2363 | 3.0 | 1605 | 0.6783 | 0.5016 |
| 0.1593 | 4.0 | 2140 | 0.7613 | 0.5218 |
| 0.1249 | 5.0 | 2675 | 0.8566 | 0.5139 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
|
dapa93/q-taxi-v3 | dapa93 | 2023-12-23T03:16:26Z | 0 | 0 | null | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | reinforcement-learning | 2023-12-23T03:16:23Z | ---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gymnasium as gym  # assumed; use `import gym` instead if the model was trained with classic Gym

# `load_from_hub` is not defined here; a self-contained sketch of it follows below.
model = load_from_hub(repo_id="dapa93/q-taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
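As a minimal, self-contained sketch — assuming the pickled dict exposes `env_id` and `qtable` keys, as in the common Q-learning template, and that the environment is a Gymnasium env — you could load the Q-table and roll out the greedy policy like this:

```python
import pickle

import gymnasium as gym
import numpy as np
from huggingface_hub import hf_hub_download


def load_from_hub(repo_id: str, filename: str) -> dict:
    # Download the pickled model dict from the Hub and deserialize it.
    path = hf_hub_download(repo_id=repo_id, filename=filename)
    with open(path, "rb") as f:
        return pickle.load(f)


model = load_from_hub(repo_id="dapa93/q-taxi-v3", filename="q-learning.pkl")
env = gym.make(model["env_id"])

state, _ = env.reset(seed=42)
done, total_reward = False, 0.0
while not done:
    # Greedy policy: take the action with the highest Q-value for the current state.
    action = int(np.argmax(model["qtable"][state]))
    state, reward, terminated, truncated, _ = env.step(action)
    total_reward += reward
    done = terminated or truncated
print(f"Episode return: {total_reward}")
```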
|