### GenericAgent-AgentTrek-1.0-32b
This agent is AgentLab's GenericAgent, driven by the AgentTrek-1.0-32B model.
- **Base Model:**
- Qwen/Qwen2.5-32B-Instruct (see the loading sketch at the end of this card)
- **Architecture:**
- Type: Causal Language Model
- Training Stage: Pretraining & Post-training
- Architecture: transformers with RoPE, SwiGLU, RMSNorm, and Attention QKV bias
- Number of Parameters: 32.5B
- Number of Parameters (Non-Embedding): 31.0B
- Number of Layers: 64
- Number of Attention Heads (GQA): 40 for Q and 8 for KV
- **Input/Output Format:**
- The agent's inputs and outputs are governed by the following AgentLab flags (a Python sketch for wiring them into the GenericAgent follows the flag dump):
```txt
flags=GenericPromptFlags(
obs=ObsFlags(
use_html=True,
use_ax_tree=True,
use_tabs=False,
use_focused_element=False,
use_error_logs=True,
use_history=True,
use_past_error_logs=False,
use_action_history=True,
use_think_history=False,
use_diff=False,
html_type='pruned_html',
use_screenshot=False,
use_som=False,
extract_visible_tag=False,
extract_clickable_tag=False,
extract_coords='False',
filter_visible_elements_only=False,
openai_vision_detail='auto',
filter_with_bid_only=False,
filter_som_only=False
),
action=ActionFlags(
action_set=HighLevelActionSetArgs(
subsets=('miniwob_all',),
multiaction=False,
strict=False,
retry_with_force=True,
demo_mode='off'
),
long_description=False,
individual_examples=False,
multi_actions=None,
is_strict=None
),
use_plan=False,
use_criticise=False,
use_thinking=True,
use_memory=True,
use_concrete_example=True,
use_abstract_example=True,
use_hints=False,
enable_chat=False,
max_prompt_tokens=40000,
be_cautious=True,
extra_instructions=None,
add_missparsed_messages=True,
max_trunc_itr=20,
flag_group=None
)
```
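
For reference, here is a minimal Python sketch (not part of the original card) of how a flag set like the one above can be constructed and attached to AgentLab's GenericAgent. The flag values mirror the dump above (fields left at `None` are omitted); the import paths, the `OpenAIModelArgs` backend wrapper, and the `model_name` string are assumptions about AgentLab's usual layout and may differ across versions.

```python
# Sketch only: import paths below follow AgentLab's typical module layout and may differ.
from agentlab.agents import dynamic_prompting as dp
from agentlab.agents.generic_agent.generic_agent import GenericAgentArgs
from agentlab.agents.generic_agent.generic_agent_prompt import GenericPromptFlags
from agentlab.llm.chat_api import OpenAIModelArgs  # assumed chat-backend wrapper
from browsergym.experiments.benchmark import HighLevelActionSetArgs  # import path may vary

flags = GenericPromptFlags(
    obs=dp.ObsFlags(
        use_html=True, use_ax_tree=True, use_tabs=False, use_focused_element=False,
        use_error_logs=True, use_history=True, use_past_error_logs=False,
        use_action_history=True, use_think_history=False, use_diff=False,
        html_type="pruned_html", use_screenshot=False, use_som=False,
        extract_visible_tag=False, extract_clickable_tag=False, extract_coords="False",
        filter_visible_elements_only=False, openai_vision_detail="auto",
        filter_with_bid_only=False, filter_som_only=False,
    ),
    action=dp.ActionFlags(
        action_set=HighLevelActionSetArgs(
            subsets=("miniwob_all",), multiaction=False, strict=False,
            retry_with_force=True, demo_mode="off",
        ),
        long_description=False, individual_examples=False,
    ),
    use_plan=False, use_criticise=False, use_thinking=True, use_memory=True,
    use_concrete_example=True, use_abstract_example=True, use_hints=False,
    enable_chat=False, max_prompt_tokens=40000, be_cautious=True,
    add_missparsed_messages=True, max_trunc_itr=20,
)

agent_args = GenericAgentArgs(
    chat_model_args=OpenAIModelArgs(  # hypothetical: checkpoint served behind an OpenAI-compatible endpoint
        model_name="AgentTrek-1.0-32b",
    ),
    flags=flags,
)
```

The resulting `agent_args` is what AgentLab's experiment loop consumes; the `miniwob_all` action subset suggests the reported results target MiniWoB tasks.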
- **Training Details:**
- Dataset used: [AgentTrek-6K](https://agenttrek.github.io)
- Number of training epochs: 3
- **Paper Link:**
- https://arxiv.org/abs/2412.09605
- **Code Repository:**
- https://agenttrek.github.io
- **License:**
- Apache 2.0
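
Below is a minimal loading sketch (also not from the original card) for the base model with Hugging Face Transformers. The fine-tuned AgentTrek-1.0-32b checkpoint is not identified by path in this card, so the base model ID above is used as a placeholder.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-32B-Instruct"  # placeholder: substitute the AgentTrek-1.0-32b checkpoint path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Toy prompt; the real agent prompt is assembled by AgentLab from the flags above.
messages = [{"role": "user", "content": "Click the 'Submit' button on the current page."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At 32.5B parameters the model generally needs multi-GPU or quantized loading; `device_map="auto"` (which requires the `accelerate` package) handles the placement automatically.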