# API Reference
This document describes the public Python modules and functions available in AnyCoder.
---
## `models.py`
### `ModelInfo` dataclass
```python
@dataclass
class ModelInfo:
    name: str
    id: str
    description: str
    default_provider: str = "auto"
```
### `AVAILABLE_MODELS: List[ModelInfo]`
A list of supported models with metadata.
### `find_model(identifier: str) -> Optional[ModelInfo]`
Look up a model by name or ID. Returns the matching `ModelInfo`, or `None` if no model matches.
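A minimal lookup sketch; the model identifier used here is illustrative and may not appear in `AVAILABLE_MODELS`:
```python
from models import AVAILABLE_MODELS, find_model

# Look up a model by display name or ID (the ID below is a placeholder).
model = find_model("deepseek-ai/DeepSeek-V3")
if model is not None:
    print(model.name, model.default_provider)
else:
    print("Unknown model; available IDs:", [m.id for m in AVAILABLE_MODELS])
```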
---
## `inference.py`
### `chat_completion(model_id: str, messages: List[Dict[str,str]], provider: Optional[str]=None, max_tokens: int=4096) -> str`
Send a one-shot chat completion request. Returns the assistant response as a string.
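A usage sketch; the model ID and prompt are placeholders, not values defined by AnyCoder:
```python
from inference import chat_completion

# One-shot request: full conversation goes in, a single string comes back.
messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
reply = chat_completion(
    model_id="Qwen/Qwen2.5-Coder-32B-Instruct",  # placeholder model ID
    messages=messages,
    max_tokens=1024,
)
print(reply)
```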
### `stream_chat_completion(model_id: str, messages: List[Dict[str,str]], provider: Optional[str]=None, max_tokens: int=4096) -> Generator[str, None, None]`
Stream partial generation results, yielding content chunks.
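A streaming sketch under the same assumptions (placeholder model ID), printing chunks as they arrive:
```python
from inference import stream_chat_completion

# Consume the generator incrementally; useful for progressive UI updates.
for chunk in stream_chat_completion(
    model_id="Qwen/Qwen2.5-Coder-32B-Instruct",  # placeholder model ID
    messages=[{"role": "user", "content": "Explain list comprehensions."}],
    max_tokens=512,
):
    print(chunk, end="", flush=True)
print()
```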
---
## `hf_client.py`
### `get_inference_client(model_id: str, provider: str="auto") -> InferenceClient`
Create and return a configured `InferenceClient`, routing to Groq, OpenAI, Gemini, Fireworks, or HF as needed.
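A short sketch, assuming the returned client exposes the `huggingface_hub.InferenceClient` chat API; the model ID is illustrative:
```python
from hf_client import get_inference_client

# "auto" lets the routing logic pick a backend; pass an explicit provider to force one.
client = get_inference_client("meta-llama/Llama-3.1-8B-Instruct", provider="auto")

# Assumes the client follows the huggingface_hub.InferenceClient chat interface.
response = client.chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```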
---
## `deploy.py`
### `send_to_sandbox(code: str) -> str`
Wrap HTML code in a sandboxed iframe via a data URI for live preview.
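A minimal preview sketch; the HTML snippet is arbitrary:
```python
from deploy import send_to_sandbox

html = "<!DOCTYPE html><html><body><h1>Hello, AnyCoder!</h1></body></html>"

# Returns an iframe snippet whose src is a data: URI, ready to embed in the UI.
iframe_html = send_to_sandbox(html)
print(iframe_html[:80], "...")
```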
### `load_project_from_url(url: str) -> Tuple[str, str]`
Import a Hugging Face Space by URL, returning a status message and the code content.
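A usage sketch; the Space URL is a placeholder:
```python
from deploy import load_project_from_url

# Substitute a real Hugging Face Space URL for the placeholder below.
status, code = load_project_from_url("https://huggingface.co/spaces/<user>/<space>")
print(status)
print(code[:200])
```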
---
## `plugins.py`
### `PluginManager`
* `discover()`: auto-discover plugins in the `plugins/` namespace.
* `list_plugins() -> List[str]`: return the registered plugin names.
* `run_plugin(name: str, payload: Dict) -> Any`: execute a plugin action.
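A usage sketch; the plugin name and payload keys are assumptions, since the payload schema depends on each plugin:
```python
from plugins import PluginManager

manager = PluginManager()
manager.discover()               # scan the plugins/ namespace package
print(manager.list_plugins())    # registered plugin names (contents depend on your install)

# Run a plugin by name; "formatter" and the payload keys below are hypothetical examples.
result = manager.run_plugin("formatter", {"code": "print( 1+1 )", "language": "python"})
print(result)
```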