---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
---

# Model card for kat_tiny_patch16_224.vitft

KAT model trained on ImageNet-1k (1 million images, 1,000 classes) at resolution 224x224. It was first introduced in the paper *Kolmogorov–Arnold Transformer* by Xingyi Yang and Xinchao Wang.

## Model description
KAT is a transformer model that replaces the channel mixer of standard transformer blocks with a Group Rational Kolmogorov–Arnold Network (GR-KAN).
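
The actual GR-KAN implementation lives in `katransformer.py` in the authors' repository (see Usage below). The snippet here is only a minimal sketch of the idea: a learnable rational function whose coefficients are shared by groups of channels, used as the nonlinearity of the channel mixer. The polynomial orders, group count, random initialization, and the `1 + |Q(x)|` denominator are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn


class GroupRationalActivation(nn.Module):
    """Learnable rational activation P(x) / (1 + |Q(x)|), with one set of
    coefficients shared by every channel in a group (illustrative sketch)."""

    def __init__(self, dim, groups=8, p_order=5, q_order=4):
        super().__init__()
        assert dim % groups == 0
        self.groups = groups
        # Numerator / denominator coefficients per group.
        # Orders and random init are assumptions, not the paper's settings.
        self.p = nn.Parameter(0.1 * torch.randn(groups, p_order + 1))
        self.q = nn.Parameter(0.1 * torch.randn(groups, q_order))

    def forward(self, x):
        shape = x.shape
        g = self.groups
        xg = x.reshape(*shape[:-1], g, shape[-1] // g)

        # Horner evaluation of P(x) and Q(x); coefficients broadcast per group.
        num = torch.zeros_like(xg)
        for c in self.p.unbind(dim=-1):
            num = num * xg + c.view(g, 1)
        den = torch.zeros_like(xg)
        for c in self.q.unbind(dim=-1):
            den = den * xg + c.view(g, 1)

        # 1 + |Q(x)| keeps the denominator away from zero ("safe" rational form).
        return (num / (1.0 + den.abs())).reshape(shape)


class GRKANMixer(nn.Module):
    """Channel-mixer sketch: linear -> group rational activation -> linear."""

    def __init__(self, dim, hidden_ratio=4, groups=8):
        super().__init__()
        hidden = dim * hidden_ratio
        self.fc1 = nn.Linear(dim, hidden)
        self.act = GroupRationalActivation(hidden, groups=groups)
        self.fc2 = nn.Linear(hidden, dim)

    def forward(self, x):
        return self.fc2(self.act(self.fc1(x)))


# Shapes match the usual ViT token layout: (batch, tokens, dim).
tokens = torch.randn(2, 197, 192)
print(GRKANMixer(192)(tokens).shape)  # torch.Size([2, 197, 192])
```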

## Usage
The model definition is in `katransformer.py` at https://github.com/Adamdad/kat.

```python
from urllib.request import urlopen

from PIL import Image
import timm
import torch

import katransformer  # model definitions from the repo above; registers KAT models with timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

# Run on GPU if available
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = timm.create_model("hf_hub:adamdad/kat_tiny_patch16_224.vitft", pretrained=True)
model = model.to(device)
model = model.eval()

# get model-specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0).to(device))  # unsqueeze single image into batch of 1

# Top-5 predicted ImageNet classes and their probabilities (in percent)
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
print(top5_probabilities)
print(top5_class_indices)
```
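
The checkpoint can also be used as an image feature extractor. The snippet below continues from the example above and assumes KAT follows timm's standard convention that `num_classes=0` removes the classifier head so the forward pass returns pooled embeddings; the printed embedding width is a guess based on typical tiny-sized backbones.

```python
# Feature / embedding extraction (continues from the snippet above).
# Assumes the standard timm convention: num_classes=0 drops the classifier
# head so the forward pass returns pooled features instead of logits.
feat_model = timm.create_model(
    "hf_hub:adamdad/kat_tiny_patch16_224.vitft",
    pretrained=True,
    num_classes=0,
).to(device).eval()

with torch.no_grad():
    embedding = feat_model(transforms(img).unsqueeze(0).to(device))

print(embedding.shape)  # e.g. torch.Size([1, 192]) for a tiny-sized backbone
```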

## Bibtex
```bibtex
@misc{yang2024compositional,
      title={Kolmogorov–Arnold Transformer},
      author={Xingyi Yang and Xinchao Wang},
      year={2024},
      eprint={XXXX},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```