Latest commit: "Upload pytorch_model.bin" (f88f630)
Files (size, last commit message):
- 1.52 kB, "initial commit"
- 5.5 kB, "Create README.md"
- 69.8 kB, "Upload config.json"
- pytorch_model.bin: 191 MB, "Upload pytorch_model.bin"
- 51.9 kB, "Upload 2 files"
pytorch_model.bin
Detected Pickle imports (27):
- "torch.nn.modules.activation.GELU",
- "torch.nn.modules.container.ModuleList",
- "torch.float32",
- "torch.nn.modules.sparse.Embedding",
- "torch.nn.modules.linear.Linear",
- "x_transformers.attend.EfficientAttentionConfig",
- "functools.partial",
- "x_transformers.autoregressive_wrapper.AutoregressiveWrapper",
- "torch._utils._rebuild_tensor_v2",
- "torch.nn.functional.softmax",
- "collections.OrderedDict",
- "x_transformers.x_transformers.TransformerWrapper",
- "x_transformers.x_transformers.FeedForward",
- "torch.nn.modules.dropout.Dropout",
- "x_transformers.x_transformers.TokenEmbedding",
- "torch.nn.modules.normalization.LayerNorm",
- "x_transformers.attend.create_causal_mask",
- "x_transformers.x_transformers.AbsolutePositionalEmbedding",
- "torch.FloatStorage",
- "x_transformers.x_transformers.Attention",
- "x_transformers.attend.Attend",
- "__builtin__.set",
- "torch.nn.modules.linear.Identity",
- "torch._utils._rebuild_parameter",
- "x_transformers.x_transformers.Decoder",
- "torch.nn.modules.container.Sequential",
- "x_transformers.x_transformers.Residual"
How to fix it?
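
Hugging Face's pickle warning typically points at re-saving the weights in the safetensors format, which stores plain tensors and needs no pickle imports at all. Below is a minimal sketch, assuming you trust the file enough to unpickle it once and that x_transformers is installed (the pickle imports reference its classes).

```python
# Sketch: convert the pickled checkpoint to safetensors.
import torch
from safetensors.torch import save_file

# weights_only=False is needed (recent torch versions default to True) because
# the checkpoint pickles whole modules (TransformerWrapper / AutoregressiveWrapper),
# not just a tensor state dict. Only do this for files you trust.
obj = torch.load("pytorch_model.bin", map_location="cpu", weights_only=False)

# Reduce whatever was pickled to a plain name -> tensor mapping.
state_dict = obj.state_dict() if isinstance(obj, torch.nn.Module) else obj

# safetensors expects contiguous, non-aliased tensors; .contiguous() copies
# non-contiguous ones (tensors sharing storage may additionally need .clone()).
state_dict = {k: v.contiguous() for k, v in state_dict.items()}

save_file(state_dict, "model.safetensors")
```

Reloading can then go through safetensors.torch.load_file, which reads raw tensors and never executes pickle code.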