**Upload 305 files** (commit af9251e)

This upload includes the NLTK Punkt sentence-tokenizer models listed below. For each `.pickle` file, the repository's pickle scanner reports the imports the file references when unpickled. Every model except `malayalam.pickle` and `russian.pickle` references the same three:

- `copy_reg._reconstructor`
- `__builtin__.object`
- `nltk.tokenize.punkt.PunktSentenceTokenizer`

`malayalam.pickle` references seven:

- `__builtin__.int`
- `__builtin__.set`
- `collections.defaultdict`
- `nltk.tokenize.punkt.PunktSentenceTokenizer`
- `nltk.tokenize.punkt.PunktLanguageVars`
- `nltk.tokenize.punkt.PunktParameters`
- `nltk.tokenize.punkt.PunktToken`

`russian.pickle` references the same seven, with `__builtin__.long` in place of `__builtin__.int`.

| File | Size |
| --- | --- |
| czech.pickle | 1.42 MB |
| danish.pickle | 1.43 MB |
| dutch.pickle | 840 kB |
| english.pickle | 495 kB |
| estonian.pickle | 1.8 MB |
| finnish.pickle | 2.19 MB |
| french.pickle | 664 kB |
| german.pickle | 1.71 MB |
| greek.pickle | 2.04 MB |
| italian.pickle | 749 kB |
| malayalam.pickle | 221 kB |
| norwegian.pickle | 1.42 MB |
| polish.pickle | 2.29 MB |
| portuguese.pickle | 740 kB |
| russian.pickle | 33 kB |
| slovene.pickle | 940 kB |
| spanish.pickle | 680 kB |
| swedish.pickle | 1.17 MB |
| turkish.pickle | 1.36 MB |