Latest commit: 'init' (02d13dc) by thomas-yanxin; every entry below carries the commit message 'init'.

| File | Size | Detected Pickle imports |
|------|------|-------------------------|
| PY3/ | - | - |
| - | 6.15 kB | - |
| - | 8.57 kB | - |
| czech.pickle | 1.42 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| danish.pickle | 1.43 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| dutch.pickle | 840 kB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| english.pickle | 495 kB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| estonian.pickle | 1.8 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| finnish.pickle | 2.19 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| french.pickle | 664 kB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| german.pickle | 1.71 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| greek.pickle | 2.04 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| italian.pickle | 749 kB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| malayalam.pickle | 221 kB | (7) __builtin__.int, nltk.tokenize.punkt.PunktSentenceTokenizer, nltk.tokenize.punkt.PunktLanguageVars, __builtin__.set, nltk.tokenize.punkt.PunktParameters, nltk.tokenize.punkt.PunktToken, collections.defaultdict |
| norwegian.pickle | 1.42 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| polish.pickle | 2.29 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| portuguese.pickle | 740 kB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| russian.pickle | 33 kB | (7) nltk.tokenize.punkt.PunktSentenceTokenizer, __builtin__.long, nltk.tokenize.punkt.PunktLanguageVars, __builtin__.set, nltk.tokenize.punkt.PunktParameters, nltk.tokenize.punkt.PunktToken, collections.defaultdict |
| slovene.pickle | 940 kB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| spanish.pickle | 680 kB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| swedish.pickle | 1.17 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
| turkish.pickle | 1.36 MB | (3) copy_reg._reconstructor, __builtin__.object, nltk.tokenize.punkt.PunktSentenceTokenizer |
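
The filenames and the recurring nltk.tokenize.punkt.PunktSentenceTokenizer import indicate that these appear to be the pre-trained Punkt sentence-tokenizer models distributed with NLTK. As a minimal sketch of how such a pickle is typically used, assuming the stock NLTK API (the resource path and sample text below are illustrative, not taken from this repository):

```python
# Minimal sketch: load a Punkt sentence tokenizer via NLTK's data loader and
# split text into sentences. Uses the copy fetched by nltk.download("punkt"),
# not necessarily the files listed above.
import nltk

nltk.download("punkt")  # downloads the standard punkt models into nltk_data

tokenizer = nltk.data.load("tokenizers/punkt/english.pickle")
print(tokenizer.tokenize("This is a sentence. Here is another one."))
# ['This is a sentence.', 'Here is another one.']
```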
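
The "Detected Pickle imports" column presumably comes from a static scan of each file's opcode stream rather than from loading it. The copy_reg and __builtin__ names also suggest these pickles were written under Python 2, which supports pickle protocols only up to 2 and therefore records class references with the GLOBAL opcode. A rough standard-library sketch of that kind of scan follows; the helper name is mine, and this is an approximation of the idea, not the Hub's actual scanner.

```python
# Rough sketch: enumerate the module.name references a pickle would import,
# without executing it. GLOBAL/INST cover protocol <= 2 pickles like these;
# protocol 4+ pickles also use STACK_GLOBAL, which would need extra handling.
import pickletools

def detected_pickle_imports(path):
    refs = set()
    with open(path, "rb") as f:
        for opcode, arg, _pos in pickletools.genops(f):
            if opcode.name in ("GLOBAL", "INST"):
                refs.add(arg.replace(" ", "."))  # arg is "module name"
    return sorted(refs)

# For english.pickle above, this kind of scan would be expected to report
# copy_reg._reconstructor, __builtin__.object and
# nltk.tokenize.punkt.PunktSentenceTokenizer.
```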