Latest commit: Upload 23 files (f980dad, verified)

| File | Size | Last commit | Detected pickle imports |
|------|------|-------------|-------------------------|
| - | 963 kB | Upload 23 files | |
| - | 3.85 MB | Upload 23 files | |
| - | 11.6 MB | Upload 23 files | |
| - | 7.38 kB | Upload 23 files | |
| - | 4.41 kB | Upload 23 files | |
| - | 345 MB | Upload 23 files | |
| - | 818 kB | Upload 23 files | |
| - | 481 kB | Upload 23 files | |
| - | 1.92 MB | Upload 23 files | |
| feature_scaler.pkl | 6.76 kB | Upload 23 files | 5: numpy.dtype, joblib.numpy_pickle.NumpyArrayWrapper, numpy.ndarray, numpy.core.multiarray.scalar, sklearn.preprocessing._data.StandardScaler |
| isolation_forest.pkl | 402 kB | Upload 23 files | 6: numpy.dtype, joblib.numpy_pickle.NumpyArrayWrapper, sklearn.tree._classes.ExtraTreeRegressor, numpy.ndarray, sklearn.tree._tree.Tree, sklearn.ensemble._iforest.IsolationForest |
| label_encoder.pkl | 460 Bytes | Upload 23 files | 4: numpy.dtype, numpy.core.multiarray._reconstruct, sklearn.preprocessing._label.LabelEncoder, numpy.ndarray |
| lightgbm_model.pkl | 1.01 MB | Upload 23 files | 8: numpy.dtype, lightgbm.basic.Booster, joblib.numpy_pickle.NumpyArrayWrapper, collections.OrderedDict, numpy.ndarray, numpy.core.multiarray.scalar, lightgbm.sklearn.LGBMClassifier, collections.defaultdict |
| mlp_model.pkl | 669 kB | Upload 23 files | 5: numpy.dtype, joblib.numpy_pickle.NumpyArrayWrapper, sklearn.preprocessing._label.LabelBinarizer, numpy.ndarray, sklearn.neural_network._multilayer_perceptron.MLPClassifier |
| randomforest_model.pkl | 2.96 MB | Upload 23 files | 5: numpy.dtype, sklearn.ensemble._forest.RandomForestClassifier, joblib.numpy_pickle.NumpyArrayWrapper, numpy.ndarray, sklearn.tree._classes.DecisionTreeClassifier |
| - | 37.7 kB | Upload 23 files | |
| - | 481 kB | Upload 23 files | |
| - | 1.92 MB | Upload 23 files | |
| - | 900 kB | Upload 23 files | |
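
The scan results show most of the model artifacts wrapping their arrays in joblib.numpy_pickle.NumpyArrayWrapper, which suggests they were written with joblib rather than plain pickle. Below is a rough sketch of how such a bundle of artifacts is typically consumed; joblib as the serializer, the feature layout, and the inference flow are all assumptions, not something taken from the repository itself.

```python
# Hedged sketch: file names come from the listing above; everything else
# (joblib as the loader, the input shape, the prediction flow) is assumed.
# Reminder: unpickling runs arbitrary code, so only load files you trust.
import joblib
import numpy as np

scaler = joblib.load("feature_scaler.pkl")          # sklearn StandardScaler
label_encoder = joblib.load("label_encoder.pkl")    # sklearn LabelEncoder
iforest = joblib.load("isolation_forest.pkl")       # sklearn IsolationForest
lgbm = joblib.load("lightgbm_model.pkl")            # lightgbm LGBMClassifier
mlp = joblib.load("mlp_model.pkl")                  # sklearn MLPClassifier
rf = joblib.load("randomforest_model.pkl")          # sklearn RandomForestClassifier

# Illustrative inference pass: scale a random batch, flag outliers, and map
# each classifier's integer predictions back to label names.
X = scaler.transform(np.random.rand(4, scaler.n_features_in_))
anomaly_flags = iforest.predict(X)                  # +1 = inlier, -1 = outlier
for name, model in [("lightgbm", lgbm), ("mlp", mlp), ("random_forest", rf)]:
    labels = label_encoder.inverse_transform(model.predict(X))
    print(name, labels.tolist(), "anomaly flags:", anomaly_flags.tolist())
```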
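
The "Detected Pickle imports" lists come from scanning the pickle opcode stream for the module and class references that get imported when a file is unpickled, which is exactly why untrusted pickles are risky. Something similar can be reproduced locally with the standard library, without executing anything in the file. The sketch targets label_encoder.pkl because its scan shows no joblib array wrapper, so it is most likely a plain pickle stream; the joblib-wrapped artifacts interleave raw array bytes with the pickle data and may not disassemble cleanly.

```python
# Disassemble a pickle without loading it. pickletools prints one line per
# opcode; the GLOBAL / STACK_GLOBAL entries (and the strings pushed just
# before them) correspond to the imports reported in the scan above.
import pickletools

with open("label_encoder.pkl", "rb") as f:   # smallest artifact in the listing
    pickletools.dis(f.read())
```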
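
Warnings like these are usually addressed by re-publishing the estimators in a format that does not execute arbitrary code at load time. One common route for scikit-learn style objects is the skops format; the sketch below is only an assumption about how that conversion could look (skops, the .skops file name, and the trusted-types handling are introduced here for illustration and are not part of this repository).

```python
# Hedged conversion sketch: load the trusted original once, then re-save it in
# the skops format so downstream users no longer need to unpickle anything.
import joblib
from skops.io import dump, load, get_untrusted_types

model = joblib.load("lightgbm_model.pkl")             # one-time trusted load
dump(model, "lightgbm_model.skops")                   # re-export without pickle

# Consumers can inspect which types the archive references before loading it,
# and only then opt in to the ones they consider safe.
unknown_types = get_untrusted_types(file="lightgbm_model.skops")
print("needs explicit trust:", unknown_types)
restored = load("lightgbm_model.skops", trusted=unknown_types)
```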