html_url (string, lengths 48–51) | title (string, lengths 5–268) | comments (string, lengths 70–51.8k) | body (string, lengths 0–29.8k) | comment_length (int64, 16–1.52k) | text (string, lengths 164–54.1k) | embeddings (sequence) |
---|---|---|---|---|---|---|
https://github.com/huggingface/datasets/issues/620 | map/filter multiprocessing raises errors and corrupts datasets | I didn't -- it was a new sessions --- buuut - look like it's working today -- woot! I'll close this issue. Thanks @lhoestq | After upgrading to the 1.0 started seeing errors in my data loading script after enabling multiprocessing.
```python
...
ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed)
ner_ds_dict["validation"] = ner_ds_dict["test"]
rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed)
rel_ds_dict["validation"] = rel_ds_dict["test"]
return ner_ds_dict, rel_ds_dict
```
The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable.
The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but if selected from or sliced into into returns an empty dictionary. eg `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`.
Ok I think I know the problem -- the rel_ds was mapped though a mapper with `num_proc=12`. If I remove `num_proc`. The dataset loads.
I also see errors with other map and filter functions when `num_proc` is set.
```
Done writing 67 indices in 536 bytes .
Done writing 67 indices in 536 bytes .
Fatal Python error: PyCOND_WAIT(gil_cond) failed
``` | 24 | map/filter multiprocessing raises errors and corrupts datasets
After upgrading to the 1.0 started seeing errors in my data loading script after enabling multiprocessing.
```python
...
ner_ds_dict = ner_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed)
ner_ds_dict["validation"] = ner_ds_dict["test"]
rel_ds_dict = rel_ds.train_test_split(test_size=test_pct, shuffle=True, seed=seed)
rel_ds_dict["validation"] = rel_ds_dict["test"]
return ner_ds_dict, rel_ds_dict
```
The first train_test_split, `ner_ds`/`ner_ds_dict`, returns a `train` and `test` split that are iterable.
The second, `rel_ds`/`rel_ds_dict` in this case, returns a Dataset dict that has rows but if selected from or sliced into into returns an empty dictionary. eg `rel_ds_dict['train'][0] == {}` and `rel_ds_dict['train'][0:100] == {}`.
Ok I think I know the problem -- the rel_ds was mapped though a mapper with `num_proc=12`. If I remove `num_proc`. The dataset loads.
I also see errors with other map and filter functions when `num_proc` is set.
```
Done writing 67 indices in 536 bytes .
Done writing 67 indices in 536 bytes .
Fatal Python error: PyCOND_WAIT(gil_cond) failed
```
I didn't -- it was a new sessions --- buuut - look like it's working today -- woot! I'll close this issue. Thanks @lhoestq | [
-0.4389113188, -0.1553176045, -0.0303599108, … (768 values in total) ] |
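A minimal sketch of the pattern discussed in the row above: `map` with `num_proc`, then `train_test_split`, then indexing a split. The toy data, column names, and parameter values here are illustrative assumptions, not the reporter's original script; the workaround reported in the thread was simply to drop `num_proc`.

```python
# Sketch of the map(num_proc=...) -> train_test_split -> indexing pattern from the issue above.
# Toy data and parameter values are assumptions for illustration only.
from datasets import Dataset

ds = Dataset.from_dict({"tokens": [["a", "b", "c"]] * 100, "label": [0] * 100})

# The reported failure mode showed up when num_proc was set on map/filter;
# removing num_proc was the workaround mentioned in the thread.
ds = ds.map(lambda ex: {"n_tokens": len(ex["tokens"])}, num_proc=2)

ds_dict = ds.train_test_split(test_size=0.2, shuffle=True, seed=42)
ds_dict["validation"] = ds_dict["test"]

print(ds_dict["train"][0])     # expected: a populated example dict, not {}
print(ds_dict["train"][0:10])  # expected: a dict mapping column names to lists of values
```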
https://github.com/huggingface/datasets/issues/619 | Mistakes in MLQA features names | Indeed you're right ! Thanks for reporting that
Could you open a PR to fix the features names ? | I think the following features in MLQA shouldn't be named the way they are:
1. `questions` (should be `question`)
2. `ids` (should be `id`)
3. `start` (should be `answer_start`)
The reasons I'm suggesting these features be renamed are:
* To make them consistent with other QA datasets like SQuAD, XQuAD, TyDiQA etc. and hence make it easier to concatenate multiple QA datasets.
* The features names are not the same as the ones provided in the original MLQA datasets (it uses the names I suggested).
I know these columns can be renamed using using `Dataset.rename_column_`, `questions` and `ids` can be easily renamed but `start` on the other hand is annoying to rename since it's nested inside the feature `answers`.
| 19 | Mistakes in MLQA features names
I think the following features in MLQA shouldn't be named the way they are:
1. `questions` (should be `question`)
2. `ids` (should be `id`)
3. `start` (should be `answer_start`)
The reasons I'm suggesting these features be renamed are:
* To make them consistent with other QA datasets like SQuAD, XQuAD, TyDiQA etc. and hence make it easier to concatenate multiple QA datasets.
* The features names are not the same as the ones provided in the original MLQA datasets (it uses the names I suggested).
I know these columns can be renamed using using `Dataset.rename_column_`, `questions` and `ids` can be easily renamed but `start` on the other hand is annoying to rename since it's nested inside the feature `answers`.
Indeed you're right ! Thanks for reporting that
Could you open a PR to fix the features names ? | [
0.2796129584, -0.0672783554, -0.0831842944, … (768 values in total) ] |
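The row above points out that flat columns are easy to rename but the nested `answers.start` field is not. A sketch of one way to do the nested rename with `map` follows; the config name `mlqa.en.en`, the use of `rename_column` rather than the older in-place `rename_column_`, and the assumption that the features still carry the old names are all mine, not taken from the issue.

```python
# Sketch: rename MLQA's nested answers.start to answers.answer_start via map,
# and the flat columns via rename_column. Assumes the old feature names are still present.
from datasets import load_dataset

mlqa = load_dataset("mlqa", "mlqa.en.en", split="validation")  # config name is an assumption

def rename_nested(example):
    # Rebuild the nested feature with the SQuAD-style field name.
    return {"answers": {"text": example["answers"]["text"],
                        "answer_start": example["answers"]["start"]}}

# remove_columns drops the old "answers" before the mapped output re-adds it under the new schema.
mlqa = mlqa.map(rename_nested, remove_columns=["answers"])
mlqa = mlqa.rename_column("questions", "question")
mlqa = mlqa.rename_column("ids", "id")
```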
https://github.com/huggingface/datasets/issues/617 | Compare different Rouge implementations | Updates - the differences between the following three
(1) https://github.com/bheinzerling/pyrouge (previously popular. The one I trust the most)
(2) https://github.com/google-research/google-research/tree/master/rouge
(3) https://github.com/pltrdy/files2rouge (used in fairseq)
can be explained by two things, stemming and handling multiple sentences.
Stemming:
(1), (2): default is no stemming. (3): default is with stemming ==> No stemming is the correct default as you did [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py#L84)
Multiple sentences:
(1) `rougeL` splits text using `\n`
(2) `rougeL` ignores `\n`.
(2) `rougeLsum` splits text using `\n`
(3) `rougeL` splits text using `.`
For (2), `rougeL` and `rougeLsum` are identical if the sequence doesn't contain `\n`. With `\n`, it is `rougeLsum` that matches (1) not `rougeL`.
Overall, and as far as I understand, for your implementation here https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py#L65 to match the default, you only need to change `rougeL` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py#L86) to `rougeLsum` to correctly compute metrics for text with newlines.
Tagging @sshleifer who might be interested. | I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation, [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq).
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
| 145 | Compare different Rouge implementations
I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation, [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq).
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
Updates - the differences between the following three
(1) https://github.com/bheinzerling/pyrouge (previously popular. The one I trust the most)
(2) https://github.com/google-research/google-research/tree/master/rouge
(3) https://github.com/pltrdy/files2rouge (used in fairseq)
can be explained by two things, stemming and handling multiple sentences.
Stemming:
(1), (2): default is no stemming. (3): default is with stemming ==> No stemming is the correct default as you did [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py#L84)
Multiple sentences:
(1) `rougeL` splits text using `\n`
(2) `rougeL` ignores `\n`.
(2) `rougeLsum` splits text using `\n`
(3) `rougeL` splits text using `.`
For (2), `rougeL` and `rougeLsum` are identical if the sequence doesn't contain `\n`. With `\n`, it is `rougeLsum` that matches (1) not `rougeL`.
Overall, and as far as I understand, for your implementation here https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py#L65 to match the default, you only need to change `rougeL` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py#L86) to `rougeLsum` to correctly compute metrics for text with newlines.
Tagging @sshleifer who might be interested. | [
-0.0971114859, -0.1454414278, -0.1018337607, … (768 values in total) ] |
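The comparison in the row above turns on how each implementation treats `\n`. A small check against implementation (2), the google-research `rouge_score` package, is sketched below; the example strings are made up.

```python
# Sketch comparing rougeL (ignores "\n") with rougeLsum (splits on "\n"),
# using the google-research rouge_score package. Example strings are invented.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rougeL", "rougeLsum"], use_stemmer=False)

reference = "The cat sat on the mat.\nIt was a sunny day."
prediction = "The cat sat on the mat.\nThe day was sunny."

scores = scorer.score(reference, prediction)
print(scores["rougeL"])     # whole text treated as a single sequence
print(scores["rougeLsum"])  # split on "\n", sentence-level LCS scores aggregated
```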
https://github.com/huggingface/datasets/issues/617 | Compare different Rouge implementations | This is a real issue, sorry for missing the mention @ibeltagy
We implemented a more involved [solution](https://github.com/huggingface/transformers/blob/99cb924bfb6c4092bed9232bea3c242e27c6911f/examples/seq2seq/utils.py#L481) that enforces that sentences are split with `\n` so that rougeLsum scores match papers even if models don't generate newlines.
Unfortunately, the best/laziest way I found to do this introduced an `nltk` dependency (For sentence splitting, all sentences don't end in `.`!!!), but this might be avoidable with some effort.
#### Sidebar: Wouldn't Deterministic Be Better?
`rouge_scorer.scoring.BootstrapAggregator` is well named but is not deterministic which I would like to change for my mental health, unless there is some really good reason to sample 500 observations before computing f-scores.
I have a fix on a branch, but I wanted to get some context before introducting a 4th way to compute rouge. Scores are generally within .03 Rouge2 of boostrap after multiplying by 100, e.g 22.05 vs 22.08 Rouge2.
| I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation, [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq).
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
| 144 | Compare different Rouge implementations
I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation, [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq).
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
This is a real issue, sorry for missing the mention @ibeltagy
We implemented a more involved [solution](https://github.com/huggingface/transformers/blob/99cb924bfb6c4092bed9232bea3c242e27c6911f/examples/seq2seq/utils.py#L481) that enforces that sentences are split with `\n` so that rougeLsum scores match papers even if models don't generate newlines.
Unfortunately, the best/laziest way I found to do this introduced an `nltk` dependency (For sentence splitting, all sentences don't end in `.`!!!), but this might be avoidable with some effort.
#### Sidebar: Wouldn't Deterministic Be Better?
`rouge_scorer.scoring.BootstrapAggregator` is well named but is not deterministic which I would like to change for my mental health, unless there is some really good reason to sample 500 observations before computing f-scores.
I have a fix on a branch, but I wanted to get some context before introducting a 4th way to compute rouge. Scores are generally within .03 Rouge2 of boostrap after multiplying by 100, e.g 22.05 vs 22.08 Rouge2.
| [
-0.1379877627, 0.0537791997, -0.0650966167, … (768 values in total) ] |
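The workaround described in the row above re-inserts newlines before scoring so that `rougeLsum` sees one sentence per line. A sketch of that preprocessing with `nltk` follows; it mirrors the idea, not the exact `transformers` utility linked above.

```python
# Sketch of nltk-based sentence splitting before rougeLsum scoring.
# Mirrors the idea described above, not the exact transformers utility.
import nltk
from rouge_score import rouge_scorer

nltk.download("punkt", quiet=True)

def add_newlines(text: str) -> str:
    # rougeLsum splits summaries on "\n", so make sentence boundaries explicit.
    return "\n".join(nltk.sent_tokenize(text))

scorer = rouge_scorer.RougeScorer(["rougeLsum"], use_stemmer=False)
pred = "The model performs well. It beats the baseline by two points."
ref = "The model outperforms the baseline. The margin is two points."
print(scorer.score(add_newlines(ref), add_newlines(pred))["rougeLsum"])
```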
https://github.com/huggingface/datasets/issues/617 | Compare different Rouge implementations | > This is a real issue, sorry for missing the mention @ibeltagy
>
> We implemented a more involved [solution](https://github.com/huggingface/transformers/blob/99cb924bfb6c4092bed9232bea3c242e27c6911f/examples/seq2seq/utils.py#L481) that enforces that sentences are split with `\n` so that rougeLsum scores match papers even if models don't generate newlines.
>
> Unfortunately, the best/laziest way I found to do this introduced an `nltk` dependency (for sentence splitting: not all sentences end in `.`!), but this might be avoidable with some effort.
Thanks for the details, I didn't know about that. Maybe we should consider adding this processing step, or at least mentioning it somewhere in the library or the documentation.
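A minimal sketch of that preprocessing step, assuming the `nltk` and `rouge_score` packages (the helper name and the toy strings below are only illustrative, not the exact code in `examples/seq2seq/utils.py`):
```
import nltk
from rouge_score import rouge_scorer

nltk.download("punkt", quiet=True)  # sentence tokenizer models used below

def add_newlines(text: str) -> str:
    # rougeLsum only treats "\n" as a sentence boundary, so re-split text
    # that a model generated without newlines before scoring it.
    return "\n".join(nltk.sent_tokenize(text))

reference = "The cat sat on the mat. It was a sunny day."      # toy example
prediction = "A cat was sitting on a mat. The day was sunny."  # toy example

scorer = rouge_scorer.RougeScorer(["rougeLsum"], use_stemmer=True)
print(scorer.score(add_newlines(reference), add_newlines(prediction)))
```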
> #### Sidebar: Wouldn't Deterministic Be Better?
> `rouge_scorer.scoring.BootstrapAggregator` is well named but is not deterministic, which I would like to change for my mental health, unless there is some really good reason to sample 500 observations before computing f-scores.
>
> I have a fix on a branch, but I wanted to get some context before introducing a 4th way to compute rouge. Scores are generally within .03 Rouge2 of bootstrap after multiplying by 100, e.g. 22.05 vs 22.08 Rouge2.
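For comparison, a rough sketch of the two aggregation options being discussed, assuming the `rouge_score` package (the sentence pairs are toy data):
```
from rouge_score import rouge_scorer, scoring

pairs = [
    ("the cat sat on the mat", "a cat sat on the mat"),
    ("hello there general kenobi", "hello there"),
]  # toy (target, prediction) pairs

scorer = rouge_scorer.RougeScorer(["rouge2"], use_stemmer=True)
per_example = [scorer.score(target, pred) for target, pred in pairs]

# Bootstrap aggregation: resamples the per-example scores, so repeated runs
# can give slightly different results and confidence intervals.
aggregator = scoring.BootstrapAggregator()
for scores in per_example:
    aggregator.add_scores(scores)
print(aggregator.aggregate()["rouge2"].mid.fmeasure)

# Deterministic alternative: a plain mean over the per-example f-scores.
print(sum(s["rouge2"].fmeasure for s in per_example) / len(per_example))
```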
I think the default `n_samples` of the aggregator is 1000. We could increase it or at least allow users to change it if they want more precise results. | I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation: [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq.
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
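For reference, the `datasets` numbers above can be reproduced along these lines (a sketch with toy strings; as far as I know the metric wraps the google-research `rouge_score` scorer rather than the perl script):
```
from datasets import load_metric

rouge = load_metric("rouge")
result = rouge.compute(
    predictions=["a cat sat on the mat"],   # toy strings, not real summaries
    references=["the cat sat on the mat"],
)
print(result["rougeL"].mid.fmeasure)
```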
| 210 | Compare different Rouge implementations
I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation: [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq.
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
> This is a real issue, sorry for missing the mention @ibeltagy
>
> We implemented a more involved [solution](https://github.com/huggingface/transformers/blob/99cb924bfb6c4092bed9232bea3c242e27c6911f/examples/seq2seq/utils.py#L481) that enforces that sentences are split with `\n` so that rougeLsum scores match papers even if models don't generate newlines.
>
> Unfortunately, the best/laziest way I found to do this introduced an `nltk` dependency (for sentence splitting: not all sentences end in `.`!), but this might be avoidable with some effort.
Thanks for the details, I didn't know about that. Maybe we should consider adding this processing step, or at least mentioning it somewhere in the library or the documentation.
> #### Sidebar: Wouldn't Deterministic Be Better?
> `rouge_scorer.scoring.BootstrapAggregator` is well named but is not deterministic, which I would like to change for my mental health, unless there is some really good reason to sample 500 observations before computing f-scores.
>
> I have a fix on a branch, but I wanted to get some context before introducing a 4th way to compute rouge. Scores are generally within .03 Rouge2 of bootstrap after multiplying by 100, e.g. 22.05 vs 22.08 Rouge2.
I think the default `n_samples` of the aggregator is 1000. We could increase it or at least allow users to change it if they want more precise results. | [
-0.1199244112,
0.0578209162,
-0.0259994268,
0.2475374341,
-0.0194165334,
-0.711432457,
-0.0715661049,
-0.0210266709,
-0.2410771549,
0.3681192994,
-0.0042180419,
0.1752643585,
0.1229401678,
0.3598643243,
0.2975477278,
-0.1921320707,
0.0986103714,
-0.0162562653,
-0.049981501,
-0.0920573995,
-0.0059646741,
0.6030073762,
-0.0572605729,
0.1056355834,
-0.1943649352,
0.143197313,
0.1966952384,
0.0866383687,
-0.1192162782,
-0.1853498966,
-0.2451322973,
0.2077442408,
-0.0349604338,
0.4523562491,
-0.0001091167,
-0.2276299745,
0.2159284353,
-0.2167396694,
0.1104494184,
-0.1605564356,
0.0378191844,
-0.1098200455,
-0.0797065198,
-0.0814655498,
-0.2882641852,
0.0072497837,
-0.1965865642,
0.0967683494,
0.1674870402,
0.3560513258,
0.170536831,
-0.0147535354,
0.1806711107,
-0.2480340451,
0.5668404102,
-0.0728998259,
0.0628357381,
0.611348331,
-0.0754707158,
-0.0680767074,
-0.3386307955,
0.3610009849,
0.1657429636,
-0.3372176886,
0.0628842786,
0.1117007285,
-0.0137627795,
-0.1807056963,
-0.0311420076,
0.6794781089,
-0.1486431658,
0.0029161931,
-0.4949690998,
-0.09837953,
0.0643407851,
-0.3069672585,
0.1008421183,
0.0712011531,
0.0650979131,
-0.1525475681,
-0.1313441098,
0.2446853518,
-0.1071734428,
-0.1755683124,
-0.1494296938,
-0.0151285082,
-0.0904636681,
-0.0794898123,
0.3360756636,
-0.0673166811,
-0.5943484306,
0.3752996325,
-0.1992858946,
-0.1545650214,
-0.4956995547,
-0.0426405482,
0.1276842207,
0.433683604,
0.302570343,
0.445898205,
0.3535337448,
0.0588149726,
0.1667740047,
0.0529164746,
-0.228267774,
0.0254596211,
-0.0179460794,
-0.0743468031,
0.4144559205,
0.1811348051,
0.0121966451,
0.1405445635,
0.4187541306,
-0.5451744199,
-0.1845689714,
0.1008501798,
-0.3514789343,
-0.3432896435,
-0.3676083982,
0.187201947,
-0.5367393494,
-0.0158152245,
-0.0174540281,
0.196838811,
-0.1532121748,
0.2220282555,
0.0514643043,
0.0718089044,
-0.334879905,
-0.3064835072,
-0.1950989217,
0.0791319087,
-0.1510115266,
0.2155081034,
0.0868386477,
-0.0100459792,
0.2042093426,
0.0597339906,
0.1382121891,
-0.3667333126,
0.0484399721,
-0.2148969024,
-0.1565047503,
-0.0596127063,
-0.1266813874,
-0.2758528292,
-0.0598978922,
-0.2508654296,
-0.1477452815,
-0.1589675099,
-0.1754632145,
-0.0020287037,
0.4212627411,
0.2016869038,
-0.2421362102,
-0.1874316335,
-0.0104133971,
0.293857038,
-0.1305385679,
-0.0845899507,
0.1509461403,
-0.2787757516,
-0.324457407,
-0.0561416671,
0.3232985735,
0.2026692629,
-0.1975644678,
-0.2477599233,
0.6061240435,
0.0403279923,
0.033667095,
0.380957514,
0.1653751135,
0.3888625503,
0.0549484566,
-0.0502893478,
0.1929952055,
-0.2700240016,
-0.24221614,
-0.0212493651,
0.0832389444,
-0.0768049657,
0.1449464411,
-0.1102798432,
0.1698947847,
0.0691121668,
0.1431255043,
0.2125445008,
0.1052324325,
0.0390274972,
-0.1694837213,
0.023125127,
0.2083105594,
0.1826230586,
-0.0441776402,
-0.2475353032,
-0.0602214448,
-0.5506240129,
0.3593120873,
-0.2336246371,
-0.2559226751,
0.3169038892,
0.0317748077,
0.13310799,
0.2627289891,
0.0109734982,
0.1177087426,
-0.0579706877,
-0.8542648554,
0.2441547811,
0.0617072433,
-0.1692024171,
-0.3705712557,
-0.1805853546,
0.3003234565,
-0.437109679,
0.1600411534,
-0.0143352821,
0.3412573338,
0.2341624796,
0.0369939506,
0.4147161245,
-0.117359668,
0.1140119135,
-0.4648655653,
0.0019421205,
-0.009559039,
-0.2148297578,
0.3159088492,
0.4534548521,
0.1284362227,
0.1765094101,
-0.0494393855,
0.4062877297,
0.3399906456,
0.1041297838,
0.1333082914,
0.0938195884,
0.0694995448,
-0.0890281796,
-0.1304591149,
0.5995198488,
0.124730736,
0.1686526537,
-0.3227333724,
0.598003149,
-0.3787483573,
0.046395231,
0.1247787178,
-0.1300301105,
0.1045123264,
0.1057683527,
-0.3970209956,
0.0756275952,
0.4484715462,
0.1915376782,
0.5518531203,
0.0433865339,
-0.3257265389,
-0.0421127081,
0.3595893383,
-0.1903333068,
0.1797348559,
0.0371971875,
-0.0867276117,
-0.1773939133,
-0.0219608508,
-0.0783407539,
0.0025473982,
0.219588846,
0.0197211206,
-0.0007428713,
-0.1408707052,
-0.4390755296,
0.0764094368,
-0.1429208368,
-0.0605829582,
0.3043670952,
0.2423510551,
0.0617751591,
-0.4098846912,
-0.2927572131,
-0.3892459571,
-0.0420893803,
-0.1957098246,
0.1807436794,
-0.0944137201,
0.1418757737,
-0.1394219398,
-0.1282726973,
-0.5086535215,
-0.235935986,
0.2334479094,
0.0763521343,
-0.130482018,
-0.024574928,
0.303912282,
0.313480556,
0.0245653242,
0.0222488064,
0.0049759001,
-0.1759691089,
-0.2359922081,
0.1460829377,
-0.2856432199,
-0.1068562791,
0.083633326,
-0.3754405081,
-0.0247781351,
0.0332226753,
-0.4409694076,
0.2545733452,
-0.0748175085,
0.275578618,
0.2000905275,
-0.366266489,
-0.0426003002,
0.4201519489,
0.1592187434,
-0.312210381,
-0.2852805853,
0.2487557679,
-0.0951728225,
0.204825595,
-0.1195690185,
-0.41275841,
0.0095929485,
-0.1093280911,
0.4913552999,
0.1069438681,
0.0888677612,
0.2736889422,
-0.05428835,
0.0500338636,
-0.009820994,
0.2073386312,
-0.1839256734,
-0.0335970223,
0.342264086,
-0.3354376853,
-0.5135299563,
-0.1427980959,
0.0637116432,
-0.2013811022,
0.2057773322,
-0.1288752109,
-0.2995448112,
-0.0593247563,
-0.0508390926,
-0.0551187694,
0.0820388421,
0.4349642396,
-0.2504158914,
-0.0757331252,
-0.1027090177,
0.0838333368,
0.3216792643,
-0.1883358955,
0.2084261477,
-0.1588684916,
0.3865531385,
0.0372822955,
0.0425753593,
0.6114038229,
0.3491764963,
-0.1800513268,
-0.0786045492,
0.096094355,
-0.2694203258,
0.4766198695,
0.2780717611,
-0.0155188739,
-0.1733079106,
0.3527807295,
0.0544786081,
0.291913867,
-0.1228650361,
-0.0032973811,
-0.0022114292,
-0.4111375809,
0.085220933,
0.2045759559,
0.2007600367,
-0.2420282364,
0.2248526961,
0.0827758834,
-0.1478874385,
0.1690865755,
0.0599400438,
0.4045790434,
0.1620070338,
-0.2010130584,
0.2774576843,
-0.4924328029,
0.3232811391,
0.1231624037,
0.0037520118,
0.0346342921,
-0.2645380497,
-0.0398271903,
-0.1963595152,
0.2720934153,
-0.3343458474,
-0.0398721509,
0.0852229446,
-0.2270896435,
-0.2626543343,
-0.0394852757,
-0.466812104,
-0.4294920266,
0.1189718246,
0.025722824,
-0.4162072539,
0.1408768147,
0.294952333,
-0.0939941555,
-0.0376575142,
0.0707492977,
-0.3008877039,
-0.1250915378,
0.1245124564,
0.2273995727,
0.0153754056,
0.281619668,
-0.1882918179,
0.0350895002,
0.0939709023,
-0.1933540404,
0.2874558568,
0.3182861507,
0.3587850034,
-0.296759367,
-0.1858538389,
0.1272570789,
-0.1283438355,
-0.19191432,
0.3262425661,
-0.0692246407,
-0.4199405313,
0.2371009886,
-0.2747508883,
0.1988114268,
0.351726234,
-0.0434972979,
0.231982559,
-0.2294519544,
0.168238759,
-0.3555619717,
-0.1029997766,
-0.0069566704,
0.1039782763,
0.1005925834,
-0.1379530728,
0.3263966441,
0.2669791281,
-0.1933407187,
-0.4485455751,
-0.0590692051,
-0.2420428693,
0.4088695049,
0.0335080177,
0.9406026602,
0.1146585122,
0.215993315,
0.1509401351,
-0.2938795686,
0.485711962,
-0.4107206166,
0.1659611911,
-0.5914514065,
-0.1540631652,
-0.0236731172,
0.0152200684,
0.0090304017,
0.2254936993,
-0.1336909235,
0.5277870893,
0.1549178064,
0.2343394011,
0.0684799626,
0.2131848335,
0.0914744437,
-0.1363696158,
-0.2000810504,
0.1548395455,
-0.0257829651,
0.0350451469,
-0.0692337453,
0.0165992193,
-0.4950155616,
0.0172245651,
-0.0018319786,
0.0330699235,
-0.2493986487,
-0.0405306816,
0.2403572202,
-0.6468833685,
-0.3628727198,
0.4100300074,
0.3001852334,
0.3615040481,
-0.1065691859,
0.1137762815,
-0.2730525434,
0.2960500121,
0.2464797497,
-0.0022383183,
0.1830039322,
-0.0635612607,
-0.1858986318,
-0.0480877087,
0.1917341501,
-0.3508484066,
-0.290160656,
-0.0844339952,
-0.3248622119,
-0.1663080156,
0.3957933784,
-0.0780091286,
-0.0337801166,
-0.0825269967,
0.1380173564,
0.1615873277,
0.0219264999,
0.0516943112,
0.078588888,
-0.2714337111,
-0.1466720998,
0.0999818444,
-0.062063016,
0.0250698775,
-0.1733117849,
0.112014547,
-0.0616239794,
-0.2199271321,
-0.0949828923,
0.121655032,
-0.3167486489,
0.0629126281,
-0.0662865043,
-0.1874003261,
0.084854804,
0.4541495442,
0.1611186713,
-0.0432568677,
0.2365882844,
-0.2844912112,
-0.1792441905,
0.4154626429,
0.338075161,
-0.0232429504,
0.1925075948,
-0.0853181034,
-0.068743065,
0.0929522961,
-0.3633158803,
-0.2334937006,
-0.0807267278,
-0.0226744525,
-0.0575198457,
-0.5269919038,
0.0452140644,
0.1302617937,
0.071752958,
0.3796460629,
-0.0466157533,
-0.1393401623,
-0.294829607,
0.0863277763,
-0.0445367917,
0.2001894414,
-0.0187495798,
0.2959343791,
-0.0253695752,
-0.3763458729,
0.2337667495,
0.0076601282,
0.3363521695,
0.1284157038,
-0.2368224859,
0.3274031281,
-0.2537736297,
0.076803036,
0.0251888763,
0.1082151979,
0.0066505,
0.0160391312,
-0.1115240008,
-0.0190690756,
-0.0546855479,
0.2950346768,
0.2611951232,
0.0595225804,
0.3306888044,
-0.00116242,
-0.2251048684,
-0.2662790418,
0.5229460001,
0.2373593152,
-0.0539454594,
-0.2033505589,
-0.0982935727,
0.2226355821,
-0.2943924665,
0.0724181384,
0.0778273344,
-0.0986257195,
0.1442730278,
0.4975625873,
0.161487788,
-0.0648138747,
0.1025288403,
-0.0809709877,
-0.0199593809,
-0.0075993231,
0.0267789625,
-0.349716574,
0.2409505844,
0.0704932064,
0.5140093565,
-0.1148748249,
0.2195922881,
0.0879001543,
0.2013389915,
0.034309119,
0.0216833986,
0.4369773865,
0.0568814501,
0.1682256162,
-0.1973413825,
0.3928685784,
0.1391464472,
0.0145085026,
-0.1283886135,
0.2978076935,
-0.1089281291,
0.0387909971,
-0.3456052542,
-0.0703535527,
0.0274057314,
-0.4653732181,
-0.2823556662,
-0.2791446447,
-0.4714231193,
0.1094114035,
-0.1195451468,
0.0771695152,
0.1154010519,
0.1312566102,
0.1319939494,
-0.2858125269,
-0.3366743028,
-0.1074163467,
0.673306942,
-0.3037390113,
0.2952615023,
-0.0401768498,
0.2007912397,
0.1491844654,
0.3282279968,
0.0973955095,
0.0338948704,
-0.2017309815,
-0.0645841211,
-0.1578637362,
0.1661271006,
-0.0251391418,
0.1989767253,
0.2382317781,
-0.1389899999,
0.2985790074,
0.0918388516,
-0.133091867,
0.2325230092,
0.041057013,
0.3143247962,
-0.2229611576,
0.176420629,
0.1007426083,
0.2382913977,
-0.2092223167,
-0.1073141918,
-0.3077573776,
-0.1275486797,
-0.4239981472,
0.1715765893,
0.0421764292,
-0.1228060126,
0.1027342379,
0.1181353629,
0.1602745205,
-0.0267958827,
0.2585560381,
0.0192660093,
-0.0747789294,
-0.1981174052,
-0.0183101222,
-0.0648415834,
0.1657652259,
-0.2037338614,
0.3187338412,
-0.0221431181,
0.3684748411,
-0.0281962827,
0.2310346812,
-0.0696301311,
0.1423701942,
-0.320133239,
0.1295228601,
-0.3543375134,
-0.2733685374,
0.0231841132,
0.1113184467,
-0.0601809174,
-0.0097246356,
0.0663770214,
0.1673562229,
-0.3448731899,
0.1648920178,
0.1763140559,
0.2991234958,
0.4876224995,
0.2544558048,
-0.0168118142,
-0.3954349458,
-0.1899193972,
-0.0650434047,
-0.2933447659,
0.4509833455,
-0.2019187659,
0.3059763014,
0.2800118327,
-0.2772437334,
-0.0059781335,
0.7327029109,
0.0470348075,
-0.3717381656,
-0.1414088756,
-0.1147761047,
0.180014953,
-0.2592807412,
-0.3121393025,
0.4041659534,
-0.0878884345,
0.2905735672,
-0.0122393668,
-0.3303340673,
0.1637310833,
-0.3309813142,
0.2920036912,
-0.0520890392,
0.3888616264,
0.4497038722,
0.0056047942,
-0.3092067242,
0.0215134174,
0.0359866545,
0.1832431257,
0.0314044058,
0.12591438,
-0.0964608639,
-0.1132310778,
-0.2234428972,
0.4332584739,
0.044549305,
-0.0269824564,
0.0888876319,
-0.3598042727
] |
https://github.com/huggingface/datasets/issues/617 | Compare different Rouge implementations | Hi, thanks for the solution.
I am not sure if this is a bug, but on line [510](https://github.com/huggingface/transformers/blob/99cb924bfb6c4092bed9232bea3c242e27c6911f/examples/seq2seq/utils.py#L510), are pred, tgt supposed to be swapped? | I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation: [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq.
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
| 25 | Compare different Rouge implementations
I used RougeL implementation provided in `datasets` [here](https://github.com/huggingface/datasets/blob/master/metrics/rouge/rouge.py) and it gives numbers that match those reported in the pegasus paper but very different from those reported in other papers, [this](https://arxiv.org/pdf/1909.03186.pdf) for example.
Can you make sure the google-research implementation you are using matches the official perl implementation?
There are a couple of python wrappers around the perl implementation: [this](https://pypi.org/project/pyrouge/) has been commonly used, and [this](https://github.com/pltrdy/files2rouge) is used in fairseq.
There's also a python reimplementation [here](https://github.com/pltrdy/rouge) but its RougeL numbers are way off.
Hi, thanks for the solution.
I am not sure if this is a bug, but on line [510](https://github.com/huggingface/transformers/blob/99cb924bfb6c4092bed9232bea3c242e27c6911f/examples/seq2seq/utils.py#L510), are pred, tgt supposed to be swapped? | [
-0.046239879,
-0.3402344584,
-0.014098607,
0.379103899,
-0.0787524506,
-0.6177633405,
0.0081207678,
0.0337362289,
-0.3628860414,
0.3499984443,
-0.26577878,
-0.1302829385,
0.1970667839,
0.2637880445,
0.2604075074,
-0.2404888421,
0.087067835,
-0.0158298649,
-0.3073503375,
-0.1472105235,
-0.1258978546,
0.7282137275,
-0.118752569,
0.1422053874,
-0.1178554446,
0.3190754354,
0.0754644796,
0.142885536,
-0.0769109428,
0.0503386632,
-0.077392526,
0.1364056915,
0.0535057113,
0.6272997856,
-0.0001120195,
-0.0545935668,
0.047430601,
-0.1357920468,
0.0693165511,
-0.0228228718,
-0.1206794605,
-0.1080062389,
-0.1051930934,
-0.1453910023,
-0.2766058743,
-0.1439554691,
-0.2889887393,
0.2026911527,
0.2424776852,
0.4388734102,
0.183044523,
0.0371562615,
0.1999946833,
-0.1163284183,
0.5332652926,
-0.0113951638,
0.0095242485,
0.4414726198,
0.0845699757,
0.0121362284,
-0.1068576276,
0.2801780701,
0.2016874403,
-0.2245486677,
0.0831993371,
0.1899203062,
0.0646680146,
-0.0947259888,
0.0609513745,
0.49447155,
-0.0338066556,
-0.2444930673,
-0.4565924704,
0.0317239165,
-0.2688061893,
-0.3344593048,
0.0806371272,
0.0214644447,
0.3556173444,
-0.0614489764,
-0.2099110931,
0.0630048364,
-0.0412148982,
-0.2200862467,
-0.4216421843,
-0.058421284,
-0.1846104264,
0.0430523232,
0.1635530442,
-0.1405413896,
-0.56049335,
0.2374845147,
-0.1599107683,
-0.1976957023,
-0.3531376123,
-0.1151278242,
0.1373969615,
0.3342139423,
0.0371529013,
0.4631066322,
0.1254403889,
0.1216643006,
0.0856849998,
0.058696609,
-0.3227404058,
0.2932860255,
-0.1890388131,
0.1720276922,
0.3554430902,
0.0926143378,
0.0749708563,
0.0584119596,
0.4156937599,
-0.6989447474,
-0.2002398968,
0.0524521247,
-0.1671948135,
-0.4326246381,
-0.429902643,
-0.0271915831,
-0.176751852,
-0.0308979191,
0.1167363301,
0.1503124386,
-0.1495214403,
0.0424441881,
0.1605863571,
-0.005533766,
-0.5098375082,
-0.1614807844,
-0.2780582607,
0.1651213765,
-0.1665262431,
0.1229582205,
0.0699660927,
0.1001438126,
0.2009732425,
0.096145913,
0.1871935129,
-0.0009885058,
0.1759387553,
-0.0594163425,
-0.1663106084,
0.1371477544,
-0.1835514754,
-0.0147335343,
0.0321946666,
0.0244992077,
-0.1782719344,
-0.2502809763,
-0.4921046495,
-0.1060734168,
0.2954382002,
0.190746218,
-0.2942807972,
-0.0692167282,
-0.1578971744,
0.2298371494,
-0.035716068,
-0.5121923685,
0.0543524958,
-0.3018952608,
-0.3901175857,
-0.160663113,
0.3361693025,
0.1987191737,
-0.1800173223,
-0.1948148012,
0.5356314778,
-0.0672033504,
0.110706903,
0.4747157097,
0.049293071,
0.005550025,
-0.1745309085,
-0.2681949735,
0.1999695599,
-0.4523168802,
-0.3830573857,
0.0476569533,
0.0821939558,
-0.2138217986,
0.1366919875,
-0.1346651763,
-0.0554779395,
-0.1711652577,
0.1451754868,
0.0700771809,
0.0844068751,
0.0982006639,
-0.2546146512,
0.0316348448,
0.2501724362,
0.0316303596,
-0.0061232699,
-0.0144204274,
-0.012352176,
-0.4560584128,
0.1696413904,
-0.0568993799,
-0.1102577969,
0.3962090611,
-0.025027981,
0.0397523716,
0.3803076148,
0.1711031646,
-0.1741193235,
-0.0921760052,
-0.9340206385,
0.1069791615,
0.1560129225,
-0.0174842849,
-0.4241198301,
-0.2808051705,
0.3445938826,
-0.4135505259,
0.1371367425,
-0.0728049427,
0.3419044018,
0.1799040586,
0.2065045685,
0.303301543,
0.2343680114,
0.0536503233,
-0.3724865615,
0.110248372,
-0.0694721192,
-0.0858073682,
0.2577387989,
0.3375179172,
0.246300444,
-0.022180317,
-0.0713819116,
0.4225406945,
0.2885461748,
0.3854726255,
0.336746484,
0.4276492596,
0.2519123852,
-0.2109387964,
-0.1562891006,
0.5646334291,
0.1081747711,
0.0383512974,
-0.4680088162,
0.5065366626,
-0.2661363482,
-0.0121040642,
0.1127236709,
-0.0463253669,
0.0814128295,
0.0910871476,
-0.3921840489,
0.1565761417,
0.2917781174,
0.3464290798,
0.6304524541,
0.1406930238,
-0.096763581,
0.1004114449,
0.3997429311,
-0.101715602,
-0.0067248903,
0.1135701537,
-0.1569256485,
-0.2928292751,
0.1107683703,
-0.089380078,
0.0598919094,
0.2097008079,
0.1487045437,
0.0371915475,
-0.1481209248,
-0.3523477912,
0.0504463129,
-0.1629638523,
-0.2154396772,
0.1390167773,
0.1744086444,
0.035320159,
-0.2427976578,
-0.2507175803,
-0.2253890783,
0.1806871742,
-0.2774428427,
0.245785743,
0.003850244,
0.2574870586,
-0.0747097358,
-0.3186458051,
-0.3954649866,
-0.2954205871,
0.264813602,
0.0942244083,
-0.139735505,
0.0806233883,
0.2114581913,
0.1704194993,
0.1263625026,
-0.0609060675,
-0.2691189349,
0.067047134,
-0.3442385495,
0.1047260463,
-0.2446594983,
-0.0788620412,
0.0806714073,
-0.4296239614,
0.0975878909,
-0.1162440702,
-0.5498434901,
0.1776522398,
0.0356231295,
0.3770738244,
0.1881418079,
-0.3002070189,
-0.0184458196,
0.4710702896,
0.2002329826,
-0.3231211901,
-0.1657702327,
0.3012692332,
-0.2839274406,
0.2294912636,
-0.2235219777,
-0.3297276199,
0.0143192094,
-0.086290203,
0.2834469974,
0.1472924501,
0.092527777,
0.229897216,
0.064915061,
-0.1279924661,
0.387096107,
0.1898572147,
-0.3099277616,
-0.2520888448,
0.3893835843,
-0.3500760794,
-0.5373866558,
-0.2022143304,
0.0193116069,
-0.0161611978,
0.2700972259,
-0.1697933376,
-0.3686189651,
-0.3688878715,
-0.1145495027,
0.0201560371,
0.1319064796,
0.4350543618,
-0.171393007,
-0.1104350388,
-0.2714864016,
0.0526834801,
0.1724310666,
-0.0486957729,
0.2192721814,
-0.0858305693,
0.2024034113,
-0.0671012253,
0.3914038539,
0.4602386355,
0.1244401187,
-0.0473766252,
-0.1539420933,
0.2863890529,
-0.1996269971,
0.2588323057,
0.1851466149,
0.2490400523,
-0.061673671,
0.3672212958,
0.2207797766,
0.2248218358,
-0.1708764881,
0.2625939846,
-0.1587224752,
-0.3359984756,
-0.084134981,
0.397485584,
0.1898153126,
-0.0296097547,
0.0822152495,
-0.0941662639,
-0.1437634528,
0.2429081798,
0.1454965621,
0.3763532639,
0.2039704919,
-0.1141538545,
0.1987540722,
-0.6007113457,
0.4602782726,
-0.0555289797,
-0.0449600853,
-0.0817773342,
-0.1381321102,
-0.0442637578,
-0.2185064852,
0.2711828351,
-0.2302881628,
0.0479133539,
0.1509731263,
-0.4011393189,
-0.3824819326,
-0.097754471,
-0.4528390467,
-0.2298432291,
0.2018151879,
0.2029627562,
-0.537237227,
-0.0735696405,
0.0599506982,
-0.1079467461,
-0.1870277524,
0.2506587803,
-0.2709744573,
-0.2244588286,
0.2324507833,
-0.0348382704,
0.0112215728,
0.2210058272,
-0.2367371023,
0.1307561398,
0.2223873585,
-0.1823404431,
0.2928214073,
0.2518825531,
0.4831474721,
-0.23469612,
-0.1617090702,
0.1818935275,
-0.0874763951,
0.0406565145,
0.3343541026,
0.0864176303,
-0.2289610207,
0.4154284596,
-0.0384386554,
-0.0139007121,
0.238321811,
-0.1080385149,
0.013707146,
-0.1961780936,
0.3133020997,
-0.4277526438,
-0.0448633358,
0.2073246688,
0.1391384304,
-0.0488648713,
-0.2333491743,
0.2371944785,
0.329957664,
-0.2377562821,
-0.2284530252,
0.0542808361,
-0.2001757324,
0.1643239856,
0.0023744926,
1.0280885696,
0.1047323346,
0.2302794456,
0.2844593227,
-0.362120688,
0.2803661823,
-0.4588981867,
0.1027160585,
-0.5660175085,
-0.2690150142,
-0.0265094116,
-0.0703129098,
-0.1235000193,
-0.1072547138,
-0.1337423474,
0.4592169821,
0.1445992291,
0.1331367195,
-0.0539651997,
0.1167640015,
0.1683039963,
0.0048504174,
0.1752375066,
0.1566210538,
0.0639585331,
0.2994731665,
-0.1798163503,
-0.0644776896,
-0.5498410463,
-0.0796332359,
-0.0305480361,
-0.135121569,
0.0250813179,
0.2051969171,
-0.0030894754,
-0.5478056669,
-0.4278319478,
0.467351377,
0.3638888597,
0.1543482095,
-0.2965243161,
0.1685762852,
-0.4701935053,
0.3899484575,
0.0743404478,
0.1156045496,
0.0724764392,
-0.0994995087,
-0.0437537134,
-0.0537650697,
0.1490103006,
-0.250687778,
-0.6245260239,
-0.1295892745,
-0.5371043086,
-0.310464412,
0.5415868163,
-0.0805540979,
-0.1622421741,
-0.0381717309,
0.1411793232,
0.2169873565,
-0.0275865998,
0.324952662,
0.039310459,
-0.4234240055,
-0.1621248126,
0.3064824939,
0.0121894628,
-0.0430701822,
-0.0259228759,
0.0730395913,
0.0162067786,
-0.1115278751,
-0.1409135461,
0.1159519851,
-0.4342989922,
0.2139222622,
-0.0991929993,
-0.0253002346,
0.1100039035,
0.4212901592,
0.2335809618,
0.0027086753,
0.2833829522,
-0.1890314817,
-0.3145739436,
0.4179907739,
0.1682227701,
0.186131686,
0.0181394741,
-0.1133886576,
0.1064388007,
0.0218629111,
-0.3656167984,
0.0785605982,
-0.0261602029,
0.0308773518,
-0.1768050492,
-0.4855468869,
0.1199009046,
0.1508075297,
0.0167067386,
0.0663633794,
-0.2826023698,
-0.1009673029,
-0.338539511,
0.0959462002,
-0.1007137671,
-0.0454224609,
-0.0591186248,
0.2864865065,
-0.0228683613,
-0.2581014633,
0.247131139,
0.1842896044,
0.3011256158,
0.1415239573,
-0.148214817,
0.0849049538,
-0.1509341151,
0.1452335119,
0.315030098,
0.0186424702,
-0.0567420647,
0.0243447423,
-0.189332366,
-0.1604847014,
0.0903149247,
0.1853341013,
0.2324613929,
0.0039605834,
0.3097017109,
0.056821622,
0.0860439837,
-0.104204163,
0.6678876281,
0.1300584376,
-0.0921560824,
-0.0505930856,
-0.1103082895,
0.1822528392,
-0.1761768162,
0.043317154,
0.1966630071,
0.0390420519,
0.1300090402,
0.3549924493,
0.2215702385,
0.0620074868,
0.4383098483,
-0.0137923546,
0.186324209,
-0.1359263361,
-0.0669825152,
-0.2568195462,
-0.0755849183,
0.2242778987,
0.5384521484,
-0.0414159037,
0.2130903155,
0.2482745051,
0.1608667821,
0.055745624,
-0.0978939831,
0.1659348756,
0.1322990805,
-0.0367491804,
-0.3595758677,
0.2460220456,
0.1544193625,
0.0825497359,
-0.0643205345,
0.2218967974,
-0.1357973218,
0.1727040112,
-0.3701014817,
0.0143398568,
-0.1660202146,
-0.4361302853,
-0.1228215992,
-0.3909068704,
-0.2636037767,
0.0903255939,
-0.0920612961,
0.0655299127,
0.2612420022,
0.0224436428,
-0.018302843,
-0.0816771835,
-0.4220671356,
0.0550417379,
0.4008125365,
0.004868567,
0.2830739319,
0.1144407988,
0.0623964071,
0.2156861722,
0.3841762543,
0.1169891953,
-0.0475930013,
-0.1749842465,
-0.0584877133,
0.1209591255,
0.1552107334,
-0.1109540612,
0.2898358405,
0.1443904936,
-0.3088453412,
0.1412968636,
0.1111897528,
-0.1289557368,
0.2223315984,
-0.2095453441,
0.2623899877,
-0.2080375999,
0.1602799296,
-0.159402892,
0.1173659116,
-0.1060338169,
-0.1884704083,
-0.3168027401,
-0.1629389971,
-0.355791688,
0.1665894687,
0.1886274815,
-0.0896943361,
0.0911408663,
0.1815398335,
0.0131885856,
0.0751545057,
0.2211910933,
-0.0654823333,
-0.1103831977,
-0.2407247722,
0.1426550448,
0.0655627847,
0.3086327016,
-0.2443651855,
0.2922241092,
-0.0720493197,
0.1290784478,
0.3750628829,
0.1218327433,
0.1178660616,
0.2167735398,
-0.3038434088,
0.1798714995,
-0.3895220757,
-0.166802898,
0.0952638835,
0.0987609327,
0.0573728792,
0.0368029624,
0.1220122948,
-0.1249680892,
-0.1437063366,
0.1236509234,
0.2193996757,
0.3691052794,
0.5210086703,
0.2177211791,
-0.1560081989,
-0.131056875,
-0.1762014329,
-0.114132151,
-0.0443402641,
0.1646037102,
-0.3079449236,
0.0857177451,
0.1169649586,
-0.5630624294,
-0.1023759395,
0.6308069825,
0.0810671151,
-0.309879154,
-0.1672503054,
-0.0631316677,
0.0277876258,
-0.323536545,
-0.2412038445,
0.5562444925,
0.0396714881,
0.2578608096,
-0.1080275252,
-0.3692856133,
0.4570892453,
-0.337316066,
0.2436260879,
-0.1098593175,
0.3534841537,
0.4649354219,
-0.0062323865,
0.0651136786,
0.1634216905,
0.0482459478,
0.1417990774,
0.0699137673,
0.1563724577,
-0.0514437594,
-0.1225458682,
-0.1321689487,
0.6715167165,
0.1798332036,
-0.1915442646,
0.213578105,
-0.1527656615
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | I think the only way to avoid this warning would be to do a copy of the numpy array before providing it.
This would slow down the iteration over the dataset a bit, but it might be safer. We could disable the copy with a flag on the `set_format` command.
In most typical cases of training an NLP model, PyTorch shouldn't modify the input, so it's ok to have a non-writable array. But I can understand that the warning is a bit scary, so maybe we could choose the non-warning/slower side by default and have an option to speed things up.
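As a rough illustration of the trade-off (a toy array stands in for the read-only Arrow-backed column; this is a sketch, not the library's actual code path):
```
import numpy as np
import torch

x = np.arange(5)
x.flags.writeable = False  # mimic the read-only buffer handed out by the dataset

# In the traceback quoted in this issue, `torch.tensor(x)` on such an array is
# what emits the non-writeable UserWarning. NumPy itself already rejects writes:
#   x[0] = 1  ->  ValueError: assignment destination is read-only
# Copying first avoids the warning at the cost of one extra allocation per access.
t = torch.tensor(x.copy())
print(t)
```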
What do you think @lhoestq ? | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
import torch
from datasets import Dataset
from transformers import AutoTokenizer

dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 106 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
import torch
from datasets import Dataset
from transformers import AutoTokenizer

dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
I think the only way to avoid this warning would be to do a copy of the numpy array before providing it.
This would slow down the iteration over the dataset a bit, but it might be safer. We could disable the copy with a flag on the `set_format` command.
In most typical cases of training an NLP model, PyTorch shouldn't modify the input, so it's ok to have a non-writable array. But I can understand that the warning is a bit scary, so maybe we could choose the non-warning/slower side by default and have an option to speed things up.
What do you think @lhoestq ? | [
0.2382436991,
-0.3691009879,
0.0651262179,
0.1298750937,
0.4380832016,
0.0914001763,
0.6286587119,
0.2343869358,
0.297601819,
0.0604874976,
-0.1968633384,
0.4237813354,
-0.3091077507,
-0.2186416239,
-0.2565727532,
-0.2071908116,
-0.0199281424,
0.1609406918,
0.0264156014,
-0.0268478245,
-0.0556125604,
0.0349227414,
-0.173422873,
0.3379642367,
-0.225587368,
-0.409830004,
0.2081688046,
-0.0633970797,
-0.0799703747,
-0.3318473399,
0.1405587494,
-0.2741777301,
-0.0424051955,
0.1276778132,
-0.0001266726,
0.115719974,
0.0211010482,
-0.0969709903,
-0.0743484646,
-0.1103796959,
0.7791624069,
-0.3006571531,
0.2598287463,
-0.5041794777,
-0.0403004214,
-0.2053054571,
-0.0674015433,
-0.2887424231,
0.6341737509,
0.2032752633,
0.0752524808,
0.5057702065,
0.1049248055,
0.3881486058,
-0.0660524815,
0.3452899754,
-0.2550108433,
-0.0293221362,
0.0295191612,
0.1392834634,
-0.0000623427,
0.2642534673,
-0.591407299,
0.0171456039,
0.1526442766,
-0.0334381759,
-0.3494703472,
-0.1997034103,
-0.056856554,
0.4435257912,
0.3706564307,
-0.2254173011,
-0.0462095737,
-0.3330504298,
-0.1216261089,
-0.15876773,
0.1649688482,
0.0478616655,
-0.1130081713,
0.1626305133,
0.0142953508,
-0.0069781654,
-0.1369983852,
0.2631309032,
0.0543760359,
0.1673308313,
-0.1042309105,
0.3020677567,
-0.0703181326,
-0.0098414812,
0.052553013,
0.0074248388,
0.190158084,
0.1962173134,
-0.1164152175,
-0.3119843006,
-0.259937942,
-0.7474352717,
-0.0870940238,
-0.2492241859,
0.3341350555,
0.2540560067,
-0.1607196927,
0.0971487164,
0.146938771,
0.0135782659,
0.1208702549,
0.3030396402,
-0.0211056247,
-0.264131099,
0.4715954661,
0.0567320958,
0.1360284239,
-0.2395027131,
0.0944851488,
0.0855746642,
0.0345737711,
0.0435557663,
-0.4489812255,
0.1235724092,
-0.2449782193,
0.0512569025,
-0.0606116205,
0.0437430851,
-0.0110614114,
0.0084558968,
0.0809399411,
0.3331674039,
-0.2172212154,
0.2093103677,
0.0050660782,
0.2291955501,
-0.1825449467,
-0.1943187118,
-0.0641829595,
0.2339447439,
0.0189593509,
0.261413604,
0.0302924141,
0.0009380952,
0.1234879121,
-0.2359407246,
0.6530772448,
0.2942769527,
-0.1781086177,
0.2948441505,
0.3621657789,
-0.0307601988,
-0.3972629905,
0.2834928632,
-0.3104486465,
-0.2276653945,
-0.5769512057,
0.0020176312,
-0.1900073588,
0.1614835858,
0.1787660718,
-0.2559921443,
0.6716822386,
-0.1626642048,
0.4028994739,
-0.437556386,
-0.0927187949,
-0.3298019171,
-0.0530219451,
0.3662084937,
-0.3259871602,
-0.0640388876,
0.2587724626,
-0.0162681695,
0.4336456358,
0.4094215333,
-0.1056689024,
0.1454502344,
-0.1220319569,
0.2255038023,
0.3607750535,
-0.1224516407,
-0.0684337988,
0.1887910813,
-0.1807033718,
0.3171916008,
0.1701441705,
0.0854641497,
-0.1042877659,
0.1792413145,
0.2188511193,
0.0108663999,
0.3695319891,
-0.0156878829,
-0.3240776062,
-0.2027048916,
0.4832918048,
-0.135716185,
-0.0672294647,
-0.1775229275,
-0.1403666735,
0.0221993178,
0.3671857119,
-0.1119872481,
0.029092133,
-0.0411327407,
-0.1673816741,
0.239643842,
-0.1404467672,
0.1050847545,
-0.1710646003,
-0.0152081698,
0.0745917261,
0.1085797846,
-0.2709116638,
-0.1766556054,
-0.105172269,
0.0071569905,
-0.3517927229,
-0.0465407893,
-0.1111046374,
-0.1885503531,
-0.1136639789,
0.1317767352,
-0.1785909235,
-0.1946375519,
-0.5005661249,
0.1517337561,
-0.4120476246,
-0.1020532027,
-0.1012719423,
-0.2124432474,
-0.3982958198,
0.3267835379,
0.2324446142,
-0.0355697833,
-0.0395299904,
0.0819810331,
0.2321104109,
-0.1997633576,
-0.9175687432,
0.1638943851,
0.1763909906,
-0.0909303054,
0.247208178,
0.1546778977,
-0.1466421187,
-0.0960708112,
-0.0257538296,
0.6306298375,
-0.0646004081,
0.0608096346,
-0.5003480315,
0.2619295716,
-0.1911873221,
0.1908104569,
-0.104991138,
-0.4325118661,
-0.0143133253,
0.1304036528,
-0.2320646793,
-0.0843059868,
-0.4490355253,
-0.1313849539,
0.0598227531,
0.0488688499,
0.2781394124,
-0.0805687159,
0.1933730543,
-0.2550164759,
0.2152751684,
-0.542080164,
0.1550076306,
-0.1119837165,
-0.0120684784,
-0.0852295607,
-0.1558952183,
-0.2969709039,
-0.0197966024,
0.0393949226,
-0.016540613,
0.1850438118,
-0.0722505078,
0.1588742882,
-0.3456683755,
0.1977575421,
-0.1231663674,
0.0587117001,
-0.4983721375,
0.2802624404,
-0.1423192024,
0.0969335139,
-0.2249714434,
-0.2606526017,
0.0266309939,
-0.0663829297,
-0.0077089262,
0.0847670138,
-0.2427747399,
0.1668353081,
-0.0719062611,
0.0822577551,
0.1976012886,
-0.1022715643,
-0.0642505735,
0.0905624181,
-0.0684162527,
0.0219139904,
-0.0892238319,
-0.1524734497,
0.2516382933,
-0.0162401274,
-0.2449785173,
-0.100350745,
-0.2109434903,
0.1382278949,
-0.1868401617,
-0.1263593435,
0.3529061079,
0.1456684619,
-0.2808857262,
-0.2442341745,
-0.0112628527,
-0.0327508897,
0.0331093892,
0.1553822458,
0.0217528008,
0.1336048841,
-0.0648510531,
-0.1858778596,
-0.4236634672,
-0.0582789332,
0.1405141503,
-0.1072302759,
0.4788044989,
0.2815491557,
0.2555831075,
0.1540787518,
-0.0803939626,
0.2228563279,
-0.2713090479,
-0.0046551824,
0.3146921098,
-0.1752707064,
-0.1823731512,
-0.2065906227,
0.1696433276,
0.0142174046,
0.0761996806,
-0.3303347826,
-0.1466783583,
-0.0949349925,
0.1207671911,
-0.2022130191,
-0.1867316663,
0.241856575,
0.2284965515,
0.0300903469,
0.1885069311,
0.2500556707,
0.0332655087,
-0.2170520723,
-0.0861485377,
-0.0437795743,
0.1183455586,
0.2379997671,
0.4617760181,
-0.04200764,
-0.2846428454,
0.1601476371,
0.0010736026,
0.5781361461,
-0.1647350937,
-0.3022521138,
0.0216057971,
0.076786153,
0.0009506643,
-0.1333919913,
-0.3836981654,
-0.0001531877,
0.2423481345,
0.0825090408,
0.1085125506,
-0.2214960456,
0.4477954209,
-0.3171699345,
0.0087405182,
-0.1848477274,
0.7597175837,
-0.1085423976,
-0.2938228846,
-0.1163827181,
0.2062702775,
0.2668506503,
-0.0254102536,
-0.4820897579,
-0.3349626362,
-0.3202604651,
0.2719302177,
0.1470128447,
0.6929335594,
-0.1348297298,
-0.2319810092,
0.3621507883,
0.2398637682,
0.2768635154,
0.0184219684,
-0.2167359442,
0.2924697995,
0.079120338,
-0.3491104245,
-0.114071846,
-0.0095899254,
0.1074212492,
-0.0088581964,
0.4121201932,
-0.1308528185,
-0.2242531031,
0.1573219597,
0.1882412881,
-0.1615239978,
0.0613937303,
-0.2136749923,
-0.2623848915,
-0.3775331676,
0.1652600169,
0.2398949862,
0.2294293642,
-0.2508633137,
0.0488718301,
-0.0084932987,
0.245666936,
0.317879051,
-0.0956499428,
0.1522447765,
0.1483535916,
-0.3418127,
0.3096509874,
0.3398792148,
-0.0277442262,
-0.1095982566,
-0.1247121692,
0.1159104705,
0.1117185652,
-0.0217786059,
0.2172192186,
0.4647897482,
0.1280826926,
0.4410914779,
-0.1509371698,
-0.1095777303,
-0.1212500036,
0.1708003879,
0.404702723,
-0.0328603759,
-0.3123688102,
-0.4118335247,
0.259888202,
0.1042160988,
-0.0633762181,
0.3375406265,
-0.0369731858,
-0.1301323026,
0.6268112659,
0.7443352938,
1.2204270363,
-0.2085443139,
0.5956696868,
0.3273388743,
-0.0674319789,
0.6187636256,
-0.0102656037,
0.042403549,
-0.2683009505,
0.1548625231,
-0.1980776787,
-0.1577159613,
0.2015676349,
-0.0019608289,
-0.0411036313,
-0.0536004901,
-0.5109753609,
0.3334376514,
0.0068877749,
0.0026787296,
-0.101127699,
-0.2555353642,
0.2363881171,
-0.1040956751,
-0.079145968,
0.3209194541,
-0.0350440517,
0.0317473672,
0.1873893738,
-0.27684021,
-0.2541230321,
-0.0294099823,
0.1044800133,
0.5604251623,
0.0119874794,
-0.609200716,
0.2883669138,
0.4530895352,
0.380843848,
0.1035279259,
0.0959939808,
-0.1774269342,
0.2179631293,
-0.3489566445,
-0.048572395,
-0.0852726698,
0.4823371768,
0.1212895066,
-0.08624883,
0.452154994,
-0.0263081845,
-0.1532788128,
-0.3243341446,
0.097329177,
0.4660350978,
-0.656159699,
-0.0559298396,
-0.2054632306,
0.0001007468,
-0.062172398,
-0.0448218659,
0.21907565,
0.0646789074,
0.103734754,
-0.2519076765,
-0.4340578914,
-0.036706157,
0.2222429961,
0.2893690765,
-0.117922321,
0.879553318,
0.0664853454,
-0.3234810829,
-0.1874034107,
0.037170127,
0.2153206617,
-0.1092451066,
-0.0018604547,
0.1581524909,
-0.1175391674,
-0.3384717703,
0.3617874384,
0.0083908569,
0.2841087282,
-0.3869201839,
-0.1965320259,
-0.4224751592,
-0.0321235135,
0.2779538929,
0.3617931008,
0.0696603879,
0.1996623129,
-0.1247276366,
-0.1013158858,
-0.1560570002,
-0.0752203166,
0.020221211,
0.0376656204,
0.0879769921,
-0.0319657587,
-0.0273688193,
-0.1754417121,
-0.0822160318,
0.0666388795,
-0.1285915673,
-0.0146795921,
-0.2238382995,
0.257093668,
0.1262626499,
-0.4150514603,
-0.152778849,
-0.2237046957,
0.0586871617,
-0.1579182148,
0.1590574384,
0.5828672051,
0.1063295007,
0.3782193661,
-0.2470910251,
0.56589818,
-0.0748700351,
-0.0877131075,
-0.3030568361,
0.0571165048,
-0.039043311,
-0.0320369825,
0.0130809136,
-0.0603504628,
0.0060970448,
0.1890246868,
0.2629618645,
0.0186588839,
-0.0052154213,
-0.0120388456,
-0.1070672274,
-0.3753263354,
-0.0517678708,
0.189355582,
-0.2479213476,
-0.5765467882,
0.2642937005,
0.0123998243,
0.0169081911,
0.0137433447,
-0.4336454272,
0.0795568451,
-0.031048052,
0.170232892,
0.0805719495,
0.1417364776,
-0.0181905963,
0.09814547,
0.2180120051,
-0.3502316177,
0.12948443,
0.4828861952,
-0.2283920646,
-0.0452727117,
0.4522110522,
-0.0312917605,
0.2341942936,
-0.1334195435,
0.0915353298,
0.4512062967,
0.1692260206,
-0.0341872424,
-0.3178962767,
-0.3285260201,
0.1474927664,
0.2378780991,
0.0970565528,
0.3567236066,
0.2927554846,
0.4019019902,
-0.4451822639,
-0.0209467206,
0.0176270828,
0.2360850871,
-0.3745494187,
-0.0451739728,
-0.3808813691,
0.0096498877,
0.0484975874,
-0.1260537207,
-0.159731403,
0.0313260742,
0.0340272896,
-0.0899489671,
-0.2625426054,
0.1525456458,
0.2857145369,
0.0725883618,
0.1878762394,
-0.2299561948,
-0.1870911866,
0.0890277028,
0.1067847461,
-0.0448155813,
0.0374788716,
0.4447661936,
0.358741641,
-0.10350582,
-0.0842834264,
-0.1756999195,
-0.0139695816,
0.1993031055,
0.4816289544,
0.018682234,
0.0138763636,
0.287936151,
-0.0183102097,
-0.0949821919,
0.0105205849,
0.2952637374,
0.2227563858,
0.1748559922,
0.3795814514,
-0.0568386167,
-0.4856646061,
0.0033313073,
-0.0617631972,
-0.1641254723,
0.1717436314,
0.3564146459,
0.2143042982,
0.1797748655,
-0.1373206377,
0.0054708868,
0.3706023097,
0.443055898,
0.251837343,
-0.0295525715,
-0.3219552636,
0.3352751136,
-0.3268110156,
0.4080523252,
-0.0225137025,
-0.3146441579,
-0.0225451831,
-0.1043130159,
0.3645898104,
0.0440230034,
-0.3006753027,
0.1305756569,
-0.24810341,
0.0429528914,
-0.0048683807,
0.1862839609,
0.099533312,
-0.0594905838,
-0.0888574198,
-0.3397578001,
0.444216311,
0.1383631527,
-0.0315220393,
-0.4207602143,
0.1421373487,
-0.2364388108,
0.2782037854,
0.3304498494,
0.1277319789,
0.4267578125,
-0.0317622982,
-0.1237798929,
-0.0141851921,
0.0862122327,
-0.1058101058,
0.2842881083,
0.2063486129,
0.2731181979,
0.0385228321,
0.054244712,
-0.2951193154,
0.5469293594,
-0.1160323024,
-0.3950822353,
-0.0497786291,
0.2726547718,
-0.2424453646,
-0.1353651434,
0.0816416889,
0.2359433323,
0.3239525557,
0.2044597566,
-0.3263272047,
-0.3508822918,
0.5388799906,
-0.364967972,
-0.1342417747,
-0.1359106451,
0.086956203,
0.3645114303,
0.11802385,
-0.7096751928,
-0.009728387,
0.2363465428,
-0.1331491768,
-0.0409499221,
-0.1120016724,
-0.1369568706,
-0.1903095245,
-0.2534229457,
0.6608252525,
-0.0571324974,
-0.0422460362,
-0.1349737495,
-0.3494077325
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | @thomwolf Would it be possible to have the array look writeable, but raise an error if it is actually written to?
I would like to keep my code free of warning, but I also wouldn't like to slow down the program because of unnecessary copy operations. | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
import torch
from datasets import Dataset
from transformers import AutoTokenizer

dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 46 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
import torch
from datasets import Dataset
from transformers import AutoTokenizer

dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
@thomwolf Would it be possible to have the array look writeable, but raise an error if it is actually written to?
I would like to keep my code free of warning, but I also wouldn't like to slow down the program because of unnecessary copy operations. | [
0.2382436991,
-0.3691009879,
0.0651262179,
0.1298750937,
0.4380832016,
0.0914001763,
0.6286587119,
0.2343869358,
0.297601819,
0.0604874976,
-0.1968633384,
0.4237813354,
-0.3091077507,
-0.2186416239,
-0.2565727532,
-0.2071908116,
-0.0199281424,
0.1609406918,
0.0264156014,
-0.0268478245,
-0.0556125604,
0.0349227414,
-0.173422873,
0.3379642367,
-0.225587368,
-0.409830004,
0.2081688046,
-0.0633970797,
-0.0799703747,
-0.3318473399,
0.1405587494,
-0.2741777301,
-0.0424051955,
0.1276778132,
-0.0001266726,
0.115719974,
0.0211010482,
-0.0969709903,
-0.0743484646,
-0.1103796959,
0.7791624069,
-0.3006571531,
0.2598287463,
-0.5041794777,
-0.0403004214,
-0.2053054571,
-0.0674015433,
-0.2887424231,
0.6341737509,
0.2032752633,
0.0752524808,
0.5057702065,
0.1049248055,
0.3881486058,
-0.0660524815,
0.3452899754,
-0.2550108433,
-0.0293221362,
0.0295191612,
0.1392834634,
-0.0000623427,
0.2642534673,
-0.591407299,
0.0171456039,
0.1526442766,
-0.0334381759,
-0.3494703472,
-0.1997034103,
-0.056856554,
0.4435257912,
0.3706564307,
-0.2254173011,
-0.0462095737,
-0.3330504298,
-0.1216261089,
-0.15876773,
0.1649688482,
0.0478616655,
-0.1130081713,
0.1626305133,
0.0142953508,
-0.0069781654,
-0.1369983852,
0.2631309032,
0.0543760359,
0.1673308313,
-0.1042309105,
0.3020677567,
-0.0703181326,
-0.0098414812,
0.052553013,
0.0074248388,
0.190158084,
0.1962173134,
-0.1164152175,
-0.3119843006,
-0.259937942,
-0.7474352717,
-0.0870940238,
-0.2492241859,
0.3341350555,
0.2540560067,
-0.1607196927,
0.0971487164,
0.146938771,
0.0135782659,
0.1208702549,
0.3030396402,
-0.0211056247,
-0.264131099,
0.4715954661,
0.0567320958,
0.1360284239,
-0.2395027131,
0.0944851488,
0.0855746642,
0.0345737711,
0.0435557663,
-0.4489812255,
0.1235724092,
-0.2449782193,
0.0512569025,
-0.0606116205,
0.0437430851,
-0.0110614114,
0.0084558968,
0.0809399411,
0.3331674039,
-0.2172212154,
0.2093103677,
0.0050660782,
0.2291955501,
-0.1825449467,
-0.1943187118,
-0.0641829595,
0.2339447439,
0.0189593509,
0.261413604,
0.0302924141,
0.0009380952,
0.1234879121,
-0.2359407246,
0.6530772448,
0.2942769527,
-0.1781086177,
0.2948441505,
0.3621657789,
-0.0307601988,
-0.3972629905,
0.2834928632,
-0.3104486465,
-0.2276653945,
-0.5769512057,
0.0020176312,
-0.1900073588,
0.1614835858,
0.1787660718,
-0.2559921443,
0.6716822386,
-0.1626642048,
0.4028994739,
-0.437556386,
-0.0927187949,
-0.3298019171,
-0.0530219451,
0.3662084937,
-0.3259871602,
-0.0640388876,
0.2587724626,
-0.0162681695,
0.4336456358,
0.4094215333,
-0.1056689024,
0.1454502344,
-0.1220319569,
0.2255038023,
0.3607750535,
-0.1224516407,
-0.0684337988,
0.1887910813,
-0.1807033718,
0.3171916008,
0.1701441705,
0.0854641497,
-0.1042877659,
0.1792413145,
0.2188511193,
0.0108663999,
0.3695319891,
-0.0156878829,
-0.3240776062,
-0.2027048916,
0.4832918048,
-0.135716185,
-0.0672294647,
-0.1775229275,
-0.1403666735,
0.0221993178,
0.3671857119,
-0.1119872481,
0.029092133,
-0.0411327407,
-0.1673816741,
0.239643842,
-0.1404467672,
0.1050847545,
-0.1710646003,
-0.0152081698,
0.0745917261,
0.1085797846,
-0.2709116638,
-0.1766556054,
-0.105172269,
0.0071569905,
-0.3517927229,
-0.0465407893,
-0.1111046374,
-0.1885503531,
-0.1136639789,
0.1317767352,
-0.1785909235,
-0.1946375519,
-0.5005661249,
0.1517337561,
-0.4120476246,
-0.1020532027,
-0.1012719423,
-0.2124432474,
-0.3982958198,
0.3267835379,
0.2324446142,
-0.0355697833,
-0.0395299904,
0.0819810331,
0.2321104109,
-0.1997633576,
-0.9175687432,
0.1638943851,
0.1763909906,
-0.0909303054,
0.247208178,
0.1546778977,
-0.1466421187,
-0.0960708112,
-0.0257538296,
0.6306298375,
-0.0646004081,
0.0608096346,
-0.5003480315,
0.2619295716,
-0.1911873221,
0.1908104569,
-0.104991138,
-0.4325118661,
-0.0143133253,
0.1304036528,
-0.2320646793,
-0.0843059868,
-0.4490355253,
-0.1313849539,
0.0598227531,
0.0488688499,
0.2781394124,
-0.0805687159,
0.1933730543,
-0.2550164759,
0.2152751684,
-0.542080164,
0.1550076306,
-0.1119837165,
-0.0120684784,
-0.0852295607,
-0.1558952183,
-0.2969709039,
-0.0197966024,
0.0393949226,
-0.016540613,
0.1850438118,
-0.0722505078,
0.1588742882,
-0.3456683755,
0.1977575421,
-0.1231663674,
0.0587117001,
-0.4983721375,
0.2802624404,
-0.1423192024,
0.0969335139,
-0.2249714434,
… (remaining values of the embedding vector omitted) …
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | Well because I don't know the internal of numpy as well as you I guess hahahah, do you want to try to open a PR proposing a solution? | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 28 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
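For context, the warning can be reproduced outside of `datasets` with any read-only NumPy array; a minimal sketch (assuming a PyTorch build that warns on non-writeable inputs, as in the traceback above):
```python
import numpy as np
import torch

arr = np.arange(4, dtype=np.int64)
arr.setflags(write=False)   # simulate a read-only array, e.g. one backed by an immutable buffer
tensor = torch.tensor(arr)  # may emit the same "non-writeable tensors" UserWarning
print(tensor)
```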
Well because I don't know the internal of numpy as well as you I guess hahahah, do you want to try to open a PR proposing a solution? | [
… (768-dimensional embedding vector omitted) …
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | @thomwolf @AndreasMadsen I think this is a terrible idea, n/o, and I am very much against it. Modifying internals of an array in such a hacky way is bound to run into other (user) issues down the line. To users it would not be clear at all what is going on e.g. when they check for write access (which will return True) but then they get a warning that the array is not writeable. That's extremely confusing.
If your only goal is to get rid of warnings in your code, then you can just use a [simplefilter](https://docs.python.org/3.8/library/warnings.html#temporarily-suppressing-warnings) for UserWarnings in your own code. Changing the code-base in such an intuitive way does not seem like a good way to go and sets a bad precedent, imo.
(Feel free to disagree, of course.)
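As a concrete illustration of the filtering suggestion above, a minimal sketch that silences only this specific message rather than all `UserWarning`s (standard-library `warnings` only, not a `datasets` API):
```python
import warnings

# Suppress only the non-writeable-tensor warning, matched by message prefix,
# leaving every other UserWarning visible.
warnings.filterwarnings(
    "ignore",
    message="The given NumPy array is not writeable",
    category=UserWarning,
)
```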
IMO a warning can stay (as they can be filtered by users anyway), but it can be clarified why the warning takes place. | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 155 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
@thomwolf @AndreasMadsen I think this is a terrible idea, n/o, and I am very much against it. Modifying internals of an array in such a hacky way is bound to run into other (user) issues down the line. To users it would not be clear at all what is going on e.g. when they check for write access (which will return True) but then they get a warning that the array is not writeable. That's extremely confusing.
If your only goal is to get rid of warnings in your code, then you can just use a [simplefilter](https://docs.python.org/3.8/library/warnings.html#temporarily-suppressing-warnings) for UserWarnings in your own code. Changing the code-base in such an intuitive way does not seem like a good way to go and sets a bad precedent, imo.
(Feel free to disagree, of course.)
IMO a warning can stay (as they can be filtered by users anyway), but it can be clarified why the warning takes place. | [
… (768-dimensional embedding vector omitted) …
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | > To users it would not be clear at all what is going on e.g. when they check for write access (which will return True) but then they get a warning that the array is not writeable. That's extremely confusing.
Confusion can be resolved with a helpful error message. In this case, that error message can be controlled by huggingface/datasets. The right argument here is that if code depends on `.flags.writable` being truthful (not just for warnings), then it will cause unavoidable errors. Although, I can't imagine such a use-case.
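For reference, a small sketch of the flag being argued about, in plain NumPy (illustrative only, no `datasets` involved): code that really needs write access can still check the flag, and `np.array` gives a writeable copy when one is needed.
```python
import numpy as np

a = np.arange(3)
a.setflags(write=False)
print(a.flags.writeable)   # False — the flag reports the real state of the array

b = np.array(a)            # np.array copies by default, so the copy is writeable
b[0] = 42
print(b.flags.writeable)   # True
```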
> If your only goal is to get rid of warnings in your code, then you can just use a simplefilter for UserWarnings in your own code. Changing the code-base in such an intuitive way does not seem like a good way to go and sets a bad precedent, imo.
I don't want to ignore all `UserWarnings`, nor all warnings regarding non-writable arrays. Ignoring warnings leads to hard to debug issues.
> IMO a warning can stay (as they can be filtered by users anyway), but it can be clarified why the warning takes place.
Plain use cases should really not generate warnings. It teaches developers to ignore warnings which is a terrible practice.
---
The best solution would be to allow non-writable arrays in `DataLoader`, but that is a PyTorch issue. | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 222 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
> To users it would not be clear at all what is going on e.g. when they check for write access (which will return True) but then they get a warning that the array is not writeable. That's extremely confusing.
Confusion can be resolved with a helpful error message. In this case, that error message can be controlled by huggingface/datasets. The right argument here is that if code depends on `.flags.writable` being truthful (not just for warnings), then it will cause unavoidable errors. Although, I can't imagine such a use-case.
> If your only goal is to get rid of warnings in your code, then you can just use a simplefilter for UserWarnings in your own code. Changing the code-base in such an intuitive way does not seem like a good way to go and sets a bad precedent, imo.
I don't want to ignore all `UserWarnings`, nor all warnings regarding non-writable arrays. Ignoring warnings leads to hard to debug issues.
> IMO a warning can stay (as they can be filtered by users anyway), but it can be clarified why the warning takes place.
Plain use cases should really not generate warnings. It teaches developers to ignore warnings which is a terrible practice.
---
The best solution would be to allow non-writable arrays in `DataLoader`, but that is a PyTorch issue. | [
… (768-dimensional embedding vector omitted) …
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | > The right argument here is that if code depends on `.flags.writable` being truthful (not just for warnings), then it will cause unavoidable errors. Although, I can't imagine such a use-case.
That's exactly the argument in my first sentence. Too often someone "cannot think of a use-case", but you can not foresee the use-cases of a whole research community.
> I don't want to ignore all `UserWarnings`, nor all warnings regarding non-writable arrays. Ignoring warnings leads to hard to debug issues.
That's fair.
> Plain use cases should really not generate warnings. It teaches developers to ignore warnings which is a terrible practice.
But this is not a plain use-case (because Pytorch does not support these read-only tensors). Manually setting the flag to writable will solve the issue on the surface but is basically just a hack to compensate for something that is not allowed in another library.
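For illustration, what "manually setting the flag" amounts to in plain NumPy, and where it stops working (a sketch, not the `datasets` implementation):
```python
import numpy as np

owned = np.zeros(3)
owned.flags.writeable = False
owned.flags.writeable = True          # fine: the array owns its memory

view = np.frombuffer(b"\x00" * 24)    # array over a read-only bytes buffer
print(view.flags.writeable)           # False
# view.flags.writeable = True         # would raise ValueError: cannot set WRITEABLE flag
```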
What about an "ignore_warnings" flag in `set_format` that when True wraps the offending code in a block to ignore userwarnings at that specific step in [_convert_outputs](https://github.com/huggingface/datasets/blob/880c2c76a8223a00c303eab2909371e857113063/src/datasets/arrow_dataset.py#L821)? Something like:
```python
def _convert_outputs(..., ignore_warnings=True):
...
with warnings.catch_warnings():
if ignore_warnings:
warnings.simplefilter("ignore", UserWarning)
return torch.tensor(...)
# continues without warning filter after context manager...
``` | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 198 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
> The right argument here is that if code depends on `.flags.writable` being truthful (not just for warnings), then it will cause unavoidable errors. Although, I can't imagine such a use-case.
That's exactly the argument in my first sentence. Too often someone "cannot think of a use-case", but you can not foresee the use-cases of a whole research community.
> I don't want to ignore all `UserWarnings`, nor all warnings regarding non-writable arrays. Ignoring warnings leads to hard to debug issues.
That's fair.
> Plain use cases should really not generate warnings. It teaches developers to ignore warnings which is a terrible practice.
But this is not a plain use-case (because Pytorch does not support these read-only tensors). Manually setting the flag to writable will solve the issue on the surface but is basically just a hack to compensate for something that is not allowed in another library.
What about an "ignore_warnings" flag in `set_format` that, when True, wraps the offending code in a block to ignore UserWarnings at that specific step in [_convert_outputs](https://github.com/huggingface/datasets/blob/880c2c76a8223a00c303eab2909371e857113063/src/datasets/arrow_dataset.py#L821)? Something like:
```python
def _convert_outputs(..., ignore_warnings=True):
    ...
    with warnings.catch_warnings():
        if ignore_warnings:
            warnings.simplefilter("ignore", UserWarning)
        return torch.tensor(...)
    # continues without warning filter after context manager...
``` | [
0.2382436991,
-0.3691009879,
0.0651262179,
0.1298750937,
0.4380832016,
0.0914001763,
0.6286587119,
0.2343869358,
0.297601819,
0.0604874976,
-0.1968633384,
0.4237813354,
-0.3091077507,
-0.2186416239,
-0.2565727532,
-0.2071908116,
-0.0199281424,
0.1609406918,
0.0264156014,
-0.0268478245,
-0.0556125604,
0.0349227414,
-0.173422873,
0.3379642367,
-0.225587368,
-0.409830004,
0.2081688046,
-0.0633970797,
-0.0799703747,
-0.3318473399,
0.1405587494,
-0.2741777301,
-0.0424051955,
0.1276778132,
-0.0001266726,
0.115719974,
0.0211010482,
-0.0969709903,
-0.0743484646,
-0.1103796959,
0.7791624069,
-0.3006571531,
0.2598287463,
-0.5041794777,
-0.0403004214,
-0.2053054571,
-0.0674015433,
-0.2887424231,
0.6341737509,
0.2032752633,
0.0752524808,
0.5057702065,
0.1049248055,
0.3881486058,
-0.0660524815,
0.3452899754,
-0.2550108433,
-0.0293221362,
0.0295191612,
0.1392834634,
-0.0000623427,
0.2642534673,
-0.591407299,
0.0171456039,
0.1526442766,
-0.0334381759,
-0.3494703472,
-0.1997034103,
-0.056856554,
0.4435257912,
0.3706564307,
-0.2254173011,
-0.0462095737,
-0.3330504298,
-0.1216261089,
-0.15876773,
0.1649688482,
0.0478616655,
-0.1130081713,
0.1626305133,
0.0142953508,
-0.0069781654,
-0.1369983852,
0.2631309032,
0.0543760359,
0.1673308313,
-0.1042309105,
0.3020677567,
-0.0703181326,
-0.0098414812,
0.052553013,
0.0074248388,
0.190158084,
0.1962173134,
-0.1164152175,
-0.3119843006,
-0.259937942,
-0.7474352717,
-0.0870940238,
-0.2492241859,
0.3341350555,
0.2540560067,
-0.1607196927,
0.0971487164,
0.146938771,
0.0135782659,
0.1208702549,
0.3030396402,
-0.0211056247,
-0.264131099,
0.4715954661,
0.0567320958,
0.1360284239,
-0.2395027131,
0.0944851488,
0.0855746642,
0.0345737711,
0.0435557663,
-0.4489812255,
0.1235724092,
-0.2449782193,
0.0512569025,
-0.0606116205,
0.0437430851,
-0.0110614114,
0.0084558968,
0.0809399411,
0.3331674039,
-0.2172212154,
0.2093103677,
0.0050660782,
0.2291955501,
-0.1825449467,
-0.1943187118,
-0.0641829595,
0.2339447439,
0.0189593509,
0.261413604,
0.0302924141,
0.0009380952,
0.1234879121,
-0.2359407246,
0.6530772448,
0.2942769527,
-0.1781086177,
0.2948441505,
0.3621657789,
-0.0307601988,
-0.3972629905,
0.2834928632,
-0.3104486465,
-0.2276653945,
-0.5769512057,
0.0020176312,
-0.1900073588,
0.1614835858,
0.1787660718,
-0.2559921443,
0.6716822386,
-0.1626642048,
0.4028994739,
-0.437556386,
-0.0927187949,
-0.3298019171,
-0.0530219451,
0.3662084937,
-0.3259871602,
-0.0640388876,
0.2587724626,
-0.0162681695,
0.4336456358,
0.4094215333,
-0.1056689024,
0.1454502344,
-0.1220319569,
0.2255038023,
0.3607750535,
-0.1224516407,
-0.0684337988,
0.1887910813,
-0.1807033718,
0.3171916008,
0.1701441705,
0.0854641497,
-0.1042877659,
0.1792413145,
0.2188511193,
0.0108663999,
0.3695319891,
-0.0156878829,
-0.3240776062,
-0.2027048916,
0.4832918048,
-0.135716185,
-0.0672294647,
-0.1775229275,
-0.1403666735,
0.0221993178,
0.3671857119,
-0.1119872481,
0.029092133,
-0.0411327407,
-0.1673816741,
0.239643842,
-0.1404467672,
0.1050847545,
-0.1710646003,
-0.0152081698,
0.0745917261,
0.1085797846,
-0.2709116638,
-0.1766556054,
-0.105172269,
0.0071569905,
-0.3517927229,
-0.0465407893,
-0.1111046374,
-0.1885503531,
-0.1136639789,
0.1317767352,
-0.1785909235,
-0.1946375519,
-0.5005661249,
0.1517337561,
-0.4120476246,
-0.1020532027,
-0.1012719423,
-0.2124432474,
-0.3982958198,
0.3267835379,
0.2324446142,
-0.0355697833,
-0.0395299904,
0.0819810331,
0.2321104109,
-0.1997633576,
-0.9175687432,
0.1638943851,
0.1763909906,
-0.0909303054,
0.247208178,
0.1546778977,
-0.1466421187,
-0.0960708112,
-0.0257538296,
0.6306298375,
-0.0646004081,
0.0608096346,
-0.5003480315,
0.2619295716,
-0.1911873221,
0.1908104569,
-0.104991138,
-0.4325118661,
-0.0143133253,
0.1304036528,
-0.2320646793,
-0.0843059868,
-0.4490355253,
-0.1313849539,
0.0598227531,
0.0488688499,
0.2781394124,
-0.0805687159,
0.1933730543,
-0.2550164759,
0.2152751684,
-0.542080164,
0.1550076306,
-0.1119837165,
-0.0120684784,
-0.0852295607,
-0.1558952183,
-0.2969709039,
-0.0197966024,
0.0393949226,
-0.016540613,
0.1850438118,
-0.0722505078,
0.1588742882,
-0.3456683755,
0.1977575421,
-0.1231663674,
0.0587117001,
-0.4983721375,
0.2802624404,
-0.1423192024,
0.0969335139,
-0.2249714434,
-0.2606526017,
0.0266309939,
-0.0663829297,
-0.0077089262,
0.0847670138,
-0.2427747399,
0.1668353081,
-0.0719062611,
0.0822577551,
0.1976012886,
-0.1022715643,
-0.0642505735,
0.0905624181,
-0.0684162527,
0.0219139904,
-0.0892238319,
-0.1524734497,
0.2516382933,
-0.0162401274,
-0.2449785173,
-0.100350745,
-0.2109434903,
0.1382278949,
-0.1868401617,
-0.1263593435,
0.3529061079,
0.1456684619,
-0.2808857262,
-0.2442341745,
-0.0112628527,
-0.0327508897,
0.0331093892,
0.1553822458,
0.0217528008,
0.1336048841,
-0.0648510531,
-0.1858778596,
-0.4236634672,
-0.0582789332,
0.1405141503,
-0.1072302759,
0.4788044989,
0.2815491557,
0.2555831075,
0.1540787518,
-0.0803939626,
0.2228563279,
-0.2713090479,
-0.0046551824,
0.3146921098,
-0.1752707064,
-0.1823731512,
-0.2065906227,
0.1696433276,
0.0142174046,
0.0761996806,
-0.3303347826,
-0.1466783583,
-0.0949349925,
0.1207671911,
-0.2022130191,
-0.1867316663,
0.241856575,
0.2284965515,
0.0300903469,
0.1885069311,
0.2500556707,
0.0332655087,
-0.2170520723,
-0.0861485377,
-0.0437795743,
0.1183455586,
0.2379997671,
0.4617760181,
-0.04200764,
-0.2846428454,
0.1601476371,
0.0010736026,
0.5781361461,
-0.1647350937,
-0.3022521138,
0.0216057971,
0.076786153,
0.0009506643,
-0.1333919913,
-0.3836981654,
-0.0001531877,
0.2423481345,
0.0825090408,
0.1085125506,
-0.2214960456,
0.4477954209,
-0.3171699345,
0.0087405182,
-0.1848477274,
0.7597175837,
-0.1085423976,
-0.2938228846,
-0.1163827181,
0.2062702775,
0.2668506503,
-0.0254102536,
-0.4820897579,
-0.3349626362,
-0.3202604651,
0.2719302177,
0.1470128447,
0.6929335594,
-0.1348297298,
-0.2319810092,
0.3621507883,
0.2398637682,
0.2768635154,
0.0184219684,
-0.2167359442,
0.2924697995,
0.079120338,
-0.3491104245,
-0.114071846,
-0.0095899254,
0.1074212492,
-0.0088581964,
0.4121201932,
-0.1308528185,
-0.2242531031,
0.1573219597,
0.1882412881,
-0.1615239978,
0.0613937303,
-0.2136749923,
-0.2623848915,
-0.3775331676,
0.1652600169,
0.2398949862,
0.2294293642,
-0.2508633137,
0.0488718301,
-0.0084932987,
0.245666936,
0.317879051,
-0.0956499428,
0.1522447765,
0.1483535916,
-0.3418127,
0.3096509874,
0.3398792148,
-0.0277442262,
-0.1095982566,
-0.1247121692,
0.1159104705,
0.1117185652,
-0.0217786059,
0.2172192186,
0.4647897482,
0.1280826926,
0.4410914779,
-0.1509371698,
-0.1095777303,
-0.1212500036,
0.1708003879,
0.404702723,
-0.0328603759,
-0.3123688102,
-0.4118335247,
0.259888202,
0.1042160988,
-0.0633762181,
0.3375406265,
-0.0369731858,
-0.1301323026,
0.6268112659,
0.7443352938,
1.2204270363,
-0.2085443139,
0.5956696868,
0.3273388743,
-0.0674319789,
0.6187636256,
-0.0102656037,
0.042403549,
-0.2683009505,
0.1548625231,
-0.1980776787,
-0.1577159613,
0.2015676349,
-0.0019608289,
-0.0411036313,
-0.0536004901,
-0.5109753609,
0.3334376514,
0.0068877749,
0.0026787296,
-0.101127699,
-0.2555353642,
0.2363881171,
-0.1040956751,
-0.079145968,
0.3209194541,
-0.0350440517,
0.0317473672,
0.1873893738,
-0.27684021,
-0.2541230321,
-0.0294099823,
0.1044800133,
0.5604251623,
0.0119874794,
-0.609200716,
0.2883669138,
0.4530895352,
0.380843848,
0.1035279259,
0.0959939808,
-0.1774269342,
0.2179631293,
-0.3489566445,
-0.048572395,
-0.0852726698,
0.4823371768,
0.1212895066,
-0.08624883,
0.452154994,
-0.0263081845,
-0.1532788128,
-0.3243341446,
0.097329177,
0.4660350978,
-0.656159699,
-0.0559298396,
-0.2054632306,
0.0001007468,
-0.062172398,
-0.0448218659,
0.21907565,
0.0646789074,
0.103734754,
-0.2519076765,
-0.4340578914,
-0.036706157,
0.2222429961,
0.2893690765,
-0.117922321,
0.879553318,
0.0664853454,
-0.3234810829,
-0.1874034107,
0.037170127,
0.2153206617,
-0.1092451066,
-0.0018604547,
0.1581524909,
-0.1175391674,
-0.3384717703,
0.3617874384,
0.0083908569,
0.2841087282,
-0.3869201839,
-0.1965320259,
-0.4224751592,
-0.0321235135,
0.2779538929,
0.3617931008,
0.0696603879,
0.1996623129,
-0.1247276366,
-0.1013158858,
-0.1560570002,
-0.0752203166,
0.020221211,
0.0376656204,
0.0879769921,
-0.0319657587,
-0.0273688193,
-0.1754417121,
-0.0822160318,
0.0666388795,
-0.1285915673,
-0.0146795921,
-0.2238382995,
0.257093668,
0.1262626499,
-0.4150514603,
-0.152778849,
-0.2237046957,
0.0586871617,
-0.1579182148,
0.1590574384,
0.5828672051,
0.1063295007,
0.3782193661,
-0.2470910251,
0.56589818,
-0.0748700351,
-0.0877131075,
-0.3030568361,
0.0571165048,
-0.039043311,
-0.0320369825,
0.0130809136,
-0.0603504628,
0.0060970448,
0.1890246868,
0.2629618645,
0.0186588839,
-0.0052154213,
-0.0120388456,
-0.1070672274,
-0.3753263354,
-0.0517678708,
0.189355582,
-0.2479213476,
-0.5765467882,
0.2642937005,
0.0123998243,
0.0169081911,
0.0137433447,
-0.4336454272,
0.0795568451,
-0.031048052,
0.170232892,
0.0805719495,
0.1417364776,
-0.0181905963,
0.09814547,
0.2180120051,
-0.3502316177,
0.12948443,
0.4828861952,
-0.2283920646,
-0.0452727117,
0.4522110522,
-0.0312917605,
0.2341942936,
-0.1334195435,
0.0915353298,
0.4512062967,
0.1692260206,
-0.0341872424,
-0.3178962767,
-0.3285260201,
0.1474927664,
0.2378780991,
0.0970565528,
0.3567236066,
0.2927554846,
0.4019019902,
-0.4451822639,
-0.0209467206,
0.0176270828,
0.2360850871,
-0.3745494187,
-0.0451739728,
-0.3808813691,
0.0096498877,
0.0484975874,
-0.1260537207,
-0.159731403,
0.0313260742,
0.0340272896,
-0.0899489671,
-0.2625426054,
0.1525456458,
0.2857145369,
0.0725883618,
0.1878762394,
-0.2299561948,
-0.1870911866,
0.0890277028,
0.1067847461,
-0.0448155813,
0.0374788716,
0.4447661936,
0.358741641,
-0.10350582,
-0.0842834264,
-0.1756999195,
-0.0139695816,
0.1993031055,
0.4816289544,
0.018682234,
0.0138763636,
0.287936151,
-0.0183102097,
-0.0949821919,
0.0105205849,
0.2952637374,
0.2227563858,
0.1748559922,
0.3795814514,
-0.0568386167,
-0.4856646061,
0.0033313073,
-0.0617631972,
-0.1641254723,
0.1717436314,
0.3564146459,
0.2143042982,
0.1797748655,
-0.1373206377,
0.0054708868,
0.3706023097,
0.443055898,
0.251837343,
-0.0295525715,
-0.3219552636,
0.3352751136,
-0.3268110156,
0.4080523252,
-0.0225137025,
-0.3146441579,
-0.0225451831,
-0.1043130159,
0.3645898104,
0.0440230034,
-0.3006753027,
0.1305756569,
-0.24810341,
0.0429528914,
-0.0048683807,
0.1862839609,
0.099533312,
-0.0594905838,
-0.0888574198,
-0.3397578001,
0.444216311,
0.1383631527,
-0.0315220393,
-0.4207602143,
0.1421373487,
-0.2364388108,
0.2782037854,
0.3304498494,
0.1277319789,
0.4267578125,
-0.0317622982,
-0.1237798929,
-0.0141851921,
0.0862122327,
-0.1058101058,
0.2842881083,
0.2063486129,
0.2731181979,
0.0385228321,
0.054244712,
-0.2951193154,
0.5469293594,
-0.1160323024,
-0.3950822353,
-0.0497786291,
0.2726547718,
-0.2424453646,
-0.1353651434,
0.0816416889,
0.2359433323,
0.3239525557,
0.2044597566,
-0.3263272047,
-0.3508822918,
0.5388799906,
-0.364967972,
-0.1342417747,
-0.1359106451,
0.086956203,
0.3645114303,
0.11802385,
-0.7096751928,
-0.009728387,
0.2363465428,
-0.1331491768,
-0.0409499221,
-0.1120016724,
-0.1369568706,
-0.1903095245,
-0.2534229457,
0.6608252525,
-0.0571324974,
-0.0422460362,
-0.1349737495,
-0.3494077325
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | > But this is not a plain use-case (because Pytorch does not support these read-only tensors).
By "plain", I mean the recommended way to use `datasets` with PyTorch according to the `datasets` documentation. | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 33 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
> But this is not a plain use-case (because Pytorch does not support these read-only tensors).
By "plain", I mean the recommended way to use `datasets` with PyTorch according to the `datasets` documentation. | [
0.2382436991,
-0.3691009879,
0.0651262179,
0.1298750937,
0.4380832016,
0.0914001763,
0.6286587119,
0.2343869358,
0.297601819,
0.0604874976,
-0.1968633384,
0.4237813354,
-0.3091077507,
-0.2186416239,
-0.2565727532,
-0.2071908116,
-0.0199281424,
0.1609406918,
0.0264156014,
-0.0268478245,
-0.0556125604,
0.0349227414,
-0.173422873,
0.3379642367,
-0.225587368,
-0.409830004,
0.2081688046,
-0.0633970797,
-0.0799703747,
-0.3318473399,
0.1405587494,
-0.2741777301,
-0.0424051955,
0.1276778132,
-0.0001266726,
0.115719974,
0.0211010482,
-0.0969709903,
-0.0743484646,
-0.1103796959,
0.7791624069,
-0.3006571531,
0.2598287463,
-0.5041794777,
-0.0403004214,
-0.2053054571,
-0.0674015433,
-0.2887424231,
0.6341737509,
0.2032752633,
0.0752524808,
0.5057702065,
0.1049248055,
0.3881486058,
-0.0660524815,
0.3452899754,
-0.2550108433,
-0.0293221362,
0.0295191612,
0.1392834634,
-0.0000623427,
0.2642534673,
-0.591407299,
0.0171456039,
0.1526442766,
-0.0334381759,
-0.3494703472,
-0.1997034103,
-0.056856554,
0.4435257912,
0.3706564307,
-0.2254173011,
-0.0462095737,
-0.3330504298,
-0.1216261089,
-0.15876773,
0.1649688482,
0.0478616655,
-0.1130081713,
0.1626305133,
0.0142953508,
-0.0069781654,
-0.1369983852,
0.2631309032,
0.0543760359,
0.1673308313,
-0.1042309105,
0.3020677567,
-0.0703181326,
-0.0098414812,
0.052553013,
0.0074248388,
0.190158084,
0.1962173134,
-0.1164152175,
-0.3119843006,
-0.259937942,
-0.7474352717,
-0.0870940238,
-0.2492241859,
0.3341350555,
0.2540560067,
-0.1607196927,
0.0971487164,
0.146938771,
0.0135782659,
0.1208702549,
0.3030396402,
-0.0211056247,
-0.264131099,
0.4715954661,
0.0567320958,
0.1360284239,
-0.2395027131,
0.0944851488,
0.0855746642,
0.0345737711,
0.0435557663,
-0.4489812255,
0.1235724092,
-0.2449782193,
0.0512569025,
-0.0606116205,
0.0437430851,
-0.0110614114,
0.0084558968,
0.0809399411,
0.3331674039,
-0.2172212154,
0.2093103677,
0.0050660782,
0.2291955501,
-0.1825449467,
-0.1943187118,
-0.0641829595,
0.2339447439,
0.0189593509,
0.261413604,
0.0302924141,
0.0009380952,
0.1234879121,
-0.2359407246,
0.6530772448,
0.2942769527,
-0.1781086177,
0.2948441505,
0.3621657789,
-0.0307601988,
-0.3972629905,
0.2834928632,
-0.3104486465,
-0.2276653945,
-0.5769512057,
0.0020176312,
-0.1900073588,
0.1614835858,
0.1787660718,
-0.2559921443,
0.6716822386,
-0.1626642048,
0.4028994739,
-0.437556386,
-0.0927187949,
-0.3298019171,
-0.0530219451,
0.3662084937,
-0.3259871602,
-0.0640388876,
0.2587724626,
-0.0162681695,
0.4336456358,
0.4094215333,
-0.1056689024,
0.1454502344,
-0.1220319569,
0.2255038023,
0.3607750535,
-0.1224516407,
-0.0684337988,
0.1887910813,
-0.1807033718,
0.3171916008,
0.1701441705,
0.0854641497,
-0.1042877659,
0.1792413145,
0.2188511193,
0.0108663999,
0.3695319891,
-0.0156878829,
-0.3240776062,
-0.2027048916,
0.4832918048,
-0.135716185,
-0.0672294647,
-0.1775229275,
-0.1403666735,
0.0221993178,
0.3671857119,
-0.1119872481,
0.029092133,
-0.0411327407,
-0.1673816741,
0.239643842,
-0.1404467672,
0.1050847545,
-0.1710646003,
-0.0152081698,
0.0745917261,
0.1085797846,
-0.2709116638,
-0.1766556054,
-0.105172269,
0.0071569905,
-0.3517927229,
-0.0465407893,
-0.1111046374,
-0.1885503531,
-0.1136639789,
0.1317767352,
-0.1785909235,
-0.1946375519,
-0.5005661249,
0.1517337561,
-0.4120476246,
-0.1020532027,
-0.1012719423,
-0.2124432474,
-0.3982958198,
0.3267835379,
0.2324446142,
-0.0355697833,
-0.0395299904,
0.0819810331,
0.2321104109,
-0.1997633576,
-0.9175687432,
0.1638943851,
0.1763909906,
-0.0909303054,
0.247208178,
0.1546778977,
-0.1466421187,
-0.0960708112,
-0.0257538296,
0.6306298375,
-0.0646004081,
0.0608096346,
-0.5003480315,
0.2619295716,
-0.1911873221,
0.1908104569,
-0.104991138,
-0.4325118661,
-0.0143133253,
0.1304036528,
-0.2320646793,
-0.0843059868,
-0.4490355253,
-0.1313849539,
0.0598227531,
0.0488688499,
0.2781394124,
-0.0805687159,
0.1933730543,
-0.2550164759,
0.2152751684,
-0.542080164,
0.1550076306,
-0.1119837165,
-0.0120684784,
-0.0852295607,
-0.1558952183,
-0.2969709039,
-0.0197966024,
0.0393949226,
-0.016540613,
0.1850438118,
-0.0722505078,
0.1588742882,
-0.3456683755,
0.1977575421,
-0.1231663674,
0.0587117001,
-0.4983721375,
0.2802624404,
-0.1423192024,
0.0969335139,
-0.2249714434,
-0.2606526017,
0.0266309939,
-0.0663829297,
-0.0077089262,
0.0847670138,
-0.2427747399,
0.1668353081,
-0.0719062611,
0.0822577551,
0.1976012886,
-0.1022715643,
-0.0642505735,
0.0905624181,
-0.0684162527,
0.0219139904,
-0.0892238319,
-0.1524734497,
0.2516382933,
-0.0162401274,
-0.2449785173,
-0.100350745,
-0.2109434903,
0.1382278949,
-0.1868401617,
-0.1263593435,
0.3529061079,
0.1456684619,
-0.2808857262,
-0.2442341745,
-0.0112628527,
-0.0327508897,
0.0331093892,
0.1553822458,
0.0217528008,
0.1336048841,
-0.0648510531,
-0.1858778596,
-0.4236634672,
-0.0582789332,
0.1405141503,
-0.1072302759,
0.4788044989,
0.2815491557,
0.2555831075,
0.1540787518,
-0.0803939626,
0.2228563279,
-0.2713090479,
-0.0046551824,
0.3146921098,
-0.1752707064,
-0.1823731512,
-0.2065906227,
0.1696433276,
0.0142174046,
0.0761996806,
-0.3303347826,
-0.1466783583,
-0.0949349925,
0.1207671911,
-0.2022130191,
-0.1867316663,
0.241856575,
0.2284965515,
0.0300903469,
0.1885069311,
0.2500556707,
0.0332655087,
-0.2170520723,
-0.0861485377,
-0.0437795743,
0.1183455586,
0.2379997671,
0.4617760181,
-0.04200764,
-0.2846428454,
0.1601476371,
0.0010736026,
0.5781361461,
-0.1647350937,
-0.3022521138,
0.0216057971,
0.076786153,
0.0009506643,
-0.1333919913,
-0.3836981654,
-0.0001531877,
0.2423481345,
0.0825090408,
0.1085125506,
-0.2214960456,
0.4477954209,
-0.3171699345,
0.0087405182,
-0.1848477274,
0.7597175837,
-0.1085423976,
-0.2938228846,
-0.1163827181,
0.2062702775,
0.2668506503,
-0.0254102536,
-0.4820897579,
-0.3349626362,
-0.3202604651,
0.2719302177,
0.1470128447,
0.6929335594,
-0.1348297298,
-0.2319810092,
0.3621507883,
0.2398637682,
0.2768635154,
0.0184219684,
-0.2167359442,
0.2924697995,
0.079120338,
-0.3491104245,
-0.114071846,
-0.0095899254,
0.1074212492,
-0.0088581964,
0.4121201932,
-0.1308528185,
-0.2242531031,
0.1573219597,
0.1882412881,
-0.1615239978,
0.0613937303,
-0.2136749923,
-0.2623848915,
-0.3775331676,
0.1652600169,
0.2398949862,
0.2294293642,
-0.2508633137,
0.0488718301,
-0.0084932987,
0.245666936,
0.317879051,
-0.0956499428,
0.1522447765,
0.1483535916,
-0.3418127,
0.3096509874,
0.3398792148,
-0.0277442262,
-0.1095982566,
-0.1247121692,
0.1159104705,
0.1117185652,
-0.0217786059,
0.2172192186,
0.4647897482,
0.1280826926,
0.4410914779,
-0.1509371698,
-0.1095777303,
-0.1212500036,
0.1708003879,
0.404702723,
-0.0328603759,
-0.3123688102,
-0.4118335247,
0.259888202,
0.1042160988,
-0.0633762181,
0.3375406265,
-0.0369731858,
-0.1301323026,
0.6268112659,
0.7443352938,
1.2204270363,
-0.2085443139,
0.5956696868,
0.3273388743,
-0.0674319789,
0.6187636256,
-0.0102656037,
0.042403549,
-0.2683009505,
0.1548625231,
-0.1980776787,
-0.1577159613,
0.2015676349,
-0.0019608289,
-0.0411036313,
-0.0536004901,
-0.5109753609,
0.3334376514,
0.0068877749,
0.0026787296,
-0.101127699,
-0.2555353642,
0.2363881171,
-0.1040956751,
-0.079145968,
0.3209194541,
-0.0350440517,
0.0317473672,
0.1873893738,
-0.27684021,
-0.2541230321,
-0.0294099823,
0.1044800133,
0.5604251623,
0.0119874794,
-0.609200716,
0.2883669138,
0.4530895352,
0.380843848,
0.1035279259,
0.0959939808,
-0.1774269342,
0.2179631293,
-0.3489566445,
-0.048572395,
-0.0852726698,
0.4823371768,
0.1212895066,
-0.08624883,
0.452154994,
-0.0263081845,
-0.1532788128,
-0.3243341446,
0.097329177,
0.4660350978,
-0.656159699,
-0.0559298396,
-0.2054632306,
0.0001007468,
-0.062172398,
-0.0448218659,
0.21907565,
0.0646789074,
0.103734754,
-0.2519076765,
-0.4340578914,
-0.036706157,
0.2222429961,
0.2893690765,
-0.117922321,
0.879553318,
0.0664853454,
-0.3234810829,
-0.1874034107,
0.037170127,
0.2153206617,
-0.1092451066,
-0.0018604547,
0.1581524909,
-0.1175391674,
-0.3384717703,
0.3617874384,
0.0083908569,
0.2841087282,
-0.3869201839,
-0.1965320259,
-0.4224751592,
-0.0321235135,
0.2779538929,
0.3617931008,
0.0696603879,
0.1996623129,
-0.1247276366,
-0.1013158858,
-0.1560570002,
-0.0752203166,
0.020221211,
0.0376656204,
0.0879769921,
-0.0319657587,
-0.0273688193,
-0.1754417121,
-0.0822160318,
0.0666388795,
-0.1285915673,
-0.0146795921,
-0.2238382995,
0.257093668,
0.1262626499,
-0.4150514603,
-0.152778849,
-0.2237046957,
0.0586871617,
-0.1579182148,
0.1590574384,
0.5828672051,
0.1063295007,
0.3782193661,
-0.2470910251,
0.56589818,
-0.0748700351,
-0.0877131075,
-0.3030568361,
0.0571165048,
-0.039043311,
-0.0320369825,
0.0130809136,
-0.0603504628,
0.0060970448,
0.1890246868,
0.2629618645,
0.0186588839,
-0.0052154213,
-0.0120388456,
-0.1070672274,
-0.3753263354,
-0.0517678708,
0.189355582,
-0.2479213476,
-0.5765467882,
0.2642937005,
0.0123998243,
0.0169081911,
0.0137433447,
-0.4336454272,
0.0795568451,
-0.031048052,
0.170232892,
0.0805719495,
0.1417364776,
-0.0181905963,
0.09814547,
0.2180120051,
-0.3502316177,
0.12948443,
0.4828861952,
-0.2283920646,
-0.0452727117,
0.4522110522,
-0.0312917605,
0.2341942936,
-0.1334195435,
0.0915353298,
0.4512062967,
0.1692260206,
-0.0341872424,
-0.3178962767,
-0.3285260201,
0.1474927664,
0.2378780991,
0.0970565528,
0.3567236066,
0.2927554846,
0.4019019902,
-0.4451822639,
-0.0209467206,
0.0176270828,
0.2360850871,
-0.3745494187,
-0.0451739728,
-0.3808813691,
0.0096498877,
0.0484975874,
-0.1260537207,
-0.159731403,
0.0313260742,
0.0340272896,
-0.0899489671,
-0.2625426054,
0.1525456458,
0.2857145369,
0.0725883618,
0.1878762394,
-0.2299561948,
-0.1870911866,
0.0890277028,
0.1067847461,
-0.0448155813,
0.0374788716,
0.4447661936,
0.358741641,
-0.10350582,
-0.0842834264,
-0.1756999195,
-0.0139695816,
0.1993031055,
0.4816289544,
0.018682234,
0.0138763636,
0.287936151,
-0.0183102097,
-0.0949821919,
0.0105205849,
0.2952637374,
0.2227563858,
0.1748559922,
0.3795814514,
-0.0568386167,
-0.4856646061,
0.0033313073,
-0.0617631972,
-0.1641254723,
0.1717436314,
0.3564146459,
0.2143042982,
0.1797748655,
-0.1373206377,
0.0054708868,
0.3706023097,
0.443055898,
0.251837343,
-0.0295525715,
-0.3219552636,
0.3352751136,
-0.3268110156,
0.4080523252,
-0.0225137025,
-0.3146441579,
-0.0225451831,
-0.1043130159,
0.3645898104,
0.0440230034,
-0.3006753027,
0.1305756569,
-0.24810341,
0.0429528914,
-0.0048683807,
0.1862839609,
0.099533312,
-0.0594905838,
-0.0888574198,
-0.3397578001,
0.444216311,
0.1383631527,
-0.0315220393,
-0.4207602143,
0.1421373487,
-0.2364388108,
0.2782037854,
0.3304498494,
0.1277319789,
0.4267578125,
-0.0317622982,
-0.1237798929,
-0.0141851921,
0.0862122327,
-0.1058101058,
0.2842881083,
0.2063486129,
0.2731181979,
0.0385228321,
0.054244712,
-0.2951193154,
0.5469293594,
-0.1160323024,
-0.3950822353,
-0.0497786291,
0.2726547718,
-0.2424453646,
-0.1353651434,
0.0816416889,
0.2359433323,
0.3239525557,
0.2044597566,
-0.3263272047,
-0.3508822918,
0.5388799906,
-0.364967972,
-0.1342417747,
-0.1359106451,
0.086956203,
0.3645114303,
0.11802385,
-0.7096751928,
-0.009728387,
0.2363465428,
-0.1331491768,
-0.0409499221,
-0.1120016724,
-0.1369568706,
-0.1903095245,
-0.2534229457,
0.6608252525,
-0.0571324974,
-0.0422460362,
-0.1349737495,
-0.3494077325
] |
https://github.com/huggingface/datasets/issues/616 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors | This error is what I see when I run the first lines of the Pytorch Quickstart. It should also say that it should be ignored and/or how to fix it. BTW, this is a Pytorch error message -- not a Huggingface error message. My code runs anyway. | I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List. | 47 | UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors
I am trying out the library and want to load in pickled data with `from_dict`. In that dict, one column `text` should be tokenized and the other (an embedding vector) should be retained. All other columns should be removed. When I eventually try to set the format for the columns with `set_format` I am getting this strange Userwarning without a stack trace:
> Set __getitem__(key) output type to torch for ['input_ids', 'sembedding'] columns (when key is int or slice) and don't output other (un-formatted) columns.
> C:\Users\bramv\.virtualenvs\dutch-simplification-nbNdqK9u\lib\site-packages\datasets\arrow_dataset.py:835: UserWarning: The given NumPy array is not writeable, and PyTorch does not support non-writeable tensors. This means you can write to the underlying (supposedly non-writeable) NumPy array using the tensor. You may want to copy the array to protect its data or make it writeable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ..\torch\csrc\utils\tensor_numpy.cpp:141.)
> return torch.tensor(x, **format_kwargs)
The first one might not be related to the warning, but it is odd that it is shown, too. It is unclear whether that is something that I should do or something that that the program is doing at that moment.
Snippet:
```
dataset = Dataset.from_dict(torch.load("data/dummy.pt.pt"))
print(dataset)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
keys_to_retain = {"input_ids", "sembedding"}
dataset = dataset.map(lambda example: tokenizer(example["text"], padding='max_length'), batched=True)
dataset.remove_columns_(set(dataset.column_names) - keys_to_retain)
dataset.set_format(type="torch", columns=["input_ids", "sembedding"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=2)
print(next(iter(dataloader)))
```
PS: the input type for `remove_columns_` should probably be an Iterable rather than just a List.
This error is what I see when I run the first lines of the Pytorch Quickstart. It should also say that it should be ignored and/or how to fix it. BTW, this is a Pytorch error message -- not a Huggingface error message. My code runs anyway. | [
0.2382436991,
-0.3691009879,
0.0651262179,
0.1298750937,
0.4380832016,
0.0914001763,
0.6286587119,
0.2343869358,
0.297601819,
0.0604874976,
-0.1968633384,
0.4237813354,
-0.3091077507,
-0.2186416239,
-0.2565727532,
-0.2071908116,
-0.0199281424,
0.1609406918,
0.0264156014,
-0.0268478245,
-0.0556125604,
0.0349227414,
-0.173422873,
0.3379642367,
-0.225587368,
-0.409830004,
0.2081688046,
-0.0633970797,
-0.0799703747,
-0.3318473399,
0.1405587494,
-0.2741777301,
-0.0424051955,
0.1276778132,
-0.0001266726,
0.115719974,
0.0211010482,
-0.0969709903,
-0.0743484646,
-0.1103796959,
0.7791624069,
-0.3006571531,
0.2598287463,
-0.5041794777,
-0.0403004214,
-0.2053054571,
-0.0674015433,
-0.2887424231,
0.6341737509,
0.2032752633,
0.0752524808,
0.5057702065,
0.1049248055,
0.3881486058,
-0.0660524815,
0.3452899754,
-0.2550108433,
-0.0293221362,
0.0295191612,
0.1392834634,
-0.0000623427,
0.2642534673,
-0.591407299,
0.0171456039,
0.1526442766,
-0.0334381759,
-0.3494703472,
-0.1997034103,
-0.056856554,
0.4435257912,
0.3706564307,
-0.2254173011,
-0.0462095737,
-0.3330504298,
-0.1216261089,
-0.15876773,
0.1649688482,
0.0478616655,
-0.1130081713,
0.1626305133,
0.0142953508,
-0.0069781654,
-0.1369983852,
0.2631309032,
0.0543760359,
0.1673308313,
-0.1042309105,
0.3020677567,
-0.0703181326,
-0.0098414812,
0.052553013,
0.0074248388,
0.190158084,
0.1962173134,
-0.1164152175,
-0.3119843006,
-0.259937942,
-0.7474352717,
-0.0870940238,
-0.2492241859,
0.3341350555,
0.2540560067,
-0.1607196927,
0.0971487164,
0.146938771,
0.0135782659,
0.1208702549,
0.3030396402,
-0.0211056247,
-0.264131099,
0.4715954661,
0.0567320958,
0.1360284239,
-0.2395027131,
0.0944851488,
0.0855746642,
0.0345737711,
0.0435557663,
-0.4489812255,
0.1235724092,
-0.2449782193,
0.0512569025,
-0.0606116205,
0.0437430851,
-0.0110614114,
0.0084558968,
0.0809399411,
0.3331674039,
-0.2172212154,
0.2093103677,
0.0050660782,
0.2291955501,
-0.1825449467,
-0.1943187118,
-0.0641829595,
0.2339447439,
0.0189593509,
0.261413604,
0.0302924141,
0.0009380952,
0.1234879121,
-0.2359407246,
0.6530772448,
0.2942769527,
-0.1781086177,
0.2948441505,
0.3621657789,
-0.0307601988,
-0.3972629905,
0.2834928632,
-0.3104486465,
-0.2276653945,
-0.5769512057,
0.0020176312,
-0.1900073588,
0.1614835858,
0.1787660718,
-0.2559921443,
0.6716822386,
-0.1626642048,
0.4028994739,
-0.437556386,
-0.0927187949,
-0.3298019171,
-0.0530219451,
0.3662084937,
-0.3259871602,
-0.0640388876,
0.2587724626,
-0.0162681695,
0.4336456358,
0.4094215333,
-0.1056689024,
0.1454502344,
-0.1220319569,
0.2255038023,
0.3607750535,
-0.1224516407,
-0.0684337988,
0.1887910813,
-0.1807033718,
0.3171916008,
0.1701441705,
0.0854641497,
-0.1042877659,
0.1792413145,
0.2188511193,
0.0108663999,
0.3695319891,
-0.0156878829,
-0.3240776062,
-0.2027048916,
0.4832918048,
-0.135716185,
-0.0672294647,
-0.1775229275,
-0.1403666735,
0.0221993178,
0.3671857119,
-0.1119872481,
0.029092133,
-0.0411327407,
-0.1673816741,
0.239643842,
-0.1404467672,
0.1050847545,
-0.1710646003,
-0.0152081698,
0.0745917261,
0.1085797846,
-0.2709116638,
-0.1766556054,
-0.105172269,
0.0071569905,
-0.3517927229,
-0.0465407893,
-0.1111046374,
-0.1885503531,
-0.1136639789,
0.1317767352,
-0.1785909235,
-0.1946375519,
-0.5005661249,
0.1517337561,
-0.4120476246,
-0.1020532027,
-0.1012719423,
-0.2124432474,
-0.3982958198,
0.3267835379,
0.2324446142,
-0.0355697833,
-0.0395299904,
0.0819810331,
0.2321104109,
-0.1997633576,
-0.9175687432,
0.1638943851,
0.1763909906,
-0.0909303054,
0.247208178,
0.1546778977,
-0.1466421187,
-0.0960708112,
-0.0257538296,
0.6306298375,
-0.0646004081,
0.0608096346,
-0.5003480315,
0.2619295716,
-0.1911873221,
0.1908104569,
-0.104991138,
-0.4325118661,
-0.0143133253,
0.1304036528,
-0.2320646793,
-0.0843059868,
-0.4490355253,
-0.1313849539,
0.0598227531,
0.0488688499,
0.2781394124,
-0.0805687159,
0.1933730543,
-0.2550164759,
0.2152751684,
-0.542080164,
0.1550076306,
-0.1119837165,
-0.0120684784,
-0.0852295607,
-0.1558952183,
-0.2969709039,
-0.0197966024,
0.0393949226,
-0.016540613,
0.1850438118,
-0.0722505078,
0.1588742882,
-0.3456683755,
0.1977575421,
-0.1231663674,
0.0587117001,
-0.4983721375,
0.2802624404,
-0.1423192024,
0.0969335139,
-0.2249714434,
-0.2606526017,
0.0266309939,
-0.0663829297,
-0.0077089262,
0.0847670138,
-0.2427747399,
0.1668353081,
-0.0719062611,
0.0822577551,
0.1976012886,
-0.1022715643,
-0.0642505735,
0.0905624181,
-0.0684162527,
0.0219139904,
-0.0892238319,
-0.1524734497,
0.2516382933,
-0.0162401274,
-0.2449785173,
-0.100350745,
-0.2109434903,
0.1382278949,
-0.1868401617,
-0.1263593435,
0.3529061079,
0.1456684619,
-0.2808857262,
-0.2442341745,
-0.0112628527,
-0.0327508897,
0.0331093892,
0.1553822458,
0.0217528008,
0.1336048841,
-0.0648510531,
-0.1858778596,
-0.4236634672,
-0.0582789332,
0.1405141503,
-0.1072302759,
0.4788044989,
0.2815491557,
0.2555831075,
0.1540787518,
-0.0803939626,
0.2228563279,
-0.2713090479,
-0.0046551824,
0.3146921098,
-0.1752707064,
-0.1823731512,
-0.2065906227,
0.1696433276,
0.0142174046,
0.0761996806,
-0.3303347826,
-0.1466783583,
-0.0949349925,
0.1207671911,
-0.2022130191,
-0.1867316663,
0.241856575,
0.2284965515,
0.0300903469,
0.1885069311,
0.2500556707,
0.0332655087,
-0.2170520723,
-0.0861485377,
-0.0437795743,
0.1183455586,
0.2379997671,
0.4617760181,
-0.04200764,
-0.2846428454,
0.1601476371,
0.0010736026,
0.5781361461,
-0.1647350937,
-0.3022521138,
0.0216057971,
0.076786153,
0.0009506643,
-0.1333919913,
-0.3836981654,
-0.0001531877,
0.2423481345,
0.0825090408,
0.1085125506,
-0.2214960456,
0.4477954209,
-0.3171699345,
0.0087405182,
-0.1848477274,
0.7597175837,
-0.1085423976,
-0.2938228846,
-0.1163827181,
0.2062702775,
0.2668506503,
-0.0254102536,
-0.4820897579,
-0.3349626362,
-0.3202604651,
0.2719302177,
0.1470128447,
0.6929335594,
-0.1348297298,
-0.2319810092,
0.3621507883,
0.2398637682,
0.2768635154,
0.0184219684,
-0.2167359442,
0.2924697995,
0.079120338,
-0.3491104245,
-0.114071846,
-0.0095899254,
0.1074212492,
-0.0088581964,
0.4121201932,
-0.1308528185,
-0.2242531031,
0.1573219597,
0.1882412881,
-0.1615239978,
0.0613937303,
-0.2136749923,
-0.2623848915,
-0.3775331676,
0.1652600169,
0.2398949862,
0.2294293642,
-0.2508633137,
0.0488718301,
-0.0084932987,
0.245666936,
0.317879051,
-0.0956499428,
0.1522447765,
0.1483535916,
-0.3418127,
0.3096509874,
0.3398792148,
-0.0277442262,
-0.1095982566,
-0.1247121692,
0.1159104705,
0.1117185652,
-0.0217786059,
0.2172192186,
0.4647897482,
0.1280826926,
0.4410914779,
-0.1509371698,
-0.1095777303,
-0.1212500036,
0.1708003879,
0.404702723,
-0.0328603759,
-0.3123688102,
-0.4118335247,
0.259888202,
0.1042160988,
-0.0633762181,
0.3375406265,
-0.0369731858,
-0.1301323026,
0.6268112659,
0.7443352938,
1.2204270363,
-0.2085443139,
0.5956696868,
0.3273388743,
-0.0674319789,
0.6187636256,
-0.0102656037,
0.042403549,
-0.2683009505,
0.1548625231,
-0.1980776787,
-0.1577159613,
0.2015676349,
-0.0019608289,
-0.0411036313,
-0.0536004901,
-0.5109753609,
0.3334376514,
0.0068877749,
0.0026787296,
-0.101127699,
-0.2555353642,
0.2363881171,
-0.1040956751,
-0.079145968,
0.3209194541,
-0.0350440517,
0.0317473672,
0.1873893738,
-0.27684021,
-0.2541230321,
-0.0294099823,
0.1044800133,
0.5604251623,
0.0119874794,
-0.609200716,
0.2883669138,
0.4530895352,
0.380843848,
0.1035279259,
0.0959939808,
-0.1774269342,
0.2179631293,
-0.3489566445,
-0.048572395,
-0.0852726698,
0.4823371768,
0.1212895066,
-0.08624883,
0.452154994,
-0.0263081845,
-0.1532788128,
-0.3243341446,
0.097329177,
0.4660350978,
-0.656159699,
-0.0559298396,
-0.2054632306,
0.0001007468,
-0.062172398,
-0.0448218659,
0.21907565,
0.0646789074,
0.103734754,
-0.2519076765,
-0.4340578914,
-0.036706157,
0.2222429961,
0.2893690765,
-0.117922321,
0.879553318,
0.0664853454,
-0.3234810829,
-0.1874034107,
0.037170127,
0.2153206617,
-0.1092451066,
-0.0018604547,
0.1581524909,
-0.1175391674,
-0.3384717703,
0.3617874384,
0.0083908569,
0.2841087282,
-0.3869201839,
-0.1965320259,
-0.4224751592,
-0.0321235135,
0.2779538929,
0.3617931008,
0.0696603879,
0.1996623129,
-0.1247276366,
-0.1013158858,
-0.1560570002,
-0.0752203166,
0.020221211,
0.0376656204,
0.0879769921,
-0.0319657587,
-0.0273688193,
-0.1754417121,
-0.0822160318,
0.0666388795,
-0.1285915673,
-0.0146795921,
-0.2238382995,
0.257093668,
0.1262626499,
-0.4150514603,
-0.152778849,
-0.2237046957,
0.0586871617,
-0.1579182148,
0.1590574384,
0.5828672051,
0.1063295007,
0.3782193661,
-0.2470910251,
0.56589818,
-0.0748700351,
-0.0877131075,
-0.3030568361,
0.0571165048,
-0.039043311,
-0.0320369825,
0.0130809136,
-0.0603504628,
0.0060970448,
0.1890246868,
0.2629618645,
0.0186588839,
-0.0052154213,
-0.0120388456,
-0.1070672274,
-0.3753263354,
-0.0517678708,
0.189355582,
-0.2479213476,
-0.5765467882,
0.2642937005,
0.0123998243,
0.0169081911,
0.0137433447,
-0.4336454272,
0.0795568451,
-0.031048052,
0.170232892,
0.0805719495,
0.1417364776,
-0.0181905963,
0.09814547,
0.2180120051,
-0.3502316177,
0.12948443,
0.4828861952,
-0.2283920646,
-0.0452727117,
0.4522110522,
-0.0312917605,
0.2341942936,
-0.1334195435,
0.0915353298,
0.4512062967,
0.1692260206,
-0.0341872424,
-0.3178962767,
-0.3285260201,
0.1474927664,
0.2378780991,
0.0970565528,
0.3567236066,
0.2927554846,
0.4019019902,
-0.4451822639,
-0.0209467206,
0.0176270828,
0.2360850871,
-0.3745494187,
-0.0451739728,
-0.3808813691,
0.0096498877,
0.0484975874,
-0.1260537207,
-0.159731403,
0.0313260742,
0.0340272896,
-0.0899489671,
-0.2625426054,
0.1525456458,
0.2857145369,
0.0725883618,
0.1878762394,
-0.2299561948,
-0.1870911866,
0.0890277028,
0.1067847461,
-0.0448155813,
0.0374788716,
0.4447661936,
0.358741641,
-0.10350582,
-0.0842834264,
-0.1756999195,
-0.0139695816,
0.1993031055,
0.4816289544,
0.018682234,
0.0138763636,
0.287936151,
-0.0183102097,
-0.0949821919,
0.0105205849,
0.2952637374,
0.2227563858,
0.1748559922,
0.3795814514,
-0.0568386167,
-0.4856646061,
0.0033313073,
-0.0617631972,
-0.1641254723,
0.1717436314,
0.3564146459,
0.2143042982,
0.1797748655,
-0.1373206377,
0.0054708868,
0.3706023097,
0.443055898,
0.251837343,
-0.0295525715,
-0.3219552636,
0.3352751136,
-0.3268110156,
0.4080523252,
-0.0225137025,
-0.3146441579,
-0.0225451831,
-0.1043130159,
0.3645898104,
0.0440230034,
-0.3006753027,
0.1305756569,
-0.24810341,
0.0429528914,
-0.0048683807,
0.1862839609,
0.099533312,
-0.0594905838,
-0.0888574198,
-0.3397578001,
0.444216311,
0.1383631527,
-0.0315220393,
-0.4207602143,
0.1421373487,
-0.2364388108,
0.2782037854,
0.3304498494,
0.1277319789,
0.4267578125,
-0.0317622982,
-0.1237798929,
-0.0141851921,
0.0862122327,
-0.1058101058,
0.2842881083,
0.2063486129,
0.2731181979,
0.0385228321,
0.054244712,
-0.2951193154,
0.5469293594,
-0.1160323024,
-0.3950822353,
-0.0497786291,
0.2726547718,
-0.2424453646,
-0.1353651434,
0.0816416889,
0.2359433323,
0.3239525557,
0.2044597566,
-0.3263272047,
-0.3508822918,
0.5388799906,
-0.364967972,
-0.1342417747,
-0.1359106451,
0.086956203,
0.3645114303,
0.11802385,
-0.7096751928,
-0.009728387,
0.2363465428,
-0.1331491768,
-0.0409499221,
-0.1120016724,
-0.1369568706,
-0.1903095245,
-0.2534229457,
0.6608252525,
-0.0571324974,
-0.0422460362,
-0.1349737495,
-0.3494077325
] |
https://github.com/huggingface/datasets/issues/615 | Offset overflow when slicing a big dataset with an array of indices in Pyarrow >= 1.0.0 | Related: https://issues.apache.org/jira/browse/ARROW-9773
It's definitely a size thing. I took a smaller dataset with 87000 rows and did:
```
for i in range(10,1000,20):
    table = pa.concat_tables([dset._data]*i)
    table.take([0])
```
and it broke at around i=300.
Also when `_indices` is not None, this breaks indexing by slice. E.g. `dset.shuffle()[:1]` breaks.
Luckily so far I haven't seen `_indices.column(0).take` break, which means it doesn't break `select` or anything like that which is where the speed really matters, it's just `_getitem`. So I'm currently working around it by just doing the arrow v0 method in `_getitem`:
```
#if PYARROW_V0:
data_subset = pa.concat_tables(
    self._data.slice(indices_array[i].as_py(), 1) for i in range(len(indices_array))
)
#else:
#data_subset = self._data.take(indices_array)
``` | How to reproduce:
```python
from datasets import load_dataset
wiki = load_dataset("wikipedia", "20200501.en", split="train")
wiki[[0]]
---------------------------------------------------------------------------
ArrowInvalid Traceback (most recent call last)
<ipython-input-13-381aedc9811b> in <module>
----> 1 wikipedia[[0]]
~/Desktop/hf/nlp/src/datasets/arrow_dataset.py in __getitem__(self, key)
1069 format_columns=self._format_columns,
1070 output_all_columns=self._output_all_columns,
-> 1071 format_kwargs=self._format_kwargs,
1072 )
1073
~/Desktop/hf/nlp/src/datasets/arrow_dataset.py in _getitem(self, key, format_type, format_columns, output_all_columns, format_kwargs)
1037 )
1038 else:
-> 1039 data_subset = self._data.take(indices_array)
1040
1041 if format_type is not None:
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.take()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/compute.py in take(data, indices, boundscheck)
266 """
267 options = TakeOptions(boundscheck)
--> 268 return call_function('take', [data, indices], options)
269
270
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/_compute.pyx in pyarrow._compute.call_function()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/_compute.pyx in pyarrow._compute.Function.call()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.pyarrow_internal_check_status()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowInvalid: offset overflow while concatenating arrays
```
It seems to work fine with small datasets or with pyarrow 0.17.1 | 108 | Offset overflow when slicing a big dataset with an array of indices in Pyarrow >= 1.0.0
How to reproduce:
```python
from datasets import load_dataset
wiki = load_dataset("wikipedia", "20200501.en", split="train")
wiki[[0]]
---------------------------------------------------------------------------
ArrowInvalid Traceback (most recent call last)
<ipython-input-13-381aedc9811b> in <module>
----> 1 wikipedia[[0]]
~/Desktop/hf/nlp/src/datasets/arrow_dataset.py in __getitem__(self, key)
1069 format_columns=self._format_columns,
1070 output_all_columns=self._output_all_columns,
-> 1071 format_kwargs=self._format_kwargs,
1072 )
1073
~/Desktop/hf/nlp/src/datasets/arrow_dataset.py in _getitem(self, key, format_type, format_columns, output_all_columns, format_kwargs)
1037 )
1038 else:
-> 1039 data_subset = self._data.take(indices_array)
1040
1041 if format_type is not None:
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.take()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/compute.py in take(data, indices, boundscheck)
266 """
267 options = TakeOptions(boundscheck)
--> 268 return call_function('take', [data, indices], options)
269
270
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/_compute.pyx in pyarrow._compute.call_function()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/_compute.pyx in pyarrow._compute.Function.call()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.pyarrow_internal_check_status()
~/.virtualenvs/hf-datasets/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowInvalid: offset overflow while concatenating arrays
```
It seems to work fine with small datasets or with pyarrow 0.17.1
Related: https://issues.apache.org/jira/browse/ARROW-9773
It's definitely a size thing. I took a smaller dataset with 87000 rows and did:
```
for i in range(10,1000,20):
    table = pa.concat_tables([dset._data]*i)
    table.take([0])
```
and it broke at around i=300.
Also when `_indices` is not None, this breaks indexing by slice. E.g. `dset.shuffle()[:1]` breaks.
Luckily so far I haven't seen `_indices.column(0).take` break, which means it doesn't break `select` or anything like that which is where the speed really matters, it's just `_getitem`. So I'm currently working around it by just doing the arrow v0 method in `_getitem`:
```
#if PYARROW_V0:
data_subset = pa.concat_tables(
    self._data.slice(indices_array[i].as_py(), 1) for i in range(len(indices_array))
)
#else:
#data_subset = self._data.take(indices_array)
``` | [
-0.307860136,
-0.1624814868,
-0.0384139568,
0.191755265,
0.1244797111,
-0.0056267977,
0.2420149744,
0.1383448243,
-0.3982107043,
0.3619241714,
0.3079871237,
0.4601460695,
0.0280523822,
0.0383540168,
0.0241387971,
-0.0889734477,
-0.0426646434,
0.0609558187,
-0.1887904853,
-0.0936408415,
-0.0416767038,
-0.3072259128,
-0.2502088547,
-0.0529374853,
-0.1218297929,
-0.2309745401,
0.1539255381,
0.1009821296,
-0.1768929213,
-0.2024064362,
0.3342115879,
-0.2671544552,
0.0081015676,
-0.0058171228,
-0.0001175907,
0.1050931066,
0.3507693112,
0.0269846842,
-0.0879456624,
0.3618620932,
-0.160860002,
-0.3452380896,
0.0494741872,
-0.4793951511,
0.2600328922,
-0.1769216359,
-0.2077452987,
0.0676090121,
0.0854018703,
0.2236691564,
0.1557557285,
-0.0401191898,
0.158573702,
0.2635315955,
0.3263537586,
0.0234108437,
-0.0072754174,
-0.2422338426,
-0.1678852737,
0.0705337152,
-0.2242978215,
-0.0016123503,
-0.0947052091,
-0.0284248628,
0.0539904535,
0.0067399889,
-0.0233415887,
0.0207044613,
-0.1426184773,
0.0567586794,
0.2461706847,
-0.2803452611,
-0.1456435919,
-0.225911051,
-0.0329125635,
-0.3547536731,
0.140123412,
0.3031682968,
-0.0155511051,
0.0339878015,
-0.2461453676,
0.2101866305,
-0.2522288263,
0.2505633533,
-0.1889608502,
0.2698269486,
0.1512380987,
0.1663441211,
0.1357589513,
-0.1141220629,
0.4339354634,
-0.1705888808,
-0.0047741318,
0.1696424484,
-0.2869724631,
0.1074183062,
0.194889307,
-0.1316956133,
0.5008727312,
-0.3496704102,
0.0059449524,
0.0440782271,
0.2339592427,
0.0869401023,
0.2841057777,
0.192283839,
-0.5885927677,
0.1779751778,
-0.0207447782,
0.0505630821,
0.0146464929,
0.0611453913,
0.2007249594,
-0.1749119163,
0.3466877639,
-0.0800632089,
0.0683387145,
-0.1202838272,
-0.2903329134,
0.1888430864,
-0.4950127602,
0.3146746159,
0.2096387446,
-0.0979563445,
0.0071371384,
-0.0668990463,
-0.2257016897,
0.0162174124,
-0.2137295902,
0.0277414918,
-0.1721847951,
-0.0789436549,
-0.0978925079,
-0.1193298325,
0.0237433612,
0.167945385,
-0.0813867897,
0.3148366511,
0.3989403844,
-0.1049108207,
-0.1097596213,
-0.1568697989,
0.4402475953,
0.2657413185,
-0.0372489654,
0.071927011,
-0.1062022746,
0.0841031522,
-0.0842692778,
0.5176149607,
-0.2612576187,
-0.3617995083,
-0.104230471,
0.1721646339,
-0.123608619,
0.1671469361,
-0.2392053902,
0.1035818979,
0.2674863338,
-0.0605448447,
0.1868403554,
-0.1679097712,
-0.0465437807,
-0.1870518476,
0.2324189544,
0.0651652217,
-0.5403260589,
0.351572603,
0.0464571938,
0.2777176797,
0.3522078693,
0.5056603551,
-0.310364902,
-0.0813851207,
-0.2419676632,
0.0542318076,
0.1743375063,
0.0021218583,
-0.8825094104,
-0.0902411193,
-0.1311736256,
-0.0398749746,
0.2627777457,
-0.1125268042,
0.2651937306,
0.0228163823,
-0.0259937122,
0.4810881615,
-0.274782449,
0.2685477734,
-0.471141994,
-0.1202226356,
0.0109114666,
0.0561427474,
0.0007091435,
-0.4467767179,
-0.0522029176,
-0.2801511288,
0.2582900226,
0.048017282,
0.2383666933,
0.2446468621,
0.0595669709,
-0.2582804561,
0.1781594753,
-0.2147071362,
-0.0670890287,
-0.0253214985,
-0.0054613277,
-0.093444854,
-0.3762579858,
-0.2153982669,
-0.1107437909,
0.3079874814,
-0.1378661394,
0.2697452009,
0.1490945518,
-0.310187012,
0.3051528335,
0.1536779553,
0.1448457539,
-0.1100104153,
-0.1788467467,
-0.0944766402,
-0.2770962715,
0.3967707455,
-0.1217577606,
-0.3430444002,
-0.0394435897,
0.36963588,
0.1240582094,
-0.1210042238,
-0.0816148669,
0.247591719,
0.1220880225,
-0.1240403354,
-0.1544876695,
0.2013615072,
0.1901596636,
-0.669038713,
0.1178056225,
-0.0190328062,
-0.0259278584,
0.0730739012,
-0.2147435099,
0.3537636399,
-0.3796566725,
0.1163658276,
-0.0941119865,
-0.030895099,
0.1070788354,
-0.024929598,
0.1990388185,
-0.0057955384,
-0.1353927255,
0.2408873886,
-0.0255300216,
-0.0582760572,
0.0853509754,
0.2082444429,
0.4890707433,
-0.1404569298,
-0.0235349461,
0.0426871777,
-0.2933746576,
-0.0679957569,
0.1969688088,
-0.2779555321,
-0.0518840179,
0.440805614,
0.0995409191,
-0.2342007458,
-0.0678511411,
-0.1258072108,
0.2366425842,
0.3833582699,
0.0870456696,
0.1480715722,
0.4960046709,
-0.137815237,
-0.367536217,
0.0245737806,
0.2212058008,
0.4458926916,
-0.1369953752,
0.3641353846,
-0.1575199515,
-0.2291458696,
-0.1539965719,
-0.5479339957,
-0.0495433733,
-0.5558481216,
-0.1171552688,
0.2539521158,
-0.222250402,
0.3176636398,
0.0286701098,
-0.078891173,
0.4207023084,
-0.2142772377,
-0.0134651661,
-0.2432039678,
-0.1864144355,
0.0226918086,
0.3362286985,
0.0649508983,
0.0128787253,
0.4299607873,
0.0976506472,
-0.0092566721,
-0.3197996318,
0.0494458452,
0.1751481444,
0.2215363234,
0.0291963406,
-0.2809968293,
-0.3562768698,
-0.2338245362,
-0.0001733489,
0.1419411302,
-0.1411359608,
0.4679219127,
-0.1558833867,
0.0448510237,
0.0084001422,
-0.0066715889,
-0.0028691143,
-0.3170769215,
-0.1068454683,
-0.0941457301,
0.146625936,
0.1291237473,
0.2173151672,
0.0217556134,
0.1971846521,
-0.3625348508,
-0.2118217498,
0.071903646,
0.3074547946,
-0.0073136874,
-0.1995085925,
-0.184400782,
-0.4688221812,
-0.1206624582,
0.3121151924,
-0.4352705479,
0.1149596125,
-0.16547589,
0.5477542877,
-0.4002653956,
0.0760844126,
0.1913096011,
0.1314124316,
-0.0754616633,
-0.1794872284,
-0.0041375011,
0.0360305719,
-0.1388119757,
-0.0993694961,
-0.010342082,
0.3555848002,
0.1919979304,
0.6874069571,
0.1828754544,
-0.0826881751,
0.1499187946,
-0.1037696972,
0.0607791282,
-0.0972273648,
-0.2453379929,
0.134612143,
0.2759501934,
-0.2638722062,
0.0069094971,
-0.0608504452,
-0.3277965188,
0.3485786915,
-0.1818955541,
-0.1257542968,
-0.3210375607,
0.2611425221,
-0.1045737267,
0.0411944464,
0.1294792593,
-0.114240095,
-0.1942573041,
-0.2883710563,
0.0373694785,
-0.1895451546,
0.2498660088,
-0.129581362,
0.025786072,
-0.1655410379,
-0.1086154878,
0.04321374,
0.2535859644,
0.1290032566,
0.2761619091,
-0.0774618909,
0.2144339234,
-0.0877564549,
1.0529718399,
0.1035217345,
0.0862724483,
-0.0843413547,
0.3270574212,
-0.5469654799,
0.0455027372,
-0.3395444751,
0.417221427,
0.1493498981,
0.0795406997,
-0.2137517333,
0.131285131,
0.6030223966,
0.2673250735,
-0.0757202804,
-0.1873873919,
-0.1004452631,
-0.1217114776,
0.2724305987,
0.3786854744,
0.0342792869,
0.3319784701,
0.0867870599,
-0.2884066105,
-0.4074478149,
-0.1794429719,
-0.4362611473,
0.0392237939,
0.3683809042,
-0.232327491,
0.0919800252,
-0.2026364058,
-0.1233667657,
0.5874598026,
0.2761700451,
0.1106544137,
-0.1481908113,
-0.1145763844,
0.1147045791,
0.2968947887,
-0.1863590777,
0.0224654302,
-0.0042995773,
-0.0667002797,
0.2870537639,
-0.0751794279,
-0.2727066875,
-0.0565404817,
0.064527154,
-0.0863801837,
0.0467155054,
0.5351297259,
-0.0962857604,
-0.0200151354,
0.2606784105,
0.4524257779,
-0.2280536294,
0.2177144438,
0.2168311775,
0.659299612,
-0.0582128316,
-0.0079517169,
0.3936106861,
0.0186921507,
0.3969328701,
0.1132597029,
0.2290651798,
-0.2633152306,
-0.2673663497,
0.1683328897,
-0.2955304384,
-0.25160864,
0.0523139089,
-0.1367739439,
0.0307246223,
-0.1729723215,
0.0640648901,
-0.1427877843,
-0.0798081309,
0.1897865981,
-0.202752471,
-0.3587448597,
0.0043283999,
0.0839706361,
-0.4553223252,
-0.0062030666,
-0.0558738336,
-0.0727326348,
-0.0626006424,
-0.209988147,
-0.0158435032,
-0.3024455309,
0.2118613422,
0.1587903053,
-0.6693416834,
0.0959850848,
-0.1027469486,
-0.0909659714,
-0.0037842332,
0.012518568,
-0.0979868472,
0.0787512511,
-0.1514410675,
-0.0931807309,
0.1588275433,
0.372536391,
0.1117562056,
-0.3891112506,
0.0792203248,
0.1022789031,
-0.0605833083,
0.0916549861,
0.1698605269,
0.185879454,
-0.351303041,
0.0801828355,
-0.274428755,
-0.027628608,
-0.2533026338,
0.0510551929,
-0.0321579054,
-0.3189630508,
0.3559042215,
0.1106920987,
-0.1777250469,
-0.0625040829,
0.3362923861,
0.1386837512,
0.3069220185,
0.4120992422,
0.2120609283,
-0.2112435102,
-0.0923747793,
0.2197961211,
0.2482060045,
-0.2466104627,
-0.1335740685,
-0.3357232213,
0.2979920208,
0.0040591098,
-0.1968061626,
0.0943777338,
0.2139059901,
-0.476534605,
-0.2514711022,
-0.2843225896,
0.211540401,
0.0218785051,
-0.1386729479,
0.1883489192,
-0.1097692698,
-0.1664046645,
0.0380353481,
-0.2583704591,
0.2426101714,
0.0158908814,
0.21871984,
-0.2371933609,
-0.1162922382,
0.0712876692,
-0.1096849591,
0.0967871994,
0.0323480889,
-0.0849002451,
-0.2896194458,
-0.0747357383,
0.0687748119,
0.1048450321,
-0.2686825097,
-0.1610886455,
-0.0881016478,
-0.197155267,
-0.0708957762,
0.1996876001,
-0.0409919657,
-0.0030106995,
0.0278007798,
0.6172444224,
-0.2400877774,
-0.4222012758,
-0.0536632501,
-0.028890539,
0.1443916261,
0.1050765738,
0.5081799626,
0.0785973221,
-0.0513165146,
-0.1385734528,
0.09889853,
0.0796131343,
-0.0999372154,
0.4434024394,
-0.3727134764,
-0.2157144547,
0.2484070212,
0.3717079163,
0.2496127933,
-0.0126269348,
-0.3776682019,
0.1215234548,
0.2283976525,
-0.0320643596,
-0.2781692147,
0.1422366798,
0.1158111468,
0.1839360595,
0.3635500968,
-0.0456452295,
-0.312015146,
0.0750244036,
0.0862320662,
0.5116413236,
-0.0492970496,
-0.2612496018,
0.5733864307,
-0.1759342849,
-0.2163208127,
0.3265042603,
-0.0425847881,
0.5641288161,
0.6425902843,
-0.0141264647,
0.2967488468,
0.2060593963,
-0.3082507253,
-0.060401354,
-0.1265053153,
-0.2884389758,
0.2563360929,
-0.0069087483,
0.18544285,
0.1793781668,
0.2107781768,
0.4293076992,
0.0859499201,
0.0137432516,
-0.0070641423,
-0.2963066697,
0.0951607302,
-0.3648279309,
-0.0339965075,
0.1075608581,
-0.0892930031,
-0.0629895926,
0.1336774528,
0.5363906026,
0.0736941397,
0.0753083527,
-0.1682987064,
0.472538352,
0.3535587788,
0.2539381087,
-0.1741783768,
0.3685833812,
0.3450315893,
0.0600090586,
-0.3759498894,
0.2836782932,
0.44762972,
0.0661305785,
-0.2917453349,
-0.1754905581,
0.1431081146,
-0.1993091702,
-0.3571976423,
0.0562637188,
-0.0223261677,
0.2697279453,
0.1601202786,
0.105842188,
-0.1833210886,
0.0344819613,
0.5079106092,
-0.0918385163,
-0.1485921144,
-0.3756021559,
-0.0136667807,
-0.4743431211,
0.0703355372,
-0.0579413138,
0.0409054533,
0.0677905381,
0.1807412356,
-0.0876700282,
0.053184893,
0.2307288945,
0.0352969542,
-0.3808389008,
0.70673877,
0.5049712062,
0.219625473,
-0.2470973432,
0.0792748034,
0.0175035596,
0.4386530519,
-0.2930750847,
-0.0511547327,
0.5098761916,
0.3039678931,
-0.1780643165,
0.2502116263,
0.1689563394,
0.1316730529,
-0.1954151392,
0.4371804893,
-0.5424677134,
-0.2461667061,
0.0678664818,
-0.4354545474,
-0.1111010537,
-0.4641908407,
0.2038556933,
-0.0356635451,
0.051334776,
-0.0334168971,
0.2375269085,
-0.1416111141,
0.1121385992,
0.2387248278,
0.4911782444,
0.1525251418,
0.0015405491,
0.110936895,
-0.1395284534,
-0.1394062489,
-0.5160044432,
0.5627590418,
-0.1268525422,
0.2740681171,
-0.1731086075,
-0.718791008,
0.1381967813,
0.1420858204,
0.1563247591,
0.0118765719,
0.0317556933,
0.186227113,
-0.2070299387,
0.0572310574,
-0.001784347,
0.0907569006,
0.1579874009,
0.2495677471,
-0.0379340649,
-0.2104881704,
0.1006878614,
-0.2144786716,
-0.1678660512,
-0.0442157462,
0.1772100329,
0.2280097604,
-0.371946305,
-0.2544883192,
0.0146528482,
0.4591275156,
-0.1570971459,
-0.1533429325,
0.1323524863,
0.1654479951,
0.094547689,
0.0275229216,
0.2726255953,
0.0628591925,
-0.3466641307,
-0.0776629597,
-0.224314779
] |
https://github.com/huggingface/datasets/issues/611 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648 | ```
<class 'pandas.core.frame.DataFrame'>
Int64Index: 17136104 entries, 0 to 17136103
Data columns (total 6 columns):
# Column Dtype
--- ------ -----
0 item_id int64
1 item_titl object
2 start_price float64
3 shipping_fee float64
4 picture_url object
5 embeddings object
dtypes: float64(2), int64(1), object(3)
memory usage: 915.2+ MB
``` | Hi, I'm trying to load a dataset from a DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
``` | 47 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
Hi, I'm trying to load a dataset from a DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
```
```
<class 'pandas.core.frame.DataFrame'>
Int64Index: 17136104 entries, 0 to 17136103
Data columns (total 6 columns):
# Column Dtype
--- ------ -----
0 item_id int64
1 item_titl object
2 start_price float64
3 shipping_fee float64
4 picture_url object
5 embeddings object
dtypes: float64(2), int64(1), object(3)
memory usage: 915.2+ MB
``` | [
-0.2813219428,
-0.0744089633,
-0.2195497006,
0.4159463644,
0.2823322415,
-0.0299136937,
0.4081526101,
0.2113162875,
0.3901513219,
0.0474446639,
0.0003714785,
0.3082345724,
-0.086982511,
0.0685475543,
0.0008355677,
-0.3419585526,
-0.0933910757,
0.2279391438,
-0.2755076289,
0.093866922,
-0.2262556106,
0.0042208489,
-0.2458186746,
0.2069003135,
-0.0991637111,
-0.163917169,
0.0946855843,
-0.0185277201,
-0.0302413832,
-0.4899120629,
0.3556724787,
-0.2945442796,
0.0409599096,
0.1827442199,
-0.0001102196,
0.0405714065,
0.370965749,
0.0665660948,
-0.2005662173,
-0.1553710997,
-0.3287507892,
-0.4413963556,
0.4300813675,
-0.0768522024,
0.2361875176,
-0.2503987551,
-0.2843055725,
-0.3228807449,
0.3386787474,
0.0333644226,
0.2106782198,
0.2027077675,
0.4377657175,
0.1210990548,
0.0229465794,
-0.0303454809,
-0.1591518521,
0.4214887321,
0.1034791246,
-0.2736417353,
-0.0080705397,
0.0098081641,
-0.142464608,
0.3750423789,
0.1488452852,
-0.0841917843,
-0.1104777753,
-0.3776780069,
-0.2088724971,
0.2992430031,
0.6146752834,
-0.2124878317,
-0.3339967132,
-0.0728104264,
-0.0512901768,
-0.3944201469,
0.0467975624,
0.2284303606,
-0.0249730758,
0.0557536148,
0.0425887257,
0.0693220049,
-0.2296036184,
0.0915906951,
-0.0219255909,
0.1695369482,
-0.1137780398,
0.2228228897,
0.3527330458,
-0.1476605386,
0.1830547005,
0.1812526733,
0.0453063734,
0.1507356614,
-0.4711748064,
-0.0076135397,
-0.2046746016,
-0.4235939085,
0.0187760666,
0.3424941897,
0.2327031344,
0.0266378634,
0.0264476873,
0.2103407085,
0.1510002017,
0.0093807839,
-0.1516857147,
0.0805709958,
-0.2187975049,
-0.0026093572,
0.1261628121,
-0.0250722412,
-0.0848935694,
-0.1750026643,
0.1665081978,
-0.1399343312,
0.3136581182,
-0.1227994785,
-0.4663729668,
0.2898981869,
-0.4942124486,
-0.1571941674,
0.1066770703,
0.1941331327,
0.1898524612,
0.3169755936,
-0.0033450872,
0.1713691652,
-0.0771829039,
0.1586163938,
-0.1784181297,
0.2248278409,
0.0160618797,
0.1040192693,
-0.0073826239,
0.058732558,
0.161934495,
0.1394264251,
0.3225093782,
-0.1406375021,
0.2424961329,
-0.4746842384,
-0.0491966493,
0.4604403675,
-0.0665067658,
0.1373984516,
0.4816640317,
0.0661067441,
-0.0525114611,
0.2874205112,
-0.0509188771,
-0.1669092178,
-0.3079305887,
0.20349738,
0.086414665,
0.1019393504,
-0.5072072148,
-0.1107440144,
0.2790832818,
0.0325419605,
0.2052159607,
-0.1718197912,
-0.0322396904,
-0.2370043993,
-0.0113936029,
0.2493878007,
-0.6265667081,
0.1540752649,
-0.0097451583,
-0.004250763,
0.2222196907,
0.3802013695,
-0.3712419271,
0.0840109661,
-0.1422455311,
0.1392557025,
-0.0864753574,
0.0490297154,
-0.488214761,
0.1294536442,
0.030543074,
-0.0098823607,
0.1360376477,
0.1243114397,
0.0164617784,
0.0188370906,
-0.0198176689,
0.2659707069,
0.1596461385,
-0.1190211028,
-0.0785414279,
-0.0919745266,
0.1880331337,
0.0425337404,
-0.1309632957,
-0.111163646,
-0.0377984941,
-0.0583446473,
0.0506684333,
-0.0628660619,
-0.1312292665,
0.3077424169,
0.1022797003,
0.0275660828,
-0.1993942857,
-0.1144552827,
-0.2583914399,
-0.0028339475,
0.0060147718,
-0.2196476161,
-0.3686643243,
0.032235235,
-0.3015364707,
0.4142712057,
-0.0037014037,
0.1467633247,
0.1555080712,
0.0351495333,
-0.1392640322,
-0.340544343,
-0.1125304103,
0.382986784,
-0.2520795763,
0.1174004003,
-0.2488704026,
0.6389963031,
-0.1225097924,
-0.2358955443,
0.1020291746,
-0.0589133985,
0.0078238137,
-0.0617900826,
-0.0541579686,
0.3003413677,
0.2589097619,
-0.0539925434,
-0.1232166812,
-0.171292901,
0.1711659431,
-0.4476627111,
-0.0194847696,
0.3259801269,
0.249661386,
0.0040196702,
0.2005783617,
0.2582695186,
-0.2305360436,
0.1718456149,
0.0606856048,
0.0135375261,
0.2570947111,
0.2372242659,
-0.0010136068,
-0.1002516598,
-0.0585768893,
0.1710239649,
0.2922353148,
0.0836868957,
-0.3078922927,
-0.0896601304,
0.0380945317,
0.0779546052,
0.1145799235,
0.1024589986,
-0.101138711,
-0.1450705975,
0.2871287465,
-0.1594516784,
0.3604677916,
0.2623209655,
-0.3645986915,
-0.3160840273,
-0.1206736118,
0.0692810565,
0.3395499587,
0.2408053428,
0.4563709795,
-0.0912679881,
0.0961696133,
-0.0320505314,
-0.2791268826,
-0.5238829851,
0.062461026,
0.4196492434,
-0.3285613656,
0.2295598686,
-0.1541726142,
-0.1055466011,
0.2673345506,
-0.6564481258,
0.0172791556,
-0.1459004134,
-0.332672894,
0.3113352358,
0.0069781691,
0.0980789661,
-0.3459627628,
-0.1520458907,
0.3765097558,
0.0735323951,
0.0101615116,
-0.0400558449,
0.1425976455,
0.1107220054,
0.270267278,
0.0741860569,
0.2685508132,
0.3350816965,
0.0054361597,
-0.1947272569,
-0.0174661875,
-0.0798838437,
-0.1669339538,
0.0517802164,
0.0623220652,
0.3742078245,
-0.5855014324,
-0.5999204516,
0.1264113039,
0.0736583322,
-0.1970531791,
0.1744910181,
0.070136629,
0.1771166027,
0.1904893219,
-0.1944370121,
-0.3144041598,
-0.3713177443,
0.1869554371,
0.2236397862,
0.3277162611,
0.315998435,
0.2656183839,
0.123975426,
0.2631947398,
-0.1057829708,
-0.1982409656,
0.1272566766,
0.175486654,
0.0087331682,
-0.2702608407,
-0.1913998276,
-0.2674703002,
0.2551689744,
0.0270456299,
-0.3893483877,
-0.2998654246,
-0.3612284362,
-0.0803876296,
-0.3379432559,
-0.0242437907,
0.270457685,
0.1402749717,
-0.2321831584,
0.0587794483,
-0.1001929417,
-0.0318210945,
-0.1266248822,
0.1443428993,
-0.1426803321,
0.5946799517,
-0.1982655078,
0.542355299,
0.1506068558,
-0.0378130116,
0.4440322518,
-0.0917439014,
0.1699854136,
-0.362095207,
-0.4971081913,
-0.0669030994,
-0.1251890063,
0.0664183646,
-0.1115146279,
-0.2821608782,
-0.130408138,
0.1161025017,
-0.0095338672,
-0.1359713972,
0.0811217129,
0.1733745784,
-0.0260824952,
0.1395893693,
-0.117307961,
-0.048691716,
-0.1003915817,
-0.0609221198,
-0.2855703831,
-0.2101400197,
0.2482716292,
-0.1466302425,
-0.2170100212,
-0.267565906,
-0.0345851518,
0.3139544129,
0.3056480289,
0.5451366305,
0.1436626613,
-0.0632699057,
0.0515470915,
0.0490170047,
0.3031838536,
0.1406441927,
-0.2015918046,
0.2815238237,
-0.1118087247,
-0.5275211334,
0.006812349,
-0.2888288796,
0.2861070335,
-0.151114881,
-0.1364953369,
-0.2830282748,
-0.0095329694,
0.1217481866,
0.3055150211,
-0.0608975738,
-0.1197309569,
-0.3016712666,
-0.0870328769,
-0.5179741979,
0.1697145253,
0.146309495,
0.2763126194,
0.0918484032,
-0.2823674083,
-0.2318424731,
-0.5319344401,
0.0363813341,
0.066778034,
0.3335449696,
0.1646634638,
0.181921795,
-0.2661644816,
-0.1176079512,
0.5514031053,
0.3901550472,
0.2355771959,
-0.0744232237,
-0.0169190187,
0.0978518277,
0.2250101864,
0.0370273143,
-0.0905062258,
0.116483584,
-0.2906727493,
0.1484724283,
0.1189412028,
-0.0457757376,
0.0468275025,
-0.0387607925,
-0.223928079,
-0.2735212743,
0.4360411167,
0.0355574712,
0.1602489054,
0.4475948215,
0.2731459141,
-0.014616156,
0.1969363689,
0.235817939,
0.8210724592,
-0.3222985864,
0.0012439098,
0.3847112656,
0.0425248593,
0.450419277,
0.3941968083,
0.1207177117,
-0.351360023,
-0.2400812507,
-0.1093840078,
-0.2209489197,
0.19301112,
0.132399708,
-0.2190533876,
0.1336610615,
-0.1972189546,
0.1078304574,
0.0800063089,
0.096784994,
-0.0671038032,
-0.3313188553,
-0.5120388865,
0.115511857,
-0.245433256,
-0.184278816,
-0.1543579549,
-0.0854888335,
-0.0121684596,
-0.0684576631,
-0.1600126773,
0.1471407115,
-0.2108897865,
0.5190976262,
0.2066061199,
-0.5216554403,
0.0591061115,
-0.0817668661,
-0.211480394,
-0.0654449761,
0.0227800775,
-0.1752675921,
-0.1868131608,
-0.242134437,
-0.1483109593,
0.0007693172,
0.2760917544,
0.0007844046,
-0.1791671515,
0.1693536937,
-0.0637615845,
-0.4431355,
0.2583415806,
0.3664197326,
0.2560352683,
-0.3944437802,
-0.3703474998,
-0.1771427393,
-0.3999646306,
-0.2454401553,
0.1234321892,
0.0697476119,
-0.0844347477,
0.123295635,
0.2239296585,
-0.1587905884,
-0.0421672352,
0.2827082872,
0.2458966821,
0.1164727286,
0.6250904799,
0.1420300007,
-0.2136122286,
-0.2528684139,
0.0410646498,
0.2418820262,
-0.4038006663,
0.2405817807,
0.0724533945,
0.0953910649,
0.0289486013,
0.0958366096,
0.2089329809,
0.4507568479,
-0.1446698606,
-0.5494536161,
-0.5519043803,
0.3133971989,
-0.3650939167,
0.124087289,
0.0728807151,
0.0119038075,
-0.0238186866,
0.0976892412,
-0.28880319,
-0.0474176891,
-0.3507492244,
0.059047617,
-0.0160400756,
-0.0977909192,
-0.0120488741,
-0.0036969818,
0.2137143314,
0.2369945198,
-0.1554047763,
-0.3096165955,
0.1651388109,
0.0450444296,
0.1219118908,
-0.4085615277,
-0.1422494948,
-0.1819286346,
-0.0714022219,
-0.1304441094,
-0.1986448467,
0.008819446,
-0.0786494687,
-0.1377132535,
0.2962079644,
-0.0935740173,
-0.2675310969,
0.0401338376,
-0.2507906258,
0.2575216293,
0.0955287516,
0.0755475163,
0.3524122834,
0.0259378701,
-0.2118296474,
0.0399076641,
-0.0894019678,
-0.1462774873,
0.3332198262,
-0.3912498653,
-0.3031313121,
0.0155774727,
0.4696726799,
0.2980314493,
-0.1497731656,
0.0706828982,
-0.048704084,
0.2152476311,
-0.3896288574,
-0.1665168554,
-0.0059128199,
0.0592268854,
-0.0484120473,
0.319621563,
0.2594070435,
-0.2182891369,
0.2811605036,
0.2165709138,
0.2245475501,
0.0140966773,
-0.1251776367,
0.6010935903,
-0.1161143631,
-0.2117931247,
0.4168363512,
0.2926369607,
0.5112426877,
0.260607183,
-0.107160233,
0.1381425261,
0.2688381374,
0.1799881458,
-0.0583727062,
-0.4001862705,
-0.1858806163,
0.1077911109,
0.106651552,
-0.1382402331,
0.1256544888,
-0.028010577,
0.2579559088,
0.036102403,
0.1478013247,
0.0859168768,
-0.1818324476,
0.021945972,
0.0008073226,
0.1250001639,
0.0484732389,
-0.2068496197,
0.0252287388,
-0.3063668907,
0.2777523994,
0.0016674623,
-0.0858004764,
-0.4399662018,
0.1806317121,
0.5117085576,
-0.1094689518,
-0.1741170436,
0.3285869956,
0.4242917001,
-0.0237483904,
-0.1503317952,
0.3860250413,
0.5804910064,
0.2409937233,
-0.0238723308,
0.2077869624,
-0.008149391,
-0.1690757871,
-0.0604742169,
0.090924114,
0.0978052318,
0.1481953263,
0.1423253417,
0.2106000483,
-0.1486497223,
0.2147823125,
0.3060646653,
0.1065621227,
-0.2027075738,
0.1401880234,
-0.1162667274,
-0.2585431337,
-0.0850693733,
-0.0174533166,
-0.4101571739,
0.1649528593,
0.4182043076,
-0.1200689375,
-0.0165552832,
0.1644587815,
0.0681712106,
-0.3404385149,
0.4425361156,
0.2781798542,
0.2539973557,
-0.169439137,
0.0014495254,
-0.4343958795,
0.2145671546,
-0.3108061552,
-0.1265730709,
0.188310504,
0.3053247333,
0.2690075636,
0.3885177672,
0.1593929827,
0.1569657922,
0.0389911793,
0.1325366646,
-0.3590077758,
-0.1170708835,
0.1671165824,
0.1152933165,
-0.1712614745,
-0.1662120819,
0.3117124736,
0.0199156478,
0.1010073572,
-0.2547493577,
-0.0726023465,
-0.2228350937,
0.341465801,
0.3372590542,
0.1324574053,
0.3376510739,
0.0862658694,
0.2375315726,
-0.0411621667,
-0.4088021517,
-0.008203283,
0.5613905191,
0.3403673172,
0.5017962456,
-0.1312080175,
-0.1257842779,
-0.1431040019,
0.3379403949,
-0.2317000031,
-0.0427060165,
-0.3095023334,
0.0799119323,
-0.1820609123,
-0.0606571287,
-0.3343918025,
-0.0018313313,
0.0890032947,
0.2038214207,
-0.2043202221,
-0.5852283835,
0.4368219674,
-0.1386184245,
-0.4015678465,
0.0768634081,
0.0833694115,
0.0400833637,
-0.0228594653,
-0.2593252361,
0.1473936439,
0.2686135769,
-0.0831841752,
-0.3365373015,
0.0610048436,
-0.0287154242,
0.2706462741,
-0.1467019618,
0.3606085181,
-0.1064436883,
-0.038489949,
-0.1178414822,
-0.1704111546
] |
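A rough capacity check clarifies the record above: when `Dataset.from_pandas` converts the `embeddings` column, pyarrow stores every element of every list in one child array indexed by 32-bit offsets, so the total element count (`rows × embedding length`) has to stay below the 2147483646 limit quoted in the error. The sketch below illustrates that check; the 128-dimensional embedding size is an assumption for illustration and is not stated in the issue.

```python
# Sketch: estimate whether a list-typed column exceeds pyarrow's ListArray
# capacity (the 2147483646 child-element limit quoted in the error above).
LIST_ARRAY_LIMIT = 2_147_483_646

def overflows_list_array(n_rows: int, elements_per_row: int) -> bool:
    """True if rows * elements_per_row cannot fit in a single ListArray."""
    return n_rows * elements_per_row > LIST_ARRAY_LIMIT

n_rows = 17_136_104       # row count reported by df.info() in the record above
embedding_dim = 128       # assumed embedding length, purely illustrative
print(overflows_list_array(n_rows, embedding_dim))  # True -> ArrowCapacityError
```

With roughly 17.1 million rows, any embedding length of 126 or more is enough to cross the limit (17,136,104 × 126 ≈ 2.16 billion elements).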
https://github.com/huggingface/datasets/issues/611 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648 | Thanks and some more on the `embeddings` and `picture_url` would be nice as well (type and max lengths of the elements) | Hi, I'm trying to load a dataset from a DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
``` | 21 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
Hi, I'm trying to load a dataset from a DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
```
Thanks and some more on the `embeddings` and `picture_url` would be nice as well (type and max lengths of the elements) | [
-0.2813219428,
-0.0744089633,
-0.2195497006,
0.4159463644,
0.2823322415,
-0.0299136937,
0.4081526101,
0.2113162875,
0.3901513219,
0.0474446639,
0.0003714785,
0.3082345724,
-0.086982511,
0.0685475543,
0.0008355677,
-0.3419585526,
-0.0933910757,
0.2279391438,
-0.2755076289,
0.093866922,
-0.2262556106,
0.0042208489,
-0.2458186746,
0.2069003135,
-0.0991637111,
-0.163917169,
0.0946855843,
-0.0185277201,
-0.0302413832,
-0.4899120629,
0.3556724787,
-0.2945442796,
0.0409599096,
0.1827442199,
-0.0001102196,
0.0405714065,
0.370965749,
0.0665660948,
-0.2005662173,
-0.1553710997,
-0.3287507892,
-0.4413963556,
0.4300813675,
-0.0768522024,
0.2361875176,
-0.2503987551,
-0.2843055725,
-0.3228807449,
0.3386787474,
0.0333644226,
0.2106782198,
0.2027077675,
0.4377657175,
0.1210990548,
0.0229465794,
-0.0303454809,
-0.1591518521,
0.4214887321,
0.1034791246,
-0.2736417353,
-0.0080705397,
0.0098081641,
-0.142464608,
0.3750423789,
0.1488452852,
-0.0841917843,
-0.1104777753,
-0.3776780069,
-0.2088724971,
0.2992430031,
0.6146752834,
-0.2124878317,
-0.3339967132,
-0.0728104264,
-0.0512901768,
-0.3944201469,
0.0467975624,
0.2284303606,
-0.0249730758,
0.0557536148,
0.0425887257,
0.0693220049,
-0.2296036184,
0.0915906951,
-0.0219255909,
0.1695369482,
-0.1137780398,
0.2228228897,
0.3527330458,
-0.1476605386,
0.1830547005,
0.1812526733,
0.0453063734,
0.1507356614,
-0.4711748064,
-0.0076135397,
-0.2046746016,
-0.4235939085,
0.0187760666,
0.3424941897,
0.2327031344,
0.0266378634,
0.0264476873,
0.2103407085,
0.1510002017,
0.0093807839,
-0.1516857147,
0.0805709958,
-0.2187975049,
-0.0026093572,
0.1261628121,
-0.0250722412,
-0.0848935694,
-0.1750026643,
0.1665081978,
-0.1399343312,
0.3136581182,
-0.1227994785,
-0.4663729668,
0.2898981869,
-0.4942124486,
-0.1571941674,
0.1066770703,
0.1941331327,
0.1898524612,
0.3169755936,
-0.0033450872,
0.1713691652,
-0.0771829039,
0.1586163938,
-0.1784181297,
0.2248278409,
0.0160618797,
0.1040192693,
-0.0073826239,
0.058732558,
0.161934495,
0.1394264251,
0.3225093782,
-0.1406375021,
0.2424961329,
-0.4746842384,
-0.0491966493,
0.4604403675,
-0.0665067658,
0.1373984516,
0.4816640317,
0.0661067441,
-0.0525114611,
0.2874205112,
-0.0509188771,
-0.1669092178,
-0.3079305887,
0.20349738,
0.086414665,
0.1019393504,
-0.5072072148,
-0.1107440144,
0.2790832818,
0.0325419605,
0.2052159607,
-0.1718197912,
-0.0322396904,
-0.2370043993,
-0.0113936029,
0.2493878007,
-0.6265667081,
0.1540752649,
-0.0097451583,
-0.004250763,
0.2222196907,
0.3802013695,
-0.3712419271,
0.0840109661,
-0.1422455311,
0.1392557025,
-0.0864753574,
0.0490297154,
-0.488214761,
0.1294536442,
0.030543074,
-0.0098823607,
0.1360376477,
0.1243114397,
0.0164617784,
0.0188370906,
-0.0198176689,
0.2659707069,
0.1596461385,
-0.1190211028,
-0.0785414279,
-0.0919745266,
0.1880331337,
0.0425337404,
-0.1309632957,
-0.111163646,
-0.0377984941,
-0.0583446473,
0.0506684333,
-0.0628660619,
-0.1312292665,
0.3077424169,
0.1022797003,
0.0275660828,
-0.1993942857,
-0.1144552827,
-0.2583914399,
-0.0028339475,
0.0060147718,
-0.2196476161,
-0.3686643243,
0.032235235,
-0.3015364707,
0.4142712057,
-0.0037014037,
0.1467633247,
0.1555080712,
0.0351495333,
-0.1392640322,
-0.340544343,
-0.1125304103,
0.382986784,
-0.2520795763,
0.1174004003,
-0.2488704026,
0.6389963031,
-0.1225097924,
-0.2358955443,
0.1020291746,
-0.0589133985,
0.0078238137,
-0.0617900826,
-0.0541579686,
0.3003413677,
0.2589097619,
-0.0539925434,
-0.1232166812,
-0.171292901,
0.1711659431,
-0.4476627111,
-0.0194847696,
0.3259801269,
0.249661386,
0.0040196702,
0.2005783617,
0.2582695186,
-0.2305360436,
0.1718456149,
0.0606856048,
0.0135375261,
0.2570947111,
0.2372242659,
-0.0010136068,
-0.1002516598,
-0.0585768893,
0.1710239649,
0.2922353148,
0.0836868957,
-0.3078922927,
-0.0896601304,
0.0380945317,
0.0779546052,
0.1145799235,
0.1024589986,
-0.101138711,
-0.1450705975,
0.2871287465,
-0.1594516784,
0.3604677916,
0.2623209655,
-0.3645986915,
-0.3160840273,
-0.1206736118,
0.0692810565,
0.3395499587,
0.2408053428,
0.4563709795,
-0.0912679881,
0.0961696133,
-0.0320505314,
-0.2791268826,
-0.5238829851,
0.062461026,
0.4196492434,
-0.3285613656,
0.2295598686,
-0.1541726142,
-0.1055466011,
0.2673345506,
-0.6564481258,
0.0172791556,
-0.1459004134,
-0.332672894,
0.3113352358,
0.0069781691,
0.0980789661,
-0.3459627628,
-0.1520458907,
0.3765097558,
0.0735323951,
0.0101615116,
-0.0400558449,
0.1425976455,
0.1107220054,
0.270267278,
0.0741860569,
0.2685508132,
0.3350816965,
0.0054361597,
-0.1947272569,
-0.0174661875,
-0.0798838437,
-0.1669339538,
0.0517802164,
0.0623220652,
0.3742078245,
-0.5855014324,
-0.5999204516,
0.1264113039,
0.0736583322,
-0.1970531791,
0.1744910181,
0.070136629,
0.1771166027,
0.1904893219,
-0.1944370121,
-0.3144041598,
-0.3713177443,
0.1869554371,
0.2236397862,
0.3277162611,
0.315998435,
0.2656183839,
0.123975426,
0.2631947398,
-0.1057829708,
-0.1982409656,
0.1272566766,
0.175486654,
0.0087331682,
-0.2702608407,
-0.1913998276,
-0.2674703002,
0.2551689744,
0.0270456299,
-0.3893483877,
-0.2998654246,
-0.3612284362,
-0.0803876296,
-0.3379432559,
-0.0242437907,
0.270457685,
0.1402749717,
-0.2321831584,
0.0587794483,
-0.1001929417,
-0.0318210945,
-0.1266248822,
0.1443428993,
-0.1426803321,
0.5946799517,
-0.1982655078,
0.542355299,
0.1506068558,
-0.0378130116,
0.4440322518,
-0.0917439014,
0.1699854136,
-0.362095207,
-0.4971081913,
-0.0669030994,
-0.1251890063,
0.0664183646,
-0.1115146279,
-0.2821608782,
-0.130408138,
0.1161025017,
-0.0095338672,
-0.1359713972,
0.0811217129,
0.1733745784,
-0.0260824952,
0.1395893693,
-0.117307961,
-0.048691716,
-0.1003915817,
-0.0609221198,
-0.2855703831,
-0.2101400197,
0.2482716292,
-0.1466302425,
-0.2170100212,
-0.267565906,
-0.0345851518,
0.3139544129,
0.3056480289,
0.5451366305,
0.1436626613,
-0.0632699057,
0.0515470915,
0.0490170047,
0.3031838536,
0.1406441927,
-0.2015918046,
0.2815238237,
-0.1118087247,
-0.5275211334,
0.006812349,
-0.2888288796,
0.2861070335,
-0.151114881,
-0.1364953369,
-0.2830282748,
-0.0095329694,
0.1217481866,
0.3055150211,
-0.0608975738,
-0.1197309569,
-0.3016712666,
-0.0870328769,
-0.5179741979,
0.1697145253,
0.146309495,
0.2763126194,
0.0918484032,
-0.2823674083,
-0.2318424731,
-0.5319344401,
0.0363813341,
0.066778034,
0.3335449696,
0.1646634638,
0.181921795,
-0.2661644816,
-0.1176079512,
0.5514031053,
0.3901550472,
0.2355771959,
-0.0744232237,
-0.0169190187,
0.0978518277,
0.2250101864,
0.0370273143,
-0.0905062258,
0.116483584,
-0.2906727493,
0.1484724283,
0.1189412028,
-0.0457757376,
0.0468275025,
-0.0387607925,
-0.223928079,
-0.2735212743,
0.4360411167,
0.0355574712,
0.1602489054,
0.4475948215,
0.2731459141,
-0.014616156,
0.1969363689,
0.235817939,
0.8210724592,
-0.3222985864,
0.0012439098,
0.3847112656,
0.0425248593,
0.450419277,
0.3941968083,
0.1207177117,
-0.351360023,
-0.2400812507,
-0.1093840078,
-0.2209489197,
0.19301112,
0.132399708,
-0.2190533876,
0.1336610615,
-0.1972189546,
0.1078304574,
0.0800063089,
0.096784994,
-0.0671038032,
-0.3313188553,
-0.5120388865,
0.115511857,
-0.245433256,
-0.184278816,
-0.1543579549,
-0.0854888335,
-0.0121684596,
-0.0684576631,
-0.1600126773,
0.1471407115,
-0.2108897865,
0.5190976262,
0.2066061199,
-0.5216554403,
0.0591061115,
-0.0817668661,
-0.211480394,
-0.0654449761,
0.0227800775,
-0.1752675921,
-0.1868131608,
-0.242134437,
-0.1483109593,
0.0007693172,
0.2760917544,
0.0007844046,
-0.1791671515,
0.1693536937,
-0.0637615845,
-0.4431355,
0.2583415806,
0.3664197326,
0.2560352683,
-0.3944437802,
-0.3703474998,
-0.1771427393,
-0.3999646306,
-0.2454401553,
0.1234321892,
0.0697476119,
-0.0844347477,
0.123295635,
0.2239296585,
-0.1587905884,
-0.0421672352,
0.2827082872,
0.2458966821,
0.1164727286,
0.6250904799,
0.1420300007,
-0.2136122286,
-0.2528684139,
0.0410646498,
0.2418820262,
-0.4038006663,
0.2405817807,
0.0724533945,
0.0953910649,
0.0289486013,
0.0958366096,
0.2089329809,
0.4507568479,
-0.1446698606,
-0.5494536161,
-0.5519043803,
0.3133971989,
-0.3650939167,
0.124087289,
0.0728807151,
0.0119038075,
-0.0238186866,
0.0976892412,
-0.28880319,
-0.0474176891,
-0.3507492244,
0.059047617,
-0.0160400756,
-0.0977909192,
-0.0120488741,
-0.0036969818,
0.2137143314,
0.2369945198,
-0.1554047763,
-0.3096165955,
0.1651388109,
0.0450444296,
0.1219118908,
-0.4085615277,
-0.1422494948,
-0.1819286346,
-0.0714022219,
-0.1304441094,
-0.1986448467,
0.008819446,
-0.0786494687,
-0.1377132535,
0.2962079644,
-0.0935740173,
-0.2675310969,
0.0401338376,
-0.2507906258,
0.2575216293,
0.0955287516,
0.0755475163,
0.3524122834,
0.0259378701,
-0.2118296474,
0.0399076641,
-0.0894019678,
-0.1462774873,
0.3332198262,
-0.3912498653,
-0.3031313121,
0.0155774727,
0.4696726799,
0.2980314493,
-0.1497731656,
0.0706828982,
-0.048704084,
0.2152476311,
-0.3896288574,
-0.1665168554,
-0.0059128199,
0.0592268854,
-0.0484120473,
0.319621563,
0.2594070435,
-0.2182891369,
0.2811605036,
0.2165709138,
0.2245475501,
0.0140966773,
-0.1251776367,
0.6010935903,
-0.1161143631,
-0.2117931247,
0.4168363512,
0.2926369607,
0.5112426877,
0.260607183,
-0.107160233,
0.1381425261,
0.2688381374,
0.1799881458,
-0.0583727062,
-0.4001862705,
-0.1858806163,
0.1077911109,
0.106651552,
-0.1382402331,
0.1256544888,
-0.028010577,
0.2579559088,
0.036102403,
0.1478013247,
0.0859168768,
-0.1818324476,
0.021945972,
0.0008073226,
0.1250001639,
0.0484732389,
-0.2068496197,
0.0252287388,
-0.3063668907,
0.2777523994,
0.0016674623,
-0.0858004764,
-0.4399662018,
0.1806317121,
0.5117085576,
-0.1094689518,
-0.1741170436,
0.3285869956,
0.4242917001,
-0.0237483904,
-0.1503317952,
0.3860250413,
0.5804910064,
0.2409937233,
-0.0238723308,
0.2077869624,
-0.008149391,
-0.1690757871,
-0.0604742169,
0.090924114,
0.0978052318,
0.1481953263,
0.1423253417,
0.2106000483,
-0.1486497223,
0.2147823125,
0.3060646653,
0.1065621227,
-0.2027075738,
0.1401880234,
-0.1162667274,
-0.2585431337,
-0.0850693733,
-0.0174533166,
-0.4101571739,
0.1649528593,
0.4182043076,
-0.1200689375,
-0.0165552832,
0.1644587815,
0.0681712106,
-0.3404385149,
0.4425361156,
0.2781798542,
0.2539973557,
-0.169439137,
0.0014495254,
-0.4343958795,
0.2145671546,
-0.3108061552,
-0.1265730709,
0.188310504,
0.3053247333,
0.2690075636,
0.3885177672,
0.1593929827,
0.1569657922,
0.0389911793,
0.1325366646,
-0.3590077758,
-0.1170708835,
0.1671165824,
0.1152933165,
-0.1712614745,
-0.1662120819,
0.3117124736,
0.0199156478,
0.1010073572,
-0.2547493577,
-0.0726023465,
-0.2228350937,
0.341465801,
0.3372590542,
0.1324574053,
0.3376510739,
0.0862658694,
0.2375315726,
-0.0411621667,
-0.4088021517,
-0.008203283,
0.5613905191,
0.3403673172,
0.5017962456,
-0.1312080175,
-0.1257842779,
-0.1431040019,
0.3379403949,
-0.2317000031,
-0.0427060165,
-0.3095023334,
0.0799119323,
-0.1820609123,
-0.0606571287,
-0.3343918025,
-0.0018313313,
0.0890032947,
0.2038214207,
-0.2043202221,
-0.5852283835,
0.4368219674,
-0.1386184245,
-0.4015678465,
0.0768634081,
0.0833694115,
0.0400833637,
-0.0228594653,
-0.2593252361,
0.1473936439,
0.2686135769,
-0.0831841752,
-0.3365373015,
0.0610048436,
-0.0287154242,
0.2706462741,
-0.1467019618,
0.3606085181,
-0.1064436883,
-0.038489949,
-0.1178414822,
-0.1704111546
] |
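For the follow-up request in the record above (type and maximum lengths of the `embeddings` and `picture_url` elements), a small pandas sketch like the one below could gather that information. It assumes `emb` is the dataframe from the issue; the column names come from the earlier `df.info()` output, and the rest is illustrative.

```python
# Sketch: report the Python type and maximum element length for the object
# columns asked about in the comment above (assumes `emb` is the dataframe).
for col in ["picture_url", "embeddings"]:
    first = emb[col].iloc[0]
    max_len = emb[col].map(len).max()  # len() of each string / list / array
    print(f"{col}: type={type(first).__name__}, max length={max_len}")
```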
https://github.com/huggingface/datasets/issues/611 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648 | It looks like a Pyarrow limitation.
I was able to reproduce the error with
```python
import pandas as pd
import numpy as np
import pyarrow as pa
n = 1713614
df = pd.DataFrame.from_dict({"a": list(np.zeros((n, 128))), "b": range(n)})
pa.Table.from_pandas(df)
```
I also tried with 50% of the dataframe and it actually works.
I created an issue on Apache Arrow's JIRA [here](https://issues.apache.org/jira/browse/ARROW-9976)
One way to fix that would be to chunk the dataframe and concatenate arrow tables. | Hi, I'm trying to load a dataset from a DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
``` | 75 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
Hi, I'm trying to load a dataset from a DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
```
It looks like a Pyarrow limitation.
I was able to reproduce the error with
```python
import pandas as pd
import numpy as np
import pyarrow as pa
n = 1713614
df = pd.DataFrame.from_dict({"a": list(np.zeros((n, 128))), "b": range(n)})
pa.Table.from_pandas(df)
```
I also tried with 50% of the dataframe and it actually works.
I created an issue on Apache Arrow's JIRA [here](https://issues.apache.org/jira/browse/ARROW-9976)
One way to fix that would be to chunk the dataframe and concatenate arrow tables. | [
-0.2813219428,
-0.0744089633,
-0.2195497006,
0.4159463644,
0.2823322415,
-0.0299136937,
0.4081526101,
0.2113162875,
0.3901513219,
0.0474446639,
0.0003714785,
0.3082345724,
-0.086982511,
0.0685475543,
0.0008355677,
-0.3419585526,
-0.0933910757,
0.2279391438,
-0.2755076289,
0.093866922,
-0.2262556106,
0.0042208489,
-0.2458186746,
0.2069003135,
-0.0991637111,
-0.163917169,
0.0946855843,
-0.0185277201,
-0.0302413832,
-0.4899120629,
0.3556724787,
-0.2945442796,
0.0409599096,
0.1827442199,
-0.0001102196,
0.0405714065,
0.370965749,
0.0665660948,
-0.2005662173,
-0.1553710997,
-0.3287507892,
-0.4413963556,
0.4300813675,
-0.0768522024,
0.2361875176,
-0.2503987551,
-0.2843055725,
-0.3228807449,
0.3386787474,
0.0333644226,
0.2106782198,
0.2027077675,
0.4377657175,
0.1210990548,
0.0229465794,
-0.0303454809,
-0.1591518521,
0.4214887321,
0.1034791246,
-0.2736417353,
-0.0080705397,
0.0098081641,
-0.142464608,
0.3750423789,
0.1488452852,
-0.0841917843,
-0.1104777753,
-0.3776780069,
-0.2088724971,
0.2992430031,
0.6146752834,
-0.2124878317,
-0.3339967132,
-0.0728104264,
-0.0512901768,
-0.3944201469,
0.0467975624,
0.2284303606,
-0.0249730758,
0.0557536148,
0.0425887257,
0.0693220049,
-0.2296036184,
0.0915906951,
-0.0219255909,
0.1695369482,
-0.1137780398,
0.2228228897,
0.3527330458,
-0.1476605386,
0.1830547005,
0.1812526733,
0.0453063734,
0.1507356614,
-0.4711748064,
-0.0076135397,
-0.2046746016,
-0.4235939085,
0.0187760666,
0.3424941897,
0.2327031344,
0.0266378634,
0.0264476873,
0.2103407085,
0.1510002017,
0.0093807839,
-0.1516857147,
0.0805709958,
-0.2187975049,
-0.0026093572,
0.1261628121,
-0.0250722412,
-0.0848935694,
-0.1750026643,
0.1665081978,
-0.1399343312,
0.3136581182,
-0.1227994785,
-0.4663729668,
0.2898981869,
-0.4942124486,
-0.1571941674,
0.1066770703,
0.1941331327,
0.1898524612,
0.3169755936,
-0.0033450872,
0.1713691652,
-0.0771829039,
0.1586163938,
-0.1784181297,
0.2248278409,
0.0160618797,
0.1040192693,
-0.0073826239,
0.058732558,
0.161934495,
0.1394264251,
0.3225093782,
-0.1406375021,
0.2424961329,
-0.4746842384,
-0.0491966493,
0.4604403675,
-0.0665067658,
0.1373984516,
0.4816640317,
0.0661067441,
-0.0525114611,
0.2874205112,
-0.0509188771,
-0.1669092178,
-0.3079305887,
0.20349738,
0.086414665,
0.1019393504,
-0.5072072148,
-0.1107440144,
0.2790832818,
0.0325419605,
0.2052159607,
-0.1718197912,
-0.0322396904,
-0.2370043993,
-0.0113936029,
0.2493878007,
-0.6265667081,
0.1540752649,
-0.0097451583,
-0.004250763,
0.2222196907,
0.3802013695,
-0.3712419271,
0.0840109661,
-0.1422455311,
0.1392557025,
-0.0864753574,
0.0490297154,
-0.488214761,
0.1294536442,
0.030543074,
-0.0098823607,
0.1360376477,
0.1243114397,
0.0164617784,
0.0188370906,
-0.0198176689,
0.2659707069,
0.1596461385,
-0.1190211028,
-0.0785414279,
-0.0919745266,
0.1880331337,
0.0425337404,
-0.1309632957,
-0.111163646,
-0.0377984941,
-0.0583446473,
0.0506684333,
-0.0628660619,
-0.1312292665,
0.3077424169,
0.1022797003,
0.0275660828,
-0.1993942857,
-0.1144552827,
-0.2583914399,
-0.0028339475,
0.0060147718,
-0.2196476161,
-0.3686643243,
0.032235235,
-0.3015364707,
0.4142712057,
-0.0037014037,
0.1467633247,
0.1555080712,
0.0351495333,
-0.1392640322,
-0.340544343,
-0.1125304103,
0.382986784,
-0.2520795763,
0.1174004003,
-0.2488704026,
0.6389963031,
-0.1225097924,
-0.2358955443,
0.1020291746,
-0.0589133985,
0.0078238137,
-0.0617900826,
-0.0541579686,
0.3003413677,
0.2589097619,
-0.0539925434,
-0.1232166812,
-0.171292901,
0.1711659431,
-0.4476627111,
-0.0194847696,
0.3259801269,
0.249661386,
0.0040196702,
0.2005783617,
0.2582695186,
-0.2305360436,
0.1718456149,
0.0606856048,
0.0135375261,
0.2570947111,
0.2372242659,
-0.0010136068,
-0.1002516598,
-0.0585768893,
0.1710239649,
0.2922353148,
0.0836868957,
-0.3078922927,
-0.0896601304,
0.0380945317,
0.0779546052,
0.1145799235,
0.1024589986,
-0.101138711,
-0.1450705975,
0.2871287465,
-0.1594516784,
0.3604677916,
0.2623209655,
-0.3645986915,
-0.3160840273,
-0.1206736118,
0.0692810565,
0.3395499587,
0.2408053428,
0.4563709795,
-0.0912679881,
0.0961696133,
-0.0320505314,
-0.2791268826,
-0.5238829851,
0.062461026,
0.4196492434,
-0.3285613656,
0.2295598686,
-0.1541726142,
-0.1055466011,
0.2673345506,
-0.6564481258,
0.0172791556,
-0.1459004134,
-0.332672894,
0.3113352358,
0.0069781691,
0.0980789661,
-0.3459627628,
-0.1520458907,
0.3765097558,
0.0735323951,
0.0101615116,
-0.0400558449,
0.1425976455,
0.1107220054,
0.270267278,
0.0741860569,
0.2685508132,
0.3350816965,
0.0054361597,
-0.1947272569,
-0.0174661875,
-0.0798838437,
-0.1669339538,
0.0517802164,
0.0623220652,
0.3742078245,
-0.5855014324,
-0.5999204516,
0.1264113039,
0.0736583322,
-0.1970531791,
0.1744910181,
0.070136629,
0.1771166027,
0.1904893219,
-0.1944370121,
-0.3144041598,
-0.3713177443,
0.1869554371,
0.2236397862,
0.3277162611,
0.315998435,
0.2656183839,
0.123975426,
0.2631947398,
-0.1057829708,
-0.1982409656,
0.1272566766,
0.175486654,
0.0087331682,
-0.2702608407,
-0.1913998276,
-0.2674703002,
0.2551689744,
0.0270456299,
-0.3893483877,
-0.2998654246,
-0.3612284362,
-0.0803876296,
-0.3379432559,
-0.0242437907,
0.270457685,
0.1402749717,
-0.2321831584,
0.0587794483,
-0.1001929417,
-0.0318210945,
-0.1266248822,
0.1443428993,
-0.1426803321,
0.5946799517,
-0.1982655078,
0.542355299,
0.1506068558,
-0.0378130116,
0.4440322518,
-0.0917439014,
0.1699854136,
-0.362095207,
-0.4971081913,
-0.0669030994,
-0.1251890063,
0.0664183646,
-0.1115146279,
-0.2821608782,
-0.130408138,
0.1161025017,
-0.0095338672,
-0.1359713972,
0.0811217129,
0.1733745784,
-0.0260824952,
0.1395893693,
-0.117307961,
-0.048691716,
-0.1003915817,
-0.0609221198,
-0.2855703831,
-0.2101400197,
0.2482716292,
-0.1466302425,
-0.2170100212,
-0.267565906,
-0.0345851518,
0.3139544129,
0.3056480289,
0.5451366305,
0.1436626613,
-0.0632699057,
0.0515470915,
0.0490170047,
0.3031838536,
0.1406441927,
-0.2015918046,
0.2815238237,
-0.1118087247,
-0.5275211334,
0.006812349,
-0.2888288796,
0.2861070335,
-0.151114881,
-0.1364953369,
-0.2830282748,
-0.0095329694,
0.1217481866,
0.3055150211,
-0.0608975738,
-0.1197309569,
-0.3016712666,
-0.0870328769,
-0.5179741979,
0.1697145253,
0.146309495,
0.2763126194,
0.0918484032,
-0.2823674083,
-0.2318424731,
-0.5319344401,
0.0363813341,
0.066778034,
0.3335449696,
0.1646634638,
0.181921795,
-0.2661644816,
-0.1176079512,
0.5514031053,
0.3901550472,
0.2355771959,
-0.0744232237,
-0.0169190187,
0.0978518277,
0.2250101864,
0.0370273143,
-0.0905062258,
0.116483584,
-0.2906727493,
0.1484724283,
0.1189412028,
-0.0457757376,
0.0468275025,
-0.0387607925,
-0.223928079,
-0.2735212743,
0.4360411167,
0.0355574712,
0.1602489054,
0.4475948215,
0.2731459141,
-0.014616156,
0.1969363689,
0.235817939,
0.8210724592,
-0.3222985864,
0.0012439098,
0.3847112656,
0.0425248593,
0.450419277,
0.3941968083,
0.1207177117,
-0.351360023,
-0.2400812507,
-0.1093840078,
-0.2209489197,
0.19301112,
0.132399708,
-0.2190533876,
0.1336610615,
-0.1972189546,
0.1078304574,
0.0800063089,
0.096784994,
-0.0671038032,
-0.3313188553,
-0.5120388865,
0.115511857,
-0.245433256,
-0.184278816,
-0.1543579549,
-0.0854888335,
-0.0121684596,
-0.0684576631,
-0.1600126773,
0.1471407115,
-0.2108897865,
0.5190976262,
0.2066061199,
-0.5216554403,
0.0591061115,
-0.0817668661,
-0.211480394,
-0.0654449761,
0.0227800775,
-0.1752675921,
-0.1868131608,
-0.242134437,
-0.1483109593,
0.0007693172,
0.2760917544,
0.0007844046,
-0.1791671515,
0.1693536937,
-0.0637615845,
-0.4431355,
0.2583415806,
0.3664197326,
0.2560352683,
-0.3944437802,
-0.3703474998,
-0.1771427393,
-0.3999646306,
-0.2454401553,
0.1234321892,
0.0697476119,
-0.0844347477,
0.123295635,
0.2239296585,
-0.1587905884,
-0.0421672352,
0.2827082872,
0.2458966821,
0.1164727286,
0.6250904799,
0.1420300007,
-0.2136122286,
-0.2528684139,
0.0410646498,
0.2418820262,
-0.4038006663,
0.2405817807,
0.0724533945,
0.0953910649,
0.0289486013,
0.0958366096,
0.2089329809,
0.4507568479,
-0.1446698606,
-0.5494536161,
-0.5519043803,
0.3133971989,
-0.3650939167,
0.124087289,
0.0728807151,
0.0119038075,
-0.0238186866,
0.0976892412,
-0.28880319,
-0.0474176891,
-0.3507492244,
0.059047617,
-0.0160400756,
-0.0977909192,
-0.0120488741,
-0.0036969818,
0.2137143314,
0.2369945198,
-0.1554047763,
-0.3096165955,
0.1651388109,
0.0450444296,
0.1219118908,
-0.4085615277,
-0.1422494948,
-0.1819286346,
-0.0714022219,
-0.1304441094,
-0.1986448467,
0.008819446,
-0.0786494687,
-0.1377132535,
0.2962079644,
-0.0935740173,
-0.2675310969,
0.0401338376,
-0.2507906258,
0.2575216293,
0.0955287516,
0.0755475163,
0.3524122834,
0.0259378701,
-0.2118296474,
0.0399076641,
-0.0894019678,
-0.1462774873,
0.3332198262,
-0.3912498653,
-0.3031313121,
0.0155774727,
0.4696726799,
0.2980314493,
-0.1497731656,
0.0706828982,
-0.048704084,
0.2152476311,
-0.3896288574,
-0.1665168554,
-0.0059128199,
0.0592268854,
-0.0484120473,
0.319621563,
0.2594070435,
-0.2182891369,
0.2811605036,
0.2165709138,
0.2245475501,
0.0140966773,
-0.1251776367,
0.6010935903,
-0.1161143631,
-0.2117931247,
0.4168363512,
0.2926369607,
0.5112426877,
0.260607183,
-0.107160233,
0.1381425261,
0.2688381374,
0.1799881458,
-0.0583727062,
-0.4001862705,
-0.1858806163,
0.1077911109,
0.106651552,
-0.1382402331,
0.1256544888,
-0.028010577,
0.2579559088,
0.036102403,
0.1478013247,
0.0859168768,
-0.1818324476,
0.021945972,
0.0008073226,
0.1250001639,
0.0484732389,
-0.2068496197,
0.0252287388,
-0.3063668907,
0.2777523994,
0.0016674623,
-0.0858004764,
-0.4399662018,
0.1806317121,
0.5117085576,
-0.1094689518,
-0.1741170436,
0.3285869956,
0.4242917001,
-0.0237483904,
-0.1503317952,
0.3860250413,
0.5804910064,
0.2409937233,
-0.0238723308,
0.2077869624,
-0.008149391,
-0.1690757871,
-0.0604742169,
0.090924114,
0.0978052318,
0.1481953263,
0.1423253417,
0.2106000483,
-0.1486497223,
0.2147823125,
0.3060646653,
0.1065621227,
-0.2027075738,
0.1401880234,
-0.1162667274,
-0.2585431337,
-0.0850693733,
-0.0174533166,
-0.4101571739,
0.1649528593,
0.4182043076,
-0.1200689375,
-0.0165552832,
0.1644587815,
0.0681712106,
-0.3404385149,
0.4425361156,
0.2781798542,
0.2539973557,
-0.169439137,
0.0014495254,
-0.4343958795,
0.2145671546,
-0.3108061552,
-0.1265730709,
0.188310504,
0.3053247333,
0.2690075636,
0.3885177672,
0.1593929827,
0.1569657922,
0.0389911793,
0.1325366646,
-0.3590077758,
-0.1170708835,
0.1671165824,
0.1152933165,
-0.1712614745,
-0.1662120819,
0.3117124736,
0.0199156478,
0.1010073572,
-0.2547493577,
-0.0726023465,
-0.2228350937,
0.341465801,
0.3372590542,
0.1324574053,
0.3376510739,
0.0862658694,
0.2375315726,
-0.0411621667,
-0.4088021517,
-0.008203283,
0.5613905191,
0.3403673172,
0.5017962456,
-0.1312080175,
-0.1257842779,
-0.1431040019,
0.3379403949,
-0.2317000031,
-0.0427060165,
-0.3095023334,
0.0799119323,
-0.1820609123,
-0.0606571287,
-0.3343918025,
-0.0018313313,
0.0890032947,
0.2038214207,
-0.2043202221,
-0.5852283835,
0.4368219674,
-0.1386184245,
-0.4015678465,
0.0768634081,
0.0833694115,
0.0400833637,
-0.0228594653,
-0.2593252361,
0.1473936439,
0.2686135769,
-0.0831841752,
-0.3365373015,
0.0610048436,
-0.0287154242,
0.2706462741,
-0.1467019618,
0.3606085181,
-0.1064436883,
-0.038489949,
-0.1178414822,
-0.1704111546
] |
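A minimal sketch of the chunk-and-concatenate workaround suggested in the comments above: split the big dataframe into pieces small enough to stay under pyarrow's list-array limit, build a dataset from each piece, and concatenate the results. The `emb` dataframe and the chunk size are assumptions for illustration, and the imports follow the current `datasets` package (the issue's snippets used the older `nlp` name).

```python
# Sketch of the suggested workaround: convert the dataframe chunk by chunk and
# concatenate the resulting datasets (assumes `emb` is the large dataframe).
from datasets import Dataset, concatenate_datasets

chunk_size = 1_000_000  # illustrative; keeps rows * embedding length per chunk well under ~2.1e9
parts = [
    Dataset.from_pandas(emb.iloc[start : start + chunk_size].reset_index(drop=True))
    for start in range(0, len(emb), chunk_size)
]
dataset = concatenate_datasets(parts)
```

Because the concatenated dataset keeps the per-chunk Arrow tables as separate chunks, no single list array has to hold all of the embedding values at once.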
https://github.com/huggingface/datasets/issues/611 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648 | It looks like it's going to be fixed in pyarrow 2.0.0 :)
In the meantime I suggest to chunk big dataframes to create several small datasets, and then concatenate them using [concatenate_datasets](https://huggingface.co/docs/datasets/package_reference/main_classes.html?highlight=concatenate#datasets.concatenate_datasets) | Hi, I'm trying to load a dataset from a DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
``` | 32 | ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
Hi, I'm trying to load a dataset from a pandas DataFrame, but I get the error:
```bash
---------------------------------------------------------------------------
ArrowCapacityError Traceback (most recent call last)
<ipython-input-7-146b6b495963> in <module>
----> 1 dataset = Dataset.from_pandas(emb)
~/miniconda3/envs/dev/lib/python3.7/site-packages/nlp/arrow_dataset.py in from_pandas(cls, df, features, info, split)
223 info.features = features
224 pa_table: pa.Table = pa.Table.from_pandas(
--> 225 df=df, schema=pa.schema(features.type) if features is not None else None
226 )
227 return cls(pa_table, info=info, split=split)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/table.pxi in pyarrow.lib.Table.from_pandas()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in dataframe_to_arrays(df, schema, preserve_index, nthreads, columns, safe)
591 for i, maybe_fut in enumerate(arrays):
592 if isinstance(maybe_fut, futures.Future):
--> 593 arrays[i] = maybe_fut.result()
594
595 types = [x.type for x in arrays]
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in result(self, timeout)
426 raise CancelledError()
427 elif self._state == FINISHED:
--> 428 return self.__get_result()
429
430 self._condition.wait(timeout)
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result
~/miniconda3/envs/dev/lib/python3.7/concurrent/futures/thread.py in run(self)
55
56 try:
---> 57 result = self.fn(*self.args, **self.kwargs)
58 except BaseException as exc:
59 self.future.set_exception(exc)
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/pandas_compat.py in convert_column(col, field)
557
558 try:
--> 559 result = pa.array(col, type=type_, from_pandas=True, safe=safe)
560 except (pa.ArrowInvalid,
561 pa.ArrowNotImplementedError,
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib.array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/array.pxi in pyarrow.lib._ndarray_to_array()
~/miniconda3/envs/dev/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowCapacityError: List array cannot contain more than 2147483646 child elements, have 2147483648
```
My code is:
```python
from nlp import Dataset
dataset = Dataset.from_pandas(emb)
```
It looks like it's going to be fixed in pyarrow 2.0.0 :)
In the meantime I suggest chunking big dataframes into several small datasets and then concatenating them using [concatenate_datasets](https://huggingface.co/docs/datasets/package_reference/main_classes.html?highlight=concatenate#datasets.concatenate_datasets) | [
-0.2813219428,
-0.0744089633,
-0.2195497006,
0.4159463644,
0.2823322415,
-0.0299136937,
0.4081526101,
0.2113162875,
0.3901513219,
0.0474446639,
0.0003714785,
0.3082345724,
-0.086982511,
0.0685475543,
0.0008355677,
-0.3419585526,
-0.0933910757,
0.2279391438,
-0.2755076289,
0.093866922,
-0.2262556106,
0.0042208489,
-0.2458186746,
0.2069003135,
-0.0991637111,
-0.163917169,
0.0946855843,
-0.0185277201,
-0.0302413832,
-0.4899120629,
0.3556724787,
-0.2945442796,
0.0409599096,
0.1827442199,
-0.0001102196,
0.0405714065,
0.370965749,
0.0665660948,
-0.2005662173,
-0.1553710997,
-0.3287507892,
-0.4413963556,
0.4300813675,
-0.0768522024,
0.2361875176,
-0.2503987551,
-0.2843055725,
-0.3228807449,
0.3386787474,
0.0333644226,
0.2106782198,
0.2027077675,
0.4377657175,
0.1210990548,
0.0229465794,
-0.0303454809,
-0.1591518521,
0.4214887321,
0.1034791246,
-0.2736417353,
-0.0080705397,
0.0098081641,
-0.142464608,
0.3750423789,
0.1488452852,
-0.0841917843,
-0.1104777753,
-0.3776780069,
-0.2088724971,
0.2992430031,
0.6146752834,
-0.2124878317,
-0.3339967132,
-0.0728104264,
-0.0512901768,
-0.3944201469,
0.0467975624,
0.2284303606,
-0.0249730758,
0.0557536148,
0.0425887257,
0.0693220049,
-0.2296036184,
0.0915906951,
-0.0219255909,
0.1695369482,
-0.1137780398,
0.2228228897,
0.3527330458,
-0.1476605386,
0.1830547005,
0.1812526733,
0.0453063734,
0.1507356614,
-0.4711748064,
-0.0076135397,
-0.2046746016,
-0.4235939085,
0.0187760666,
0.3424941897,
0.2327031344,
0.0266378634,
0.0264476873,
0.2103407085,
0.1510002017,
0.0093807839,
-0.1516857147,
0.0805709958,
-0.2187975049,
-0.0026093572,
0.1261628121,
-0.0250722412,
-0.0848935694,
-0.1750026643,
0.1665081978,
-0.1399343312,
0.3136581182,
-0.1227994785,
-0.4663729668,
0.2898981869,
-0.4942124486,
-0.1571941674,
0.1066770703,
0.1941331327,
0.1898524612,
0.3169755936,
-0.0033450872,
0.1713691652,
-0.0771829039,
0.1586163938,
-0.1784181297,
0.2248278409,
0.0160618797,
0.1040192693,
-0.0073826239,
0.058732558,
0.161934495,
0.1394264251,
0.3225093782,
-0.1406375021,
0.2424961329,
-0.4746842384,
-0.0491966493,
0.4604403675,
-0.0665067658,
0.1373984516,
0.4816640317,
0.0661067441,
-0.0525114611,
0.2874205112,
-0.0509188771,
-0.1669092178,
-0.3079305887,
0.20349738,
0.086414665,
0.1019393504,
-0.5072072148,
-0.1107440144,
0.2790832818,
0.0325419605,
0.2052159607,
-0.1718197912,
-0.0322396904,
-0.2370043993,
-0.0113936029,
0.2493878007,
-0.6265667081,
0.1540752649,
-0.0097451583,
-0.004250763,
0.2222196907,
0.3802013695,
-0.3712419271,
0.0840109661,
-0.1422455311,
0.1392557025,
-0.0864753574,
0.0490297154,
-0.488214761,
0.1294536442,
0.030543074,
-0.0098823607,
0.1360376477,
0.1243114397,
0.0164617784,
0.0188370906,
-0.0198176689,
0.2659707069,
0.1596461385,
-0.1190211028,
-0.0785414279,
-0.0919745266,
0.1880331337,
0.0425337404,
-0.1309632957,
-0.111163646,
-0.0377984941,
-0.0583446473,
0.0506684333,
-0.0628660619,
-0.1312292665,
0.3077424169,
0.1022797003,
0.0275660828,
-0.1993942857,
-0.1144552827,
-0.2583914399,
-0.0028339475,
0.0060147718,
-0.2196476161,
-0.3686643243,
0.032235235,
-0.3015364707,
0.4142712057,
-0.0037014037,
0.1467633247,
0.1555080712,
0.0351495333,
-0.1392640322,
-0.340544343,
-0.1125304103,
0.382986784,
-0.2520795763,
0.1174004003,
-0.2488704026,
0.6389963031,
-0.1225097924,
-0.2358955443,
0.1020291746,
-0.0589133985,
0.0078238137,
-0.0617900826,
-0.0541579686,
0.3003413677,
0.2589097619,
-0.0539925434,
-0.1232166812,
-0.171292901,
0.1711659431,
-0.4476627111,
-0.0194847696,
0.3259801269,
0.249661386,
0.0040196702,
0.2005783617,
0.2582695186,
-0.2305360436,
0.1718456149,
0.0606856048,
0.0135375261,
0.2570947111,
0.2372242659,
-0.0010136068,
-0.1002516598,
-0.0585768893,
0.1710239649,
0.2922353148,
0.0836868957,
-0.3078922927,
-0.0896601304,
0.0380945317,
0.0779546052,
0.1145799235,
0.1024589986,
-0.101138711,
-0.1450705975,
0.2871287465,
-0.1594516784,
0.3604677916,
0.2623209655,
-0.3645986915,
-0.3160840273,
-0.1206736118,
0.0692810565,
0.3395499587,
0.2408053428,
0.4563709795,
-0.0912679881,
0.0961696133,
-0.0320505314,
-0.2791268826,
-0.5238829851,
0.062461026,
0.4196492434,
-0.3285613656,
0.2295598686,
-0.1541726142,
-0.1055466011,
0.2673345506,
-0.6564481258,
0.0172791556,
-0.1459004134,
-0.332672894,
0.3113352358,
0.0069781691,
0.0980789661,
-0.3459627628,
-0.1520458907,
0.3765097558,
0.0735323951,
0.0101615116,
-0.0400558449,
0.1425976455,
0.1107220054,
0.270267278,
0.0741860569,
0.2685508132,
0.3350816965,
0.0054361597,
-0.1947272569,
-0.0174661875,
-0.0798838437,
-0.1669339538,
0.0517802164,
0.0623220652,
0.3742078245,
-0.5855014324,
-0.5999204516,
0.1264113039,
0.0736583322,
-0.1970531791,
0.1744910181,
0.070136629,
0.1771166027,
0.1904893219,
-0.1944370121,
-0.3144041598,
-0.3713177443,
0.1869554371,
0.2236397862,
0.3277162611,
0.315998435,
0.2656183839,
0.123975426,
0.2631947398,
-0.1057829708,
-0.1982409656,
0.1272566766,
0.175486654,
0.0087331682,
-0.2702608407,
-0.1913998276,
-0.2674703002,
0.2551689744,
0.0270456299,
-0.3893483877,
-0.2998654246,
-0.3612284362,
-0.0803876296,
-0.3379432559,
-0.0242437907,
0.270457685,
0.1402749717,
-0.2321831584,
0.0587794483,
-0.1001929417,
-0.0318210945,
-0.1266248822,
0.1443428993,
-0.1426803321,
0.5946799517,
-0.1982655078,
0.542355299,
0.1506068558,
-0.0378130116,
0.4440322518,
-0.0917439014,
0.1699854136,
-0.362095207,
-0.4971081913,
-0.0669030994,
-0.1251890063,
0.0664183646,
-0.1115146279,
-0.2821608782,
-0.130408138,
0.1161025017,
-0.0095338672,
-0.1359713972,
0.0811217129,
0.1733745784,
-0.0260824952,
0.1395893693,
-0.117307961,
-0.048691716,
-0.1003915817,
-0.0609221198,
-0.2855703831,
-0.2101400197,
0.2482716292,
-0.1466302425,
-0.2170100212,
-0.267565906,
-0.0345851518,
0.3139544129,
0.3056480289,
0.5451366305,
0.1436626613,
-0.0632699057,
0.0515470915,
0.0490170047,
0.3031838536,
0.1406441927,
-0.2015918046,
0.2815238237,
-0.1118087247,
-0.5275211334,
0.006812349,
-0.2888288796,
0.2861070335,
-0.151114881,
-0.1364953369,
-0.2830282748,
-0.0095329694,
0.1217481866,
0.3055150211,
-0.0608975738,
-0.1197309569,
-0.3016712666,
-0.0870328769,
-0.5179741979,
0.1697145253,
0.146309495,
0.2763126194,
0.0918484032,
-0.2823674083,
-0.2318424731,
-0.5319344401,
0.0363813341,
0.066778034,
0.3335449696,
0.1646634638,
0.181921795,
-0.2661644816,
-0.1176079512,
0.5514031053,
0.3901550472,
0.2355771959,
-0.0744232237,
-0.0169190187,
0.0978518277,
0.2250101864,
0.0370273143,
-0.0905062258,
0.116483584,
-0.2906727493,
0.1484724283,
0.1189412028,
-0.0457757376,
0.0468275025,
-0.0387607925,
-0.223928079,
-0.2735212743,
0.4360411167,
0.0355574712,
0.1602489054,
0.4475948215,
0.2731459141,
-0.014616156,
0.1969363689,
0.235817939,
0.8210724592,
-0.3222985864,
0.0012439098,
0.3847112656,
0.0425248593,
0.450419277,
0.3941968083,
0.1207177117,
-0.351360023,
-0.2400812507,
-0.1093840078,
-0.2209489197,
0.19301112,
0.132399708,
-0.2190533876,
0.1336610615,
-0.1972189546,
0.1078304574,
0.0800063089,
0.096784994,
-0.0671038032,
-0.3313188553,
-0.5120388865,
0.115511857,
-0.245433256,
-0.184278816,
-0.1543579549,
-0.0854888335,
-0.0121684596,
-0.0684576631,
-0.1600126773,
0.1471407115,
-0.2108897865,
0.5190976262,
0.2066061199,
-0.5216554403,
0.0591061115,
-0.0817668661,
-0.211480394,
-0.0654449761,
0.0227800775,
-0.1752675921,
-0.1868131608,
-0.242134437,
-0.1483109593,
0.0007693172,
0.2760917544,
0.0007844046,
-0.1791671515,
0.1693536937,
-0.0637615845,
-0.4431355,
0.2583415806,
0.3664197326,
0.2560352683,
-0.3944437802,
-0.3703474998,
-0.1771427393,
-0.3999646306,
-0.2454401553,
0.1234321892,
0.0697476119,
-0.0844347477,
0.123295635,
0.2239296585,
-0.1587905884,
-0.0421672352,
0.2827082872,
0.2458966821,
0.1164727286,
0.6250904799,
0.1420300007,
-0.2136122286,
-0.2528684139,
0.0410646498,
0.2418820262,
-0.4038006663,
0.2405817807,
0.0724533945,
0.0953910649,
0.0289486013,
0.0958366096,
0.2089329809,
0.4507568479,
-0.1446698606,
-0.5494536161,
-0.5519043803,
0.3133971989,
-0.3650939167,
0.124087289,
0.0728807151,
0.0119038075,
-0.0238186866,
0.0976892412,
-0.28880319,
-0.0474176891,
-0.3507492244,
0.059047617,
-0.0160400756,
-0.0977909192,
-0.0120488741,
-0.0036969818,
0.2137143314,
0.2369945198,
-0.1554047763,
-0.3096165955,
0.1651388109,
0.0450444296,
0.1219118908,
-0.4085615277,
-0.1422494948,
-0.1819286346,
-0.0714022219,
-0.1304441094,
-0.1986448467,
0.008819446,
-0.0786494687,
-0.1377132535,
0.2962079644,
-0.0935740173,
-0.2675310969,
0.0401338376,
-0.2507906258,
0.2575216293,
0.0955287516,
0.0755475163,
0.3524122834,
0.0259378701,
-0.2118296474,
0.0399076641,
-0.0894019678,
-0.1462774873,
0.3332198262,
-0.3912498653,
-0.3031313121,
0.0155774727,
0.4696726799,
0.2980314493,
-0.1497731656,
0.0706828982,
-0.048704084,
0.2152476311,
-0.3896288574,
-0.1665168554,
-0.0059128199,
0.0592268854,
-0.0484120473,
0.319621563,
0.2594070435,
-0.2182891369,
0.2811605036,
0.2165709138,
0.2245475501,
0.0140966773,
-0.1251776367,
0.6010935903,
-0.1161143631,
-0.2117931247,
0.4168363512,
0.2926369607,
0.5112426877,
0.260607183,
-0.107160233,
0.1381425261,
0.2688381374,
0.1799881458,
-0.0583727062,
-0.4001862705,
-0.1858806163,
0.1077911109,
0.106651552,
-0.1382402331,
0.1256544888,
-0.028010577,
0.2579559088,
0.036102403,
0.1478013247,
0.0859168768,
-0.1818324476,
0.021945972,
0.0008073226,
0.1250001639,
0.0484732389,
-0.2068496197,
0.0252287388,
-0.3063668907,
0.2777523994,
0.0016674623,
-0.0858004764,
-0.4399662018,
0.1806317121,
0.5117085576,
-0.1094689518,
-0.1741170436,
0.3285869956,
0.4242917001,
-0.0237483904,
-0.1503317952,
0.3860250413,
0.5804910064,
0.2409937233,
-0.0238723308,
0.2077869624,
-0.008149391,
-0.1690757871,
-0.0604742169,
0.090924114,
0.0978052318,
0.1481953263,
0.1423253417,
0.2106000483,
-0.1486497223,
0.2147823125,
0.3060646653,
0.1065621227,
-0.2027075738,
0.1401880234,
-0.1162667274,
-0.2585431337,
-0.0850693733,
-0.0174533166,
-0.4101571739,
0.1649528593,
0.4182043076,
-0.1200689375,
-0.0165552832,
0.1644587815,
0.0681712106,
-0.3404385149,
0.4425361156,
0.2781798542,
0.2539973557,
-0.169439137,
0.0014495254,
-0.4343958795,
0.2145671546,
-0.3108061552,
-0.1265730709,
0.188310504,
0.3053247333,
0.2690075636,
0.3885177672,
0.1593929827,
0.1569657922,
0.0389911793,
0.1325366646,
-0.3590077758,
-0.1170708835,
0.1671165824,
0.1152933165,
-0.1712614745,
-0.1662120819,
0.3117124736,
0.0199156478,
0.1010073572,
-0.2547493577,
-0.0726023465,
-0.2228350937,
0.341465801,
0.3372590542,
0.1324574053,
0.3376510739,
0.0862658694,
0.2375315726,
-0.0411621667,
-0.4088021517,
-0.008203283,
0.5613905191,
0.3403673172,
0.5017962456,
-0.1312080175,
-0.1257842779,
-0.1431040019,
0.3379403949,
-0.2317000031,
-0.0427060165,
-0.3095023334,
0.0799119323,
-0.1820609123,
-0.0606571287,
-0.3343918025,
-0.0018313313,
0.0890032947,
0.2038214207,
-0.2043202221,
-0.5852283835,
0.4368219674,
-0.1386184245,
-0.4015678465,
0.0768634081,
0.0833694115,
0.0400833637,
-0.0228594653,
-0.2593252361,
0.1473936439,
0.2686135769,
-0.0831841752,
-0.3365373015,
0.0610048436,
-0.0287154242,
0.2706462741,
-0.1467019618,
0.3606085181,
-0.1064436883,
-0.038489949,
-0.1178414822,
-0.1704111546
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Could you try
```python
load_dataset('text', data_files='test.txt',cache_dir="./", split="train")
```
?
`load_dataset` returns a dictionary by default, like {"train": your_dataset} | I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample file where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 18 | Load text file for RoBERTa pre-training.
I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample file where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Could you try
```python
load_dataset('text', data_files='test.txt',cache_dir="./", split="train")
```
?
`load_dataset` returns a dictionary by default, like {"train": your_dataset} | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hi @lhoestq
Thanks for your suggestion.
I tried
```
dataset = load_dataset('text', data_files='test.txt',cache_dir="./", split="train")
print(dataset)
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But it still doesn't work and I got this error:
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-7-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in __getitem__(self, key)
1069 format_columns=self._format_columns,
1070 output_all_columns=self._output_all_columns,
-> 1071 format_kwargs=self._format_kwargs,
1072 )
1073
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in _getitem(self, key, format_type, format_columns, output_all_columns, format_kwargs)
1056 format_columns=format_columns,
1057 output_all_columns=output_all_columns,
-> 1058 format_kwargs=format_kwargs,
1059 )
1060 return outputs
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in _convert_outputs(self, outputs, format_type, format_columns, output_all_columns, format_kwargs)
872 continue
873 if format_columns is None or k in format_columns:
--> 874 v = map_nested(command, v, **map_nested_kwargs)
875 output_dict[k] = v
876 return output_dict
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/utils/py_utils.py in map_nested(function, data_struct, dict_only, map_list, map_tuple, map_numpy, num_proc, types)
214 # Singleton
215 if not isinstance(data_struct, dict) and not isinstance(data_struct, types):
--> 216 return function(data_struct)
217
218 disable_tqdm = bool(logger.getEffectiveLevel() > INFO)
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in command(x)
833 if x.dtype == np.object: # pytorch tensors cannot be instantied from an array of objects
834 return [map_nested(command, i, **map_nested_kwargs) for i in x]
--> 835 return torch.tensor(x, **format_kwargs)
836
837 elif format_type == "tensorflow":
TypeError: new(): invalid data type 'str'
```
I found that the type can only be one of ['numpy', 'torch', 'tensorflow', 'pandas'], so how can I deal with the string type? | I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample file where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 312 | Load text file for RoBERTa pre-training.
I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample file where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hi @lhoestq
Thanks for your suggestion.
I tried
```
dataset = load_dataset('text', data_files='test.txt',cache_dir="./", split="train")
print(dataset)
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But it still doesn't work and I got this error:
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-7-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in __getitem__(self, key)
1069 format_columns=self._format_columns,
1070 output_all_columns=self._output_all_columns,
-> 1071 format_kwargs=self._format_kwargs,
1072 )
1073
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in _getitem(self, key, format_type, format_columns, output_all_columns, format_kwargs)
1056 format_columns=format_columns,
1057 output_all_columns=output_all_columns,
-> 1058 format_kwargs=format_kwargs,
1059 )
1060 return outputs
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in _convert_outputs(self, outputs, format_type, format_columns, output_all_columns, format_kwargs)
872 continue
873 if format_columns is None or k in format_columns:
--> 874 v = map_nested(command, v, **map_nested_kwargs)
875 output_dict[k] = v
876 return output_dict
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/utils/py_utils.py in map_nested(function, data_struct, dict_only, map_list, map_tuple, map_numpy, num_proc, types)
214 # Singleton
215 if not isinstance(data_struct, dict) and not isinstance(data_struct, types):
--> 216 return function(data_struct)
217
218 disable_tqdm = bool(logger.getEffectiveLevel() > INFO)
/Library/Python/3.7/site-packages/datasets-0.4.0-py3.7.egg/datasets/arrow_dataset.py in command(x)
833 if x.dtype == np.object: # pytorch tensors cannot be instantied from an array of objects
834 return [map_nested(command, i, **map_nested_kwargs) for i in x]
--> 835 return torch.tensor(x, **format_kwargs)
836
837 elif format_type == "tensorflow":
TypeError: new(): invalid data type 'str'
```
I found that the type can only be one of ['numpy', 'torch', 'tensorflow', 'pandas'], so how can I deal with the string type? | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | You need to tokenize the string inputs to convert them into integers before you can feed them to a PyTorch dataloader.
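For example, a minimal sketch of that step on top of the `text` loading script used in this thread (the `roberta-base` checkpoint and `max_length=128` are illustrative assumptions, not values from the issue):
```python
# Minimal sketch: tokenize the "text" column into integer ids, then build the dataloader.
import torch
from datasets import load_dataset
from transformers import AutoTokenizer

dataset = load_dataset("text", data_files="test.txt", split="train")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # assumed checkpoint

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)
dataset.set_format(type="torch", columns=["input_ids", "attention_mask"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
batch = next(iter(dataloader))  # now yields integer tensors instead of raw strings
```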
You can read the quicktour of the datasets or the transformers libraries to learn more about that:
- transformers: https://huggingface.co/transformers/quicktour.html
- dataset: https://huggingface.co/docs/datasets/quicktour.html | I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample file where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
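A minimal sketch of one possible modification, based only on the observation above that the rows live under the `"train"` split, is shown below. This is a sketch rather than the thread's confirmed answer: it assumes the same `test.txt` file, only gets past the `KeyError`, and still yields raw strings rather than model-ready tensors.
```python
import torch
from datasets import load_dataset

# Asking for the split directly returns a Dataset, not a DatasetDict,
# so dataset[0] is a real row instead of raising KeyError: 0.
dataset = load_dataset("text", data_files="test.txt", cache_dir="./", split="train")

# The rows are still raw strings; the default collate just groups them,
# so each batch is a dict like {"text": [list of lines]}.
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
batch = next(iter(dataloader))
```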
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 44 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
You need to tokenize the string inputs to convert them into integers before you can feed them to a PyTorch dataloader.
You can read the quicktour of the datasets or the transformers libraries to know more about that:
- transformers: https://huggingface.co/transformers/quicktour.html
- dataset: https://huggingface.co/docs/datasets/quicktour.html | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
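The reply in the record above stops at "tokenize the string inputs" and points to the quicktour docs. A rough sketch of that step is given below; the `roberta-base` checkpoint, the `max_length` of 128 and the reuse of the same `test.txt` file are illustrative assumptions, not details stated in the issue.
```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # assumed checkpoint
dataset = load_dataset("text", data_files="test.txt", split="train")

def tokenize(batch):
    # turn the raw lines into integer ids and attention masks
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)
dataset.set_format(type="torch", columns=["input_ids", "attention_mask"])

dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
batch = next(iter(dataloader))  # dict of integer tensors instead of strings
```
The next record in this dump walks through essentially the same recipe for a much larger (11GB) corpus split into shards.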
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hey @chiyuzhang94, I was also having trouble loading a large text file (11GB).
But I finally got it working. This is what I did after looking into the documentation.
1. split the whole dataset file into smaller files
```bash
mkdir ./shards
split -a 4 -l 256000 -d full_raw_corpus.txt ./shards/shard_
```
2. Pass paths of small data files to `load_dataset`
```python
files = glob.glob('shards/*')
from datasets import load_dataset
dataset = load_dataset('text', data_files=files, split='train')
```
(Passing the whole dataset file (11GB) directly to `load_dataset` resulted in a RAM issue.)
3. Tokenization
```python
def encode(examples):
return tokenizer(examples['text'], truncation=True, padding='max_length')
dataset = dataset.map(encode, batched=True)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
```
Now you can pass `dataset` to `Trainer` or `pytorch DataLoader`
```python
dataloader = torch.utils.data.DataLoader(dataset, batch_size=4)
next(iter(dataloader))
```
Hope this helps
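The comment above spells out the `DataLoader` path; the `Trainer` path it mentions is not shown, so here is a hedged sketch of what it might look like for from-scratch RoBERTa masked-language-model pre-training with a recent `transformers` version. The model size, collator settings, training arguments and output path are illustrative assumptions, not the configuration used in this thread.
```python
import glob

from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling,
    RobertaConfig,
    RobertaForMaskedLM,
    RobertaTokenizerFast,
    Trainer,
    TrainingArguments,
)

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")  # assumed tokenizer

# Rebuild the tokenized dataset from the sharded files, as in the steps above.
files = glob.glob("shards/*")
dataset = load_dataset("text", data_files=files, split="train")
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)
dataset.set_format(type="torch", columns=["input_ids", "attention_mask"])

model = RobertaForMaskedLM(RobertaConfig())  # freshly initialised, i.e. trained from scratch

# Dynamically masks tokens so the Trainer optimises a masked-language-model objective.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

training_args = TrainingArguments(
    output_dir="./roberta-from-scratch",  # placeholder output path
    per_device_train_batch_size=4,
    num_train_epochs=1,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=dataset,
)
trainer.train()
```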
| I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 125 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hey @chiyuzhang94, I was also having trouble loading a large text file (11GB).
But I finally got it working. This is what I did after looking into the documentation.
1. split the whole dataset file into smaller files
```bash
mkdir ./shards
split -a 4 -l 256000 -d full_raw_corpus.txt ./shards/shard_
```
2. Pass paths of small data files to `load_dataset`
```python
files = glob.glob('shards/*')
from datasets import load_dataset
dataset = load_dataset('text', data_files=files, split='train')
```
(Passing the whole dataset file (11GB) directly to `load_dataset` resulted in a RAM issue.)
3. Tokenization
```python
def encode(examples):
return tokenizer(examples['text'], truncation=True, padding='max_length')
dataset = dataset.map(encode, batched=True)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
```
Now you can pass `dataset` to `Trainer` or `pytorch DataLoader`
```python
dataloader = torch.utils.data.DataLoader(dataset, batch_size=4)
next(iter(dataloader))
```
Hope this helps
| [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Thanks, @thomwolf and @sipah00 ,
I tried to implement your suggestions in my scripts.
Now I am facing a connection time-out error. I am using my local file, so I have no idea why the module requests the S3 database.
The log is:
```
Traceback (most recent call last):
File "/home/.local/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
raise err
File "/home/.local/lib/python3.6/site-packages/urllib3/util/connection.py", line 74, in create_connection
timeout=timeout
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 720, in urlopen
sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out
Traceback (most recent call last):
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 672, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "/home/.local/lib/python3.6/site-packages/urllib3/util/retry.py", line 436, in increment
chunked=chunked,
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 376, in _make_request
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='s3.amazonaws.com', port=443): Max retries exceeded with url: /datasets.huggingface.co/datasets/datasets/text/text.py (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fff401e0e48>: Failed to establish a new connection: [Errno 110] Connection timed out',))
Traceback (most recent call last):
File "/scratch/roberta_emohash/run_language_modeling.py", line 1019, in <module>
main()
File "/scratch/roberta_emohash/run_language_modeling.py", line 962, in main
train_dataset = load_and_cache_examples(args, tokenizer, evaluate=False)
File "/scratch/roberta_emohash/run_language_modeling.py", line 177, in load_and_cache_examples
return HG_Datasets(tokenizer, file_path, args)
File "/scratch/roberta_emohash/run_language_modeling.py", line 117, in HG_Datasets
dataset = load_dataset('text', data_files=files, cache_dir = args.data_cache_dir, split="train")
File "/arc/project/evn_py36/datasets/datasets/src/datasets/load.py", line 590, in load_dataset
self._validate_conn(conn)
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 994, in _validate_conn
conn.connect()
File "/home/.local/lib/python3.6/site-packages/urllib3/connection.py", line 300, in connect
conn = self._new_conn()
File "/home/.local/lib/python3.6/site-packages/urllib3/connection.py", line 169, in _new_conn
self, "Failed to establish a new connection: %s" % e
urllib3.exceptions.NewConnectionError: <urllib3.connection.VerifiedHTTPSConnection object at 0x7fff401e0da0>: Failed to establish a new connection: [Errno 110] Connection timed out
```
Do you have any experience on this issue? | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 248 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Thanks, @thomwolf and @sipah00 ,
I tried to implement your suggestions in my scripts.
Now I am facing a connection time-out error. I am using my local file, so I have no idea why the module requests the S3 database.
The log is:
```
Traceback (most recent call last):
File "/home/.local/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
raise err
File "/home/.local/lib/python3.6/site-packages/urllib3/util/connection.py", line 74, in create_connection
timeout=timeout
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 720, in urlopen
sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out
Traceback (most recent call last):
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 672, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "/home/.local/lib/python3.6/site-packages/urllib3/util/retry.py", line 436, in increment
chunked=chunked,
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 376, in _make_request
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='s3.amazonaws.com', port=443): Max retries exceeded with url: /datasets.huggingface.co/datasets/datasets/text/text.py (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fff401e0e48>: Failed to establish a new connection: [Errno 110] Connection timed out',))
Traceback (most recent call last):
File "/scratch/roberta_emohash/run_language_modeling.py", line 1019, in <module>
main()
File "/scratch/roberta_emohash/run_language_modeling.py", line 962, in main
train_dataset = load_and_cache_examples(args, tokenizer, evaluate=False)
File "/scratch/roberta_emohash/run_language_modeling.py", line 177, in load_and_cache_examples
return HG_Datasets(tokenizer, file_path, args)
File "/scratch/roberta_emohash/run_language_modeling.py", line 117, in HG_Datasets
dataset = load_dataset('text', data_files=files, cache_dir = args.data_cache_dir, split="train")
File "/arc/project/evn_py36/datasets/datasets/src/datasets/load.py", line 590, in load_dataset
self._validate_conn(conn)
File "/home/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 994, in _validate_conn
conn.connect()
File "/home/.local/lib/python3.6/site-packages/urllib3/connection.py", line 300, in connect
conn = self._new_conn()
File "/home/.local/lib/python3.6/site-packages/urllib3/connection.py", line 169, in _new_conn
self, "Failed to establish a new connection: %s" % e
urllib3.exceptions.NewConnectionError: <urllib3.connection.VerifiedHTTPSConnection object at 0x7fff401e0da0>: Failed to establish a new connection: [Errno 110] Connection timed out
```
Do you have any experience on this issue? | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
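The time-out in the record above comes from `load_dataset('text', ...)` trying to fetch the `text.py` loading script from the remote storage when it is not already cached. On a machine with no outbound network access, one possible workaround is to copy that loading script over manually and point `load_dataset` at the local file. This is a sketch under that assumption, not a fix confirmed in the thread; the script path is a placeholder.
```python
import glob

from datasets import load_dataset

# "/path/to/text.py" is a placeholder: a local copy of the `text` dataset script,
# fetched once on a machine that has internet access and then transferred over.
files = glob.glob("shards/*")
dataset = load_dataset("/path/to/text.py", data_files=files, split="train")
```
Once the loading script and the data files are both local, no call to s3.amazonaws.com should be needed.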
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | I noticed this is because I use a cloud server that does not allow connections from our standard compute nodes to outside resources.
For the `datasets` package, it seems that if the loading script is not already cached in the library it will attempt to connect to an AWS resource to download the dataset loading script.
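A minimal sketch of the workaround that comes up later in this thread (copying the `text.py` loading script onto the cluster and pointing `load_dataset` at the local copy); the script path below is illustrative:
```python
from datasets import load_dataset

# Assumes text.py was downloaded manually beforehand (e.g. from the datasets
# repository) and copied to a path the compute nodes can read; this avoids
# downloading the loading script from AWS at load time.
dataset = load_dataset(
    "/path/to/local/text.py",  # illustrative local copy of the loading script
    data_files="test.txt",
    cache_dir="./cache",
)
```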
I am wondering why the package works in this way. Do you have any suggestions to solve this issue? | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa from scratch using transformers, but I ran into OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a small sample file in which each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` prints a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
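As an illustration, a minimal sketch that selects the split up front so integer indexing works directly (the file name and column match the snippet above):
```python
from datasets import load_dataset

# Selecting the split returns a Dataset rather than a DatasetDict, so
# dataset[0] is a dict like {'text': '...'} and a torch DataLoader can
# fetch items by integer index instead of raising KeyError: 0.
dataset = load_dataset("text", data_files="test.txt", split="train")
print(dataset[0])
```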
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 76 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa from scratch using transformers, but I ran into OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a small sample file in which each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` prints a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
I noticed this is because I use a cloud server which does not allow connections from our standard compute nodes to outside resources.
For the `datasets` package, it seems that if the loading script is not already cached in the library it will attempt to connect to an AWS resource to download the dataset loading script.
I am wondering why the package works in this way. Do you have any suggestions to solve this issue? | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | I solved the above issue by downloading text.py manually and passing the path to the `load_dataset` function.
Now, I have a new issue with the Read-only file system.
The error is:
```
I0916 22:14:38.453380 140737353971520 filelock.py:274] Lock 140734268996072 acquired on /scratch/chiyuzh/roberta/text.py.lock
Found main folder for dataset /scratch/chiyuzh/roberta/text.py at /home/chiyuzh/.cache/huggingface/modules/datasets_modules/datasets/text
Creating specific version folder for dataset /scratch/chiyuzh/roberta/text.py at /home/chiyuzh/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014
I0916 22:14:38.530371 140737353971520 filelock.py:318] Lock 140734268996072 released on /scratch/chiyuzh/roberta/text.py.lock
Traceback (most recent call last):
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 1019, in <module>
main()
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 962, in main
train_dataset = load_and_cache_examples(args, tokenizer, evaluate=False)
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 177, in load_and_cache_examples
return HG_Datasets(tokenizer, file_path, args)
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 117, in HG_Datasets
dataset = load_dataset('/scratch/chiyuzh/roberta/text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")
File "/arc/project/chiyuzh/evn_py36/datasets/src/datasets/load.py", line 590, in load_dataset
path, script_version=script_version, download_config=download_config, download_mode=download_mode, dataset=True
File "/arc/project/chiyuzh/evn_py36/datasets/src/datasets/load.py", line 385, in prepare_module
os.makedirs(hash_folder_path)
File "/project/chiyuzh/evn_py36/lib/python3.6/os.py", line 220, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/home/chiyuzh/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014'
```
I installed datasets at /project/chiyuzh/evn_py36/datasets/src, which is a writable directory.
I also tried changing the environment variables to point to a writable directory:
`export HF_MODULES_PATH=/project/chiyuzh/evn_py36/datasets/cache_dir/`
`export HF_DATASETS_CACHE=/project/chiyuzh/evn_py36/datasets/cache_dir/`
In my scripts, I also changed to:
`dataset = load_dataset('/scratch/chiyuzh/roberta/text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")`
where `data_cache_dir = $TMPDIR/data/` is also a writable directory.
But it still tries to create a directory under /home/chiyuzh/.cache/huggingface/modules/.
Do you have any idea about this issue? @thomwolf
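For reference, a minimal sketch of one thing worth checking here, under the assumption that these locations are read when `datasets` is imported (so they have to be set before the import, e.g. exported in the job script or set via `os.environ` first):
```python
import os

# Assumption: the cache locations are picked up at import time, so they are
# set here before datasets is imported (the paths mirror the exports above).
os.environ["HF_MODULES_PATH"] = "/project/chiyuzh/evn_py36/datasets/cache_dir/"
os.environ["HF_DATASETS_CACHE"] = "/project/chiyuzh/evn_py36/datasets/cache_dir/"

from datasets import load_dataset  # imported only after the variables are set
```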
| I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa from scratch using transformers, but I ran into OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a small sample file in which each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` prints a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 214 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa from scratch using transformers, but I ran into OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a small sample file in which each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` prints a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
I solved the above issue by downloading text.py manually and passing the path to the `load_dataset` function.
Now, I have a new issue with the Read-only file system.
The error is:
```
I0916 22:14:38.453380 140737353971520 filelock.py:274] Lock 140734268996072 acquired on /scratch/chiyuzh/roberta/text.py.lock
Found main folder for dataset /scratch/chiyuzh/roberta/text.py at /home/chiyuzh/.cache/huggingface/modules/datasets_modules/datasets/text
Creating specific version folder for dataset /scratch/chiyuzh/roberta/text.py at /home/chiyuzh/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014
I0916 22:14:38.530371 140737353971520 filelock.py:318] Lock 140734268996072 released on /scratch/chiyuzh/roberta/text.py.lock
Traceback (most recent call last):
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 1019, in <module>
main()
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 962, in main
train_dataset = load_and_cache_examples(args, tokenizer, evaluate=False)
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 177, in load_and_cache_examples
return HG_Datasets(tokenizer, file_path, args)
File "/scratch/chiyuzh/roberta/run_language_modeling_hg.py", line 117, in HG_Datasets
dataset = load_dataset('/scratch/chiyuzh/roberta/text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")
File "/arc/project/chiyuzh/evn_py36/datasets/src/datasets/load.py", line 590, in load_dataset
path, script_version=script_version, download_config=download_config, download_mode=download_mode, dataset=True
File "/arc/project/chiyuzh/evn_py36/datasets/src/datasets/load.py", line 385, in prepare_module
os.makedirs(hash_folder_path)
File "/project/chiyuzh/evn_py36/lib/python3.6/os.py", line 220, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/home/chiyuzh/.cache/huggingface/modules/datasets_modules/datasets/text/512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014'
```
I installed datasets at /project/chiyuzh/evn_py36/datasets/src, which is a writable directory.
I also tried changing the environment variables to point to a writable directory:
`export HF_MODULES_PATH=/project/chiyuzh/evn_py36/datasets/cache_dir/`
`export HF_DATASETS_CACHE=/project/chiyuzh/evn_py36/datasets/cache_dir/`
In my scripts, I also changed to:
`dataset = load_dataset('/scratch/chiyuzh/roberta/text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")`
where `data_cache_dir = $TMPDIR/data/` is also a writable directory.
But it still tries to create a directory under /home/chiyuzh/.cache/huggingface/modules/.
Do you have any idea about this issue? @thomwolf
| [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > Hey @chiyuzhang94, I was also having trouble loading a large text file (11GB).
> But I finally got it working. This is what I did after looking into the documentation.
>
> 1. split the whole dataset file into smaller files
>
> ```shell
> mkdir ./shards
> split -a 4 -l 256000 -d full_raw_corpus.txt ./shards/shard_
> ```
>
> 2. Pass paths of small data files to `load_dataset`
>
> ```python
> files = glob.glob('shards/*')
> from datasets import load_dataset
> dataset = load_dataset('text', data_files=files, split='train')
> ```
>
> (Passing the whole dataset file (11GB) directly to `load_dataset` was resulting in RAM issues)
>
> 3. Tokenization
>
> ```python
> def encode(examples):
> return tokenizer(examples['text'], truncation=True, padding='max_length')
> dataset = dataset.map(encode, batched=True)
> dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
> ```
>
> Now you can pass `dataset` to `Trainer` or `pytorch DataLoader`
>
> ```python
> dataloader = torch.utils.data.DataLoader(dataset, batch_size=4)
> next(iter(dataloader))
> ```
>
> Hope this helps
When I run `dataset = dataset.map(encode, batched=True)`, I encounter a problem like this:
> Testing the mapped function outputs
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in map
for k, dataset in self.items()
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in <dictcomp>
for k, dataset in self.items()
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1224, in map
update_data = does_function_return_dict(test_inputs, test_indices)
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1195, in does_function_return_dict
function(*fn_args, indices, **fn_kwargs) if with_indices else function(*fn_args, **fn_kwargs)
File "<stdin>", line 3, in encode
TypeError: __init__() takes 1 positional argument but 2 were given | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
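As a side note on that last `TypeError`: it typically appears when the object being called accepts no positional arguments, e.g. when `tokenizer` is still a class (or was never instantiated) rather than a tokenizer instance. A minimal sketch of the intended setup, with the model name only as an illustrative assumption:
```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Instantiate the tokenizer first; calling a bare class here is one way to get
# "TypeError: __init__() takes 1 positional argument but 2 were given".
tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # illustrative model

def encode(examples):
    return tokenizer(examples["text"], truncation=True, padding="max_length")

dataset = load_dataset("text", data_files="test.txt", split="train")
dataset = dataset.map(encode, batched=True)
```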
I tried to train a RoBERTa from scratch using transformers, but I ran into OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a small sample file in which each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` prints a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 254 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa from scratch using transformers, but I ran into OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a small sample file in which each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` prints a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> Hey @chiyuzhang94, I was also having trouble loading a large text file (11GB).
> But I finally got it working. This is what I did after looking into the documentation.
>
> 1. split the whole dataset file into smaller files
>
> ```shell
> mkdir ./shards
> split -a 4 -l 256000 -d full_raw_corpus.txt ./shards/shard_
> ```
>
> 2. Pass paths of small data files to `load_dataset`
>
> ```python
> files = glob.glob('shards/*')
> from datasets import load_dataset
> dataset = load_dataset('text', data_files=files, split='train')
> ```
>
> (Passing the whole dataset file (11GB) directly to `load_dataset` was resulting in RAM issues)
>
> 3. Tokenization
>
> ```python
> def encode(examples):
> return tokenizer(examples['text'], truncation=True, padding='max_length')
> dataset = dataset.map(encode, batched=True)
> dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
> ```
>
> Now you can pass `dataset` to `Trainer` or `pytorch DataLoader`
>
> ```python
> dataloader = torch.utils.data.DataLoader(dataset, batch_size=4)
> next(iter(dataloader))
> ```
>
> Hope this helps
When I run `dataset = dataset.map(encode, batched=True)`, I encounter a problem like this:
> Testing the mapped function outputs
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in map
for k, dataset in self.items()
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in <dictcomp>
for k, dataset in self.items()
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1224, in map
update_data = does_function_return_dict(test_inputs, test_indices)
File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1195, in does_function_return_dict
function(*fn_args, indices, **fn_kwargs) if with_indices else function(*fn_args, **fn_kwargs)
File "<stdin>", line 3, in encode
TypeError: __init__() takes 1 positional argument but 2 were given | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > > Hey @chiyuzhang94, I was also having trouble loading a large text file (11GB).
> > But finally got it working. This is what I did after looking into the documentation.
> >
> > 1. split the whole dataset file into smaller files
> >
> > ```shell
> > mkdir ./shards
> > split -a 4 -l 256000 -d full_raw_corpus.txt ./shards/shard_
> > ```
> >
> >
> >
> > 2. Pass paths of small data files to `load_dataset`
> >
> > ```python
> > files = glob.glob('shards/*')
> > from datasets import load_dataset
> > dataset = load_dataset('text', data_files=files, split='train')
> > ```
> >
> >
> > (Passing the whole dataset file (11GB) directly to `load_dataset` was resulting in a RAM issue.)
> >
> > 3. Tokenization
> >
> > ```python
> > def encode(examples):
> > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > dataset = dataset.map(encode, batched=True)
> > dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
> > ```
> >
> >
> > Now you can pass `dataset` to `Trainer` or `pytorch DataLoader`
> > ```python
> > dataloader = torch.utils.data.DataLoader(dataset, batch_size=4)
> > next(iter(dataloader))
> > ```
> >
> >
> > Hope this helps
>
> When I ran 'dataset = dataset.map(encode, batched=True)',
> I encountered a problem like this:
>
> > Testing the mapped function outputs
> > Traceback (most recent call last):
> > File "", line 1, in
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in map
> > for k, dataset in self.items()
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in
> > for k, dataset in self.items()
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1224, in map
> > update_data = does_function_return_dict(test_inputs, test_indices)
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1195, in does_function_return_dict
> > function(*fn_args, indices, **fn_kwargs) if with_indices else function(*fn_args, **fn_kwargs)
> > File "", line 3, in encode
> > TypeError: __init__() takes 1 positional argument but 2 were given
What is your encoder function? | I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 331 | Load text file for RoBERTa pre-training.
I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> > Hey @chiyuzhang94, I was also having trouble loading a large text file (11GB).
> > But finally got it working. This is what I did after looking into the documentation.
> >
> > 1. split the whole dataset file into smaller files
> >
> > ```shell
> > mkdir ./shards
> > split -a 4 -l 256000 -d full_raw_corpus.txt ./shards/shard_
> > ```
> >
> >
> >
> > 2. Pass paths of small data files to `load_dataset`
> >
> > ```python
> > files = glob.glob('shards/*')
> > from datasets import load_dataset
> > dataset = load_dataset('text', data_files=files, split='train')
> > ```
> >
> >
> > (Passing the whole dataset file (11GB) directly to `load_dataset` was resulting in a RAM issue.)
> >
> > 3. Tokenization
> >
> > ```python
> > def encode(examples):
> > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > dataset = dataset.map(encode, batched=True)
> > dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
> > ```
> >
> >
> > Now you can pass `dataset` to `Trainer` or `pytorch DataLoader`
> > ```python
> > dataloader = torch.utils.data.DataLoader(dataset, batch_size=4)
> > next(iter(dataloader))
> > ```
> >
> >
> > Hope this helps
>
> When I ran 'dataset = dataset.map(encode, batched=True)',
> I encountered a problem like this:
>
> > Testing the mapped function outputs
> > Traceback (most recent call last):
> > File "", line 1, in
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in map
> > for k, dataset in self.items()
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/dataset_dict.py", line 300, in
> > for k, dataset in self.items()
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1224, in map
> > update_data = does_function_return_dict(test_inputs, test_indices)
> > File "/anaconda3/envs/torch-xla-1.6/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1195, in does_function_return_dict
> > function(*fn_args, indices, **fn_kwargs) if with_indices else function(*fn_args, **fn_kwargs)
> > File "", line 3, in encode
> > TypeError: __init__() takes 1 positional argument but 2 were given
What is your encoder function? | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
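For reference, the issue #610 thread quoted above (before the embedding vector) walks through a shard, load, tokenize, and DataLoader recipe. A minimal end-to-end sketch of that recipe follows; the shard directory, the `roberta-base` checkpoint, and the batch size are illustrative assumptions, not values taken from the thread.

```python
# Hedged sketch of the workflow discussed above: shard a large corpus, load the
# shards with the `text` loader, tokenize with `map`, and iterate with a DataLoader.
# Paths, shard size, and the tokenizer checkpoint are placeholders.
import glob

import torch
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # assumed checkpoint

# Shards produced beforehand, e.g.: split -a 4 -l 256000 -d full_raw_corpus.txt shards/shard_
files = sorted(glob.glob("shards/*"))
dataset = load_dataset("text", data_files=files, split="train")

def encode(examples):
    # Module-level function: `map` passes only the batch dict.
    return tokenizer(examples["text"], truncation=True, padding="max_length")

dataset = dataset.map(encode, batched=True)
dataset.set_format(type="torch", columns=["input_ids", "attention_mask"])

dataloader = torch.utils.data.DataLoader(dataset, batch_size=4)
batch = next(iter(dataloader))  # dict with "input_ids" and "attention_mask" tensors
```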
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > ```python
> def encode(examples):
> return tokenizer(examples['text'], truncation=True, padding='max_length')
> ```
It is the same as suggested:
> def encode(examples):
return tokenizer(examples['text'], truncation=True, padding='max_length') | I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 25 | Load text file for RoBERTa pre-training.
I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> ```python
> def encode(examples):
> return tokenizer(examples['text'], truncation=True, padding='max_length')
> ```
It is the same as suggested:
> def encode(examples):
return tokenizer(examples['text'], truncation=True, padding='max_length') | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > > ```python
> > def encode(examples):
> > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > ```
>
> It is the same as suggested:
>
> > def encode(examples):
> > return tokenizer(examples['text'], truncation=True, padding='max_length')
Do you use this function in a `class` object?
__init__() takes 1 positional argument but 2 were given. I guess the additional argument is self? | I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 60 | Load text file for RoBERTa pre-training.
I migrated my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> > ```python
> > def encode(examples):
> > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > ```
>
> It is the same as suggested:
>
> > def encode(examples):
> > return tokenizer(examples['text'], truncation=True, padding='max_length')
Do you use this function in a `class` object?
__init__() takes 1 positional argument but 2 were given. I guess the additional argument is self? | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > > > ```python
> > > def encode(examples):
> > > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > > ```
> >
> >
> > It is the same as suggested:
> > > def encode(examples):
> > > return tokenizer(examples['text'], truncation=True, padding='max_length')
>
> Do you use this function in a `class` object?
>
> init() takes 1 positional argument but 2 were given. I guess the additional argument is self?
Thanks for your reply.
Could you provide a simple example here?
Currently, I do not use this function in a class object.
I think you are right and I was wondering how to construct this class.
I tried to modify it based on transformers' LineByLineTextDataset. Am I correct?
> class LineByLineTextDataset(Dataset):
"""
This will be superseded by a framework-agnostic approach
soon.
"""
def __init__(self, tokenizer: PreTrainedTokenizer, file_path: str, block_size: int):
assert os.path.isfile(file_path), f"Input file path {file_path} not found"
# Here, we do not cache the features, operating under the assumption
# that we will soon use fast multithreaded tokenizers from the
# `tokenizers` repo everywhere =)
#logger.info("Creating features from dataset file at %s", file_path)
#with open(file_path, encoding="utf-8") as f:
# lines = [line for line in f.read().splitlines() if (len(line) > 0 and not line.isspace())]
#batch_encoding = tokenizer(lines, add_special_tokens=True, truncation=True, max_length=block_size)
import glob
files = glob.glob('/home/mtzhang111/fairseq/cs_doc/shards/shard_003*')
from datasets import load_dataset
dataset = load_dataset('text', data_files=files)
batch_encoding= dataset.map(encode, batched=True)
self.examples = batch_encoding["input_ids"]
def encode(examples):
return tokenizer(examples['text'], truncation=True, padding='max_length')
def __len__(self):
return len(self.examples)
def __getitem__(self, i) -> torch.Tensor:
return torch.tensor(self.examples[i], dtype=torch.long)
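A minimal sketch of the same idea without the custom class, for reference; the `roberta-base` checkpoint and the `test.txt` filename below are placeholders rather than details from this thread. Since `load_dataset('text', ...)` returns a `DatasetDict`, the `train` split is selected before mapping:
```python
# A hedged sketch (not the poster's code): tokenize a text file with
# datasets.map instead of a hand-written Dataset subclass. The checkpoint
# name and file name are placeholders.
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # assumed checkpoint

def encode(examples):
    # examples["text"] is a list of lines when batched=True
    return tokenizer(examples["text"], truncation=True, padding="max_length")

dataset = load_dataset("text", data_files="test.txt")["train"]  # placeholder file
dataset = dataset.map(encode, batched=True)
input_ids = dataset["input_ids"]  # list of token-id lists, one per line
```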
| I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 250 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> > > ```python
> > > def encode(examples):
> > > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > > ```
> >
> >
> > It is the same as suggested:
> > > def encode(examples):
> > > return tokenizer(examples['text'], truncation=True, padding='max_length')
>
> Do you use this function in a `class` object?
>
> init() takes 1 positional argument but 2 were given. I guess the additional argument is self?
Thanks for your reply.
Could you provide a simple example here?
Currently, I do not use this function in a class object.
I think you are right and I was wondering how to construct this class.
I tried to modify it based on transformers' LineByLineTextDataset. Am I correct?
> class LineByLineTextDataset(Dataset):
"""
This will be superseded by a framework-agnostic approach
soon.
"""
def __init__(self, tokenizer: PreTrainedTokenizer, file_path: str, block_size: int):
assert os.path.isfile(file_path), f"Input file path {file_path} not found"
# Here, we do not cache the features, operating under the assumption
# that we will soon use fast multithreaded tokenizers from the
# `tokenizers` repo everywhere =)
#logger.info("Creating features from dataset file at %s", file_path)
#with open(file_path, encoding="utf-8") as f:
# lines = [line for line in f.read().splitlines() if (len(line) > 0 and not line.isspace())]
#batch_encoding = tokenizer(lines, add_special_tokens=True, truncation=True, max_length=block_size)
import glob
files = glob.glob('/home/mtzhang111/fairseq/cs_doc/shards/shard_003*')
from datasets import load_dataset
dataset = load_dataset('text', data_files=files)
batch_encoding= dataset.map(encode, batched=True)
self.examples = batch_encoding["input_ids"]
def encode(examples):
return tokenizer(examples['text'], truncation=True, padding='max_length')
def __len__(self):
return len(self.examples)
def __getitem__(self, i) -> torch.Tensor:
return torch.tensor(self.examples[i], dtype=torch.long)
| [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > > > > ```python
> > > > def encode(examples):
> > > > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > > > ```
> > >
> > >
> > > It is the same as suggested:
> > > > def encode(examples):
> > > > return tokenizer(examples['text'], truncation=True, padding='max_length')
> >
> >
> > Do you use this function in a `class` object?
> > init() takes 1 positional argument but 2 were given. I guess the additional argument is self?
>
> Thanks for your reply.
> Could you provide a simple example here?
> Currently, I do not use this function in a class object.
> I think you are right and I was wondering how to construct this class.
> I tried to modify it based on transformers' LineByLineTextDataset. Am I correct?
>
> > class LineByLineTextDataset(Dataset):
> > """
> > This will be superseded by a framework-agnostic approach
> > soon.
> > """
>
> ```
> def __init__(self, tokenizer: PreTrainedTokenizer, file_path: str, block_size: int):
> assert os.path.isfile(file_path), f"Input file path {file_path} not found"
> # Here, we do not cache the features, operating under the assumption
> # that we will soon use fast multithreaded tokenizers from the
> # `tokenizers` repo everywhere =)
> #logger.info("Creating features from dataset file at %s", file_path)
> #with open(file_path, encoding="utf-8") as f:
> # lines = [line for line in f.read().splitlines() if (len(line) > 0 and not line.isspace())]
> #batch_encoding = tokenizer(lines, add_special_tokens=True, truncation=True, max_length=block_size)
>
> import glob
> files = glob.glob('/home/mtzhang111/fairseq/cs_doc/shards/shard_003*')
> from datasets import load_dataset
> dataset = load_dataset('text', data_files=files)
> batch_encoding= dataset.map(encode, batched=True)
> self.examples = batch_encoding["input_ids"]
>
>
> def encode(examples):
> return tokenizer(examples['text'], truncation=True, padding='max_length')
>
> def __len__(self):
> return len(self.examples)
>
> def __getitem__(self, i) -> torch.Tensor:
> return torch.tensor(self.examples[i], dtype=torch.long)
> ```
I am also struggling with this adaptation.
I am not sure whether I am right.
I think you don't need to construct `class LazyLineByLineTextDataset(Dataset)` at all.
torch.utils.data.Dataset is a generator.
Now, we use `dataset = dataset.map(encode, batched=True)` as a generator. So we just pass dataset to torch.utils.data.DataLoader. | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
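A possible sketch of the approach described in the comment above, for reference; the tokenizer checkpoint and file names are placeholders, not details from this thread. Once a torch format is set, the mapped dataset behaves like a map-style `torch.utils.data.Dataset` and can be passed to `DataLoader` directly:
```python
# Sketch only: map, set a torch format, then pass the dataset to DataLoader.
# Checkpoint and file names are placeholders.
import torch
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # assumed checkpoint

def encode(examples):
    return tokenizer(examples["text"], truncation=True, padding="max_length")

dataset = load_dataset("text", data_files=["shard_000.txt"])["train"]  # placeholder shards
dataset = dataset.map(encode, batched=True)
dataset.set_format(type="torch", columns=["input_ids", "attention_mask"])

dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
batch = next(iter(dataloader))  # dict of tensors: input_ids, attention_mask
```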
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 357 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> > > > ```python
> > > > def encode(examples):
> > > > return tokenizer(examples['text'], truncation=True, padding='max_length')
> > > > ```
> > >
> > >
> > > It is the same as suggested:
> > > > def encode(examples):
> > > > return tokenizer(examples['text'], truncation=True, padding='max_length')
> >
> >
> > Do you use this function in a `class` object?
> > init() takes 1 positional argument but 2 were given. I guess the additional argument is self?
>
> Thanks for your reply.
> Could you provide a simple example here?
> Currently, I do not use this function in a class object.
> I think you are right and I was wondering how to construct this class.
> I tried to modify it based on transformers' LineByLineTextDataset. Am I correct?
>
> > class LineByLineTextDataset(Dataset):
> > """
> > This will be superseded by a framework-agnostic approach
> > soon.
> > """
>
> ```
> def __init__(self, tokenizer: PreTrainedTokenizer, file_path: str, block_size: int):
> assert os.path.isfile(file_path), f"Input file path {file_path} not found"
> # Here, we do not cache the features, operating under the assumption
> # that we will soon use fast multithreaded tokenizers from the
> # `tokenizers` repo everywhere =)
> #logger.info("Creating features from dataset file at %s", file_path)
> #with open(file_path, encoding="utf-8") as f:
> # lines = [line for line in f.read().splitlines() if (len(line) > 0 and not line.isspace())]
> #batch_encoding = tokenizer(lines, add_special_tokens=True, truncation=True, max_length=block_size)
>
> import glob
> files = glob.glob('/home/mtzhang111/fairseq/cs_doc/shards/shard_003*')
> from datasets import load_dataset
> dataset = load_dataset('text', data_files=files)
> batch_encoding= dataset.map(encode, batched=True)
> self.examples = batch_encoding["input_ids"]
>
>
> def encode(examples):
> return tokenizer(examples['text'], truncation=True, padding='max_length')
>
> def __len__(self):
> return len(self.examples)
>
> def __getitem__(self, i) -> torch.Tensor:
> return torch.tensor(self.examples[i], dtype=torch.long)
> ```
I am also struggling with this adaptation.
I am not sure whether I am right.
I think you don't need to construct `class LazyLineByLineTextDataset(Dataset)` at all.
torch.utils.data.Dataset is a generator.
Now, we use `dataset = dataset.map(encode, batched=True)` as a generator. So we just pass dataset to torch.utils.data.DataLoader. | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | @chiyuzhang94 Thanks for your reply. After some changes, I have now managed to get the data loading process running.
I published it in case you might want to take a look. Thanks for your help!
https://github.com/shizhediao/Transformers_TPU | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 35 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
@chiyuzhang94 Thanks for your reply. After some changes, I have now managed to get the data loading process running.
I published it in case you might want to take a look. Thanks for your help!
https://github.com/shizhediao/Transformers_TPU | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hi @shizhediao ,
Thanks! It looks great!
But my problem remains: the cache directory is on a read-only file system.
[As I mentioned](https://github.com/huggingface/datasets/issues/610#issuecomment-693912285), I tried to change the cache directory but it didn't work.
Do you have any suggestions?
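One thing worth trying (a sketch with placeholder paths, using environment variables the library is believed to read, not something confirmed in this thread) is to point every cache location at a writable directory before importing the library:
```python
# Sketch only: "/scratch/username/hf_cache" is a placeholder for any directory
# that is actually writable on your machine or cluster.
import os

os.environ["HF_HOME"] = "/scratch/username/hf_cache"                      # general Hugging Face cache root
os.environ["HF_DATASETS_CACHE"] = "/scratch/username/hf_cache/datasets"   # processed Arrow files
os.environ["HF_MODULES_CACHE"] = "/scratch/username/hf_cache/modules"     # downloaded dataset scripts

from datasets import load_dataset  # import after the variables are set

# cache_dir controls where this particular dataset's Arrow files go;
# the env vars above cover the other locations the library writes to.
dataset = load_dataset("text", data_files="test.txt",
                       cache_dir="/scratch/username/hf_cache/datasets")
```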
| I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
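For reference, here is a minimal sketch of the working pattern implied by that last observation (assuming PyTorch's default collation of string fields is acceptable): index the `train` split before building the DataLoader.
```python
import torch
from datasets import load_dataset

dataset = load_dataset("text", data_files="test.txt", cache_dir="./")
train_ds = dataset["train"]          # a Dataset, so train_ds[0] returns {'text': '...'}

# No set_format here: raw strings cannot become torch tensors, so the torch
# format is only useful once tokenization has produced integer columns.
dataloader = torch.utils.data.DataLoader(train_ds, batch_size=8)
batch = next(iter(dataloader))       # {'text': [8 raw lines]} instead of KeyError: 0
```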
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 39 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hi @shizhediao ,
Thanks! It looks great!
But my problem remains: the cache directory is on a read-only file system.
[As I mentioned](https://github.com/huggingface/datasets/issues/610#issuecomment-693912285), I tried to change the cache directory but it didn't work.
Do you have any suggestions?
| [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > I installed datasets at /project/chiyuzh/evn_py36/datasets/src, which is a writable directory.
> I also tried changing the environment variables to point to the writable directory:
> `export HF_MODULES_PATH=/project/chiyuzh/evn_py36/datasets/cache_dir/`
I think it is `HF_MODULES_CACHE` and not `HF_MODULES_PATH`, @chiyuzhang94.
Could you try again and let me know if it fixes your issue?
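Spelled out as code, this is a sketch that only restates the suggestion above, reusing the writable path from the quoted comment as an example value:
```python
import os

# HF_MODULES_CACHE (not HF_MODULES_PATH) is the variable suggested above.
# Set it before importing `datasets` so the writable path is picked up.
os.environ["HF_MODULES_CACHE"] = "/project/chiyuzh/evn_py36/datasets/cache_dir/"

import datasets  # imported after the variable so dataset scripts land in the writable path
```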
| I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 50 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> I installed datasets at /project/chiyuzh/evn_py36/datasets/src, which is a writable directory.
> I also tried changing the environment variables to point to the writable directory:
> `export HF_MODULES_PATH=/project/chiyuzh/evn_py36/datasets/cache_dir/`
I think it is `HF_MODULES_CACHE` and not `HF_MODULES_PATH`, @chiyuzhang94.
Could you try again and let me know if it fixes your issue?
| [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | We should probably add a section in the doc on the caching system with the env variables in particular. | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 19 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
We should probably add a section in the doc on the caching system with the env variables in particular. | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hi @thomwolf , @lhoestq ,
Thanks for your suggestions. With the latest version of this package, I can load text data without an Internet connection.
But I found that dataset loading is very slow.
My script looks like this:
```
def token_encode(examples):
tokenizer_out = tokenizer(examples['text'], truncation=True, padding="max_length", add_special_tokens=True, max_length=args.block_size)
return tokenizer_out
path = Path(file_path)
files = sorted(path.glob('*'))
dataset = load_dataset('./text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")
dataset = dataset.map(token_encode, batched=True)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
```
I have 1,123,870,657 lines in my input directory.
I can see the processing speed below, and it is very slow.
```
0%| | 13/1123871 [00:02<62:37:39, 4.98ba/s]
0%| | 14/1123871 [00:03<61:27:31, 5.08ba/s]
0%| | 15/1123871 [00:03<66:34:19, 4.69ba/s]
0%| | 16/1123871 [00:03<68:25:01, 4.56ba/s]
0%| | 17/1123871 [00:03<72:00:03, 4.34ba/s]
```
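One possible way to speed this up is to let `map` shard the work across processes and use larger batches. This is only a sketch: it assumes several CPU cores and a fast (Rust-backed) tokenizer, neither of which is stated above, and it reuses `token_encode`, `files` and `args` from the script:
```python
dataset = load_dataset('./text.py', data_files=files,
                       cache_dir=args.data_cache_dir, split="train")

dataset = dataset.map(
    token_encode,
    batched=True,
    batch_size=10_000,        # default is 1000; bigger batches mean fewer Python round-trips
    num_proc=8,               # hypothetical core count, tune to the machine actually used
    remove_columns=["text"],  # drop the raw strings once input_ids/attention_mask exist
)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
```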
Do you have any suggestions to accelerate this loading process? | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
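For reference, a minimal sketch of one way to avoid the `KeyError: 0` above, following the observation in the previous paragraph: load (or select) the `train` split so integer indexing works, and tokenize before handing the dataset to a `DataLoader`. The tokenizer choice and `max_length` are placeholders:
```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # placeholder model

# split="train" returns a Dataset rather than a DatasetDict, so dataset[0] works.
dataset = load_dataset("text", data_files="test.txt", split="train", cache_dir="./")

# Tokenize first, then expose only tensor columns to PyTorch.
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)
dataset.set_format(type="torch", columns=["input_ids", "attention_mask"])

dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
batch = next(iter(dataloader))  # no KeyError: 0 here
```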
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 129 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hi @thomwolf , @lhoestq ,
Thanks for your suggestions. With the latest version of this package, I can load text data without Internet.
But I found the speed of dataset loading is very slow.
My script is like this:
```
def token_encode(examples):
tokenizer_out = tokenizer(examples['text'], truncation=True, padding="max_length", add_special_tokens=True, max_length=args.block_size)
return tokenizer_out
path = Path(file_path)
files = sorted(path.glob('*'))
dataset = load_dataset('./text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")
dataset = dataset.map(token_encode, batched=True)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
```
I have 1,123,870,657 lines in my input directory.
I can see the processing speed as follows. It is very slow.
```
| 13/1123871 [00:02<62:37:39, 4.98ba/s]^M 0%|
| 14/1123871 [00:03<61:27:31, 5.08ba/s]^M 0%|
| 15/1123871 [00:03<66:34:19, 4.69ba/s]^M 0%|
| 16/1123871 [00:03<68:25:01, 4.56ba/s]^M 0%|
| 17/1123871 [00:03<72:00:03, 4.34ba/s]^M 0%|
```
Do you have any suggestions to accelerate this loading process? | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | You can use multiprocessing by specifying `num_proc=` in `.map()`
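For instance — a sketch only, where `token_encode` is the function from the script quoted earlier in this thread and the worker count is arbitrary:
```python
# Shards the dataset and runs the tokenization in 8 worker processes.
# Note: the function passed to map must be picklable (i.e. defined at module level).
dataset = dataset.map(token_encode, batched=True, batch_size=1000, num_proc=8)
```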
Also it looks like you have `1123871` batches of 1000 elements (default batch size), i.e. 1,123,871,000 lines in total.
Am I right ? | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 32 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
You can use multiprocessing by specifying `num_proc=` in `.map()`
Also it looks like you have `1123871` batches of 1000 elements (default batch size), i.e. 1,123,871,000 lines in total.
Am I right ? | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > You can use multiprocessing by specifying `num_proc=` in `.map()`
>
> Also it looks like you have `1123871` batches of 1000 elements (default batch size), i.e. 1,123,871,000 lines in total.
> Am I right ?
Hi @lhoestq ,
Thanks. I will try it.
You are right. I have 1,123,870,657 lines in total in the path. I split the large file into 440 small files; each file has 2,560,000 lines.
I have another question. Because I am using a cloud server that only allows a job to run for up to 7 days, I need to resume my model every week. If the script has to reload and reprocess the dataset every time, that is very inefficient at the current processing speed. Is it possible to process the dataset once and reuse the processed cache in future runs?
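For what it's worth, `map` already writes its result to Arrow cache files under `cache_dir`, and by default an identical call in a later job reloads that cache instead of re-tokenizing (`load_from_cache_file=True`). A sketch of pinning the cache file explicitly so a follow-up job can pick it up — the file name is a placeholder, and `dataset`/`token_encode`/`args` are as in the script quoted earlier:
```python
import os

cache_path = os.path.join(args.data_cache_dir, "tokenized.arrow")  # placeholder name

# First job: tokenize once; the result is persisted on disk as Arrow file(s).
# Later jobs: the same call with the same arguments reloads the cache instead of recomputing.
dataset = dataset.map(
    token_encode,
    batched=True,
    batch_size=10000,
    num_proc=16,
    cache_file_name=cache_path,
)
```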
| I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 141 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> You can use multiprocessing by specifying `num_proc=` in `.map()`
>
> Also it looks like you have `1123871` batches of 1000 elements (default batch size), i.e. 1,123,871,000 lines in total.
> Am I right ?
Hi @lhoestq ,
Thanks. I will try it.
You are right. I have 1,123,870,657 lines in total in the path. I split the large file into 440 small files; each file has 2,560,000 lines.
I have another question. Because I am using a cloud server that only allows a job to run for up to 7 days, I need to resume my model every week. If the script has to reload and reprocess the dataset every time, that is very inefficient at the current processing speed. Is it possible to process the dataset once and reuse the processed cache in future runs?
| [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hi @lhoestq ,
I tried to use multiprocessing, but I got the following errors:
Because I am using Python distributed training, it seems to conflict with the distributed job.
Do you have any suggestions?
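The traceback below ends in `AttributeError: Can't pickle local object 'HG_Datasets.<locals>.token_encode'`: with `num_proc`, the mapped function is sent to worker processes with pickle, so it cannot be a function defined inside `HG_Datasets`. A minimal sketch of the usual workaround (assuming the tokenizer itself pickles cleanly; names follow the script quoted earlier):
```python
from functools import partial

# Module-level function: picklable, unlike a function nested inside HG_Datasets.
def token_encode(examples, tokenizer, block_size):
    return tokenizer(examples["text"], truncation=True, padding="max_length",
                     add_special_tokens=True, max_length=block_size)

# Inside HG_Datasets(tokenizer, file_path, args):
encode_fn = partial(token_encode, tokenizer=tokenizer, block_size=args.block_size)
dataset = dataset.map(encode_fn, batched=True, batch_size=10000, num_proc=16)
```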
```
I0925 10:19:35.603023 140737353971520 filelock.py:318] Lock 140737229443368 released on /tmp/pbs.1120510.pbsha.ib.sockeye/cache/_tmp_pbs.1120510.pbsha.ib.sockeye_cache_text_default-7fb934ed6fac5d01_0.0.0_512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7
fcc649178b014.lock
Traceback (most recent call last):
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 1024, in <module>
main()
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 967, in main
train_dataset = load_and_cache_examples(args, tokenizer, evaluate=False)
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 180, in load_and_cache_examples
return HG_Datasets(tokenizer, file_path, args)
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 119, in HG_Datasets
dataset = dataset.map(token_encode, batched=True, batch_size = 10000, num_proc = 16)
File "/project/chiyuzh/evn_py36/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1287, in map
transformed_shards = [r.get() for r in results]
File "/project/chiyuzh/evn_py36/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1287, in <listcomp>
transformed_shards = [r.get() for r in results]
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/pool.py", line 644, in get
raise self._value
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/pool.py", line 424, in _handle_tasks
put(task)
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/connection.py", line 206, in send
self._send_bytes(_ForkingPickler.dumps(obj))
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/reduction.py", line 51, in dumps
cls(buf, protocol).dump(obj)
AttributeError: Can't pickle local object 'HG_Datasets.<locals>.token_encode'
``` | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 157 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample, and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hi @lhoestq ,
I tried to use multiprocessing, but I got the following errors:
Because I am using Python distributed training, it seems to conflict with the distributed job.
Do you have any suggestions?
```
I0925 10:19:35.603023 140737353971520 filelock.py:318] Lock 140737229443368 released on /tmp/pbs.1120510.pbsha.ib.sockeye/cache/_tmp_pbs.1120510.pbsha.ib.sockeye_cache_text_default-7fb934ed6fac5d01_0.0.0_512f465342e4f4cd07a8791428a629c043bb89d55ad7817cbf7fcc649178b014.lock
Traceback (most recent call last):
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 1024, in <module>
main()
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 967, in main
train_dataset = load_and_cache_examples(args, tokenizer, evaluate=False)
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 180, in load_and_cache_examples
return HG_Datasets(tokenizer, file_path, args)
File "/scratch/chiyuzh/roberta/run_language_modeling.py", line 119, in HG_Datasets
dataset = dataset.map(token_encode, batched=True, batch_size = 10000, num_proc = 16)
File "/project/chiyuzh/evn_py36/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1287, in map
transformed_shards = [r.get() for r in results]
File "/project/chiyuzh/evn_py36/lib/python3.6/site-packages/datasets/arrow_dataset.py", line 1287, in <listcomp>
transformed_shards = [r.get() for r in results]
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/pool.py", line 644, in get
raise self._value
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/pool.py", line 424, in _handle_tasks
put(task)
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/connection.py", line 206, in send
self._send_bytes(_ForkingPickler.dumps(obj))
File "/project/chiyuzh/evn_py36/lib/python3.6/multiprocessing/reduction.py", line 51, in dumps
cls(buf, protocol).dump(obj)
AttributeError: Can't pickle local object 'HG_Datasets.<locals>.token_encode'
``` | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | For multiprocessing, the function given to `map` must be picklable.
Maybe you could try to define `token_encode` outside `HG_Datasets`?
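For instance, here is a minimal sketch with `token_encode` defined at module level so it can be pickled and sent to the worker processes (the tokenizer name, data file and `max_length` below are placeholder assumptions, not the values from your script):
```python
from datasets import load_dataset
from transformers import RobertaTokenizerFast

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")  # placeholder tokenizer

def token_encode(examples):
    # defined at module level -> picklable, so it works with num_proc > 1
    return tokenizer(
        examples["text"],
        truncation=True,
        padding="max_length",
        add_special_tokens=True,
        max_length=512,  # placeholder for args.block_size
    )

dataset = load_dataset("text", data_files="test.txt", split="train")
dataset = dataset.map(token_encode, batched=True, batch_size=10000, num_proc=16)
```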
Also maybe #656 could make functions defined locally picklable for multiprocessing, once it's merged. | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 34 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
For multiprocessing, the function given to `map` must be picklable.
Maybe you could try to define `token_encode` outside `HG_Datasets`?
Also maybe #656 could make functions defined locally picklable for multiprocessing, once it's merged. | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > I have another question. Because I am using a cloud server that only allows running a job for up to 7 days, I need to resume my model every week. If the script needs to load and process the dataset every time, it is very inefficient at the current processing speed. Is it possible to process the dataset once and use the processed cache in future work?
Feel free to save your processed dataset using `dataset.save_to_disk("path/to/save/directory")`.
Then you'll be able to reload it again using
```python
from datasets import load_from_disk
dataset = load_from_disk("path/to/save/directory")
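# note (sketch): the reloaded dataset already contains the columns produced by the
# earlier `map` call, so a resumed job can skip the expensive tokenization step
# and go straight to training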
``` | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 100 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> I have another question. Because I am using a cloud server that only allows running a job for up to 7 days, I need to resume my model every week. If the script needs to load and process the dataset every time, it is very inefficient at the current processing speed. Is it possible to process the dataset once and use the processed cache in future work?
Feel free to save your processed dataset using `dataset.save_to_disk("path/to/save/directory")`.
Then you'll be able to reload it again using
```python
from datasets import load_from_disk
dataset = load_from_disk("path/to/save/directory")
``` | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hi @lhoestq ,
Thanks for your suggestion.
I tried to process the dataset and save it to disk.
I have 1.12B samples in the raw dataset and I used 16 processes.
I ran this processing job for 7 days, but it didn't finish, and I don't know why the processing is so slow.
The log shows that some processes (\#12, \#14, \#15) are very slow. Each process runs at a different speed, and these slow processes look like a bottleneck.
Could you please give me any suggestions to improve the processing speed?
Thanks.
Chiyu
Here is my code:
```
def token_encode(examples):
tokenizer_out = tokenizer(examples['text'], truncation=True, padding="max_length", add_special_tokens=True, max_length=args.block_size)
return tokenizer_out
path = Path(file_path)
files = sorted(path.glob('*'))
dataset = load_dataset('./text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")
dataset = dataset.map(token_encode, batched=True, batch_size = 16384, num_proc = 16)
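# note: with num_proc=16 the 1.12B examples are split into 16 shards of ~70M examples,
# which is why each worker in the log below shows a total of 4288 batches (4288 x 16384 ≈ 70M)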
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
dataset.save_to_disk(output_dir)
```
Here is the log.
```
^M#6: 1%|▏ | 59/4288 [55:10<66:11:58, 56.35s/ba]
^M#1: 8%|▊ | 356/4288 [55:39<10:40:02, 9.77s/ba]
^M#2: 5%|▍ | 210/4288 [55:33<17:47:19, 15.70s/ba]
^M#0: 19%|█▉ | 836/4288 [55:53<4:08:56, 4.33s/ba]
^M#0: 20%|█▉ | 837/4288 [55:57<4:01:52, 4.21s/ba]
^M#1: 8%|▊ | 357/4288 [55:48<10:38:09, 9.74s/ba]
^M#0: 20%|█▉ | 838/4288 [56:01<4:02:56, 4.23s/ba]
^M#3: 4%|▎ | 155/4288 [55:43<24:41:20, 21.51s/ba]
^M#0: 20%|█▉ | 839/4288 [56:05<4:04:48, 4.26s/ba]
^M#12: 1%| | 29/4288 [54:50<133:20:53, 112.72s/ba]
^M#2: 5%|▍ | 211/4288 [55:48<17:40:33, 15.61s/ba]
^M#14: 0%| | 2/4288 [04:24<157:17:50, 132.12s/ba]
^M#15: 0%| | 1/4288 [02:24<172:11:37, 144.60s/ba]
``` | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 219 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa model from scratch using transformers, but I got OOM issues when loading a large text file.
According to the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hi @lhoestq ,
Thanks for your suggestion.
I tried to process the dataset and save it to disk.
I have 1.12B samples in the raw dataset and I used 16 processes.
I ran this processing job for 7 days, but it didn't finish, and I don't know why the processing is so slow.
The log shows that some processes (\#12, \#14, \#15) are very slow. Each process runs at a different speed, and these slow processes look like a bottleneck.
Could you please give me any suggestions to improve the processing speed?
Thanks.
Chiyu
Here is my code:
```
def token_encode(examples):
tokenizer_out = tokenizer(examples['text'], truncation=True, padding="max_length", add_special_tokens=True, max_length=args.block_size)
return tokenizer_out
path = Path(file_path)
files = sorted(path.glob('*'))
dataset = load_dataset('./text.py', data_files=files, cache_dir = args.data_cache_dir, split="train")
dataset = dataset.map(token_encode, batched=True, batch_size = 16384, num_proc = 16)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
dataset.save_to_disk(output_dir)
```
Here is the log.
```
#6: 1%|▏ | 59/4288 [55:10<66:11:58, 56.35s/ba]
#1: 8%|▊ | 356/4288 [55:39<10:40:02, 9.77s/ba]
#2: 5%|▍ | 210/4288 [55:33<17:47:19, 15.70s/ba]
#0: 19%|█▉ | 836/4288 [55:53<4:08:56, 4.33s/ba]
#0: 20%|█▉ | 837/4288 [55:57<4:01:52, 4.21s/ba]
#1: 8%|▊ | 357/4288 [55:48<10:38:09, 9.74s/ba]
#0: 20%|█▉ | 838/4288 [56:01<4:02:56, 4.23s/ba]
#3: 4%|▎ | 155/4288 [55:43<24:41:20, 21.51s/ba]
#0: 20%|█▉ | 839/4288 [56:05<4:04:48, 4.26s/ba]
#12: 1%| | 29/4288 [54:50<133:20:53, 112.72s/ba]
#2: 5%|▍ | 211/4288 [55:48<17:40:33, 15.61s/ba]
#14: 0%| | 2/4288 [04:24<157:17:50, 132.12s/ba]
#15: 0%| | 1/4288 [02:24<172:11:37, 144.60s/ba]
``` | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hi !
As far as I can tell, there could be several reasons for your processes to have different speeds:
- some parts of your dataset have short passages while some have longer passages that take more time to be processed
- OR there are other processes running that prevent some of them from running at full speed
- OR the value of `num_proc` is higher than the number of actual processes that you can run in parallel at full speed.
So I'd suggest you check that you have nothing else running in parallel to your processing job, and also maybe take a look at the slow parts of the dataset.
When doing multiprocessing, the dataset is sharded into `num_proc` contiguous parts that are processed individually in each process. If you want to take a look at the dataset processed in the 12th shard out of 16, for example, you can do:
```python
my_shard = dataset.shard(num_shards=16, index=12, contiguous=True)
```
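For example, a rough way to narrow this down (only a sketch; it reuses the `dataset` and the batched `token_encode` function from your snippet) is to time a small slice of a fast shard against one of the slow ones:
```python
import time

# Time `token_encode` on 1,000 examples from a shard that was fast (#0)
# and from shards that were slow (#12, #14) in your logs. This is a sketch,
# not a benchmark: it just gives an order of magnitude per shard.
for index in (0, 12, 14):
    shard = dataset.shard(num_shards=16, index=index, contiguous=True)
    sample = shard.select(range(1000))
    start = time.time()
    sample.map(token_encode, batched=True, batch_size=1000, load_from_cache_file=False)
    print(f"shard {index}: {time.time() - start:.1f}s for 1,000 examples")
```
If the slow shards are still much slower here, the corresponding parts of your files probably just contain longer passages that take more time to tokenize.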
Hope this helps, let me know if you find what is causing this slow down. | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield samples and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 174 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield samples and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hi !
As far as I can tell, there could be several reasons for your processes to have different speeds:
- some parts of your dataset have short passages while some have longer passages that take more time to be processed
- OR there are other processes running that prevent some of them from running at full speed
- OR the value of `num_proc` is higher than the number of actual processes that you can run in parallel at full speed.
So I'd suggest you check that you have nothing else running in parallel to your processing job, and also maybe take a look at the slow parts of the dataset.
When doing multiprocessing, the dataset is sharded into `num_proc` contiguous parts that are processed individually in each process. If you want to take a look at the dataset processed in the 12th shard out of 16, for example, you can do:
```python
my_shard = dataset.shard(num_shards=16, index=12, contiguous=True)
```
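For example, a rough way to narrow this down (only a sketch; it reuses the `dataset` and the batched `token_encode` function from your snippet) is to time a small slice of a fast shard against one of the slow ones:
```python
import time

# Time `token_encode` on 1,000 examples from a shard that was fast (#0)
# and from shards that were slow (#12, #14) in your logs. This is a sketch,
# not a benchmark: it just gives an order of magnitude per shard.
for index in (0, 12, 14):
    shard = dataset.shard(num_shards=16, index=index, contiguous=True)
    sample = shard.select(range(1000))
    start = time.time()
    sample.map(token_encode, batched=True, batch_size=1000, load_from_cache_file=False)
    print(f"shard {index}: {time.time() - start:.1f}s for 1,000 examples")
```
If the slow shards are still much slower here, the corresponding parts of your files probably just contain longer passages that take more time to tokenize.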
Hope this helps, let me know if you find what is causing this slow down. | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > Do you use a fast or a slow tokenizer from the `transformers` library @chiyuzhang94?
Hi @thomwolf ,
I use this:
```
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(args.model_name_or_path, cache_dir=args.cache_dir)
```
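For reference, a fast (Rust-backed) tokenizer can usually be requested explicitly. The snippet below is only a sketch and assumes the checkpoint behind `args.model_name_or_path` actually provides a fast tokenizer:
```python
from transformers import AutoTokenizer

# use_fast=True asks for the Rust-backed implementation when one is available
# for this checkpoint (assumption: this checkpoint ships one).
tokenizer = AutoTokenizer.from_pretrained(
    args.model_name_or_path, cache_dir=args.cache_dir, use_fast=True
)
print(tokenizer.is_fast)  # True means the fast tokenizer was actually loaded
```
Batched encoding with a fast tokenizer is typically much faster than with the pure-Python one, which is why the fast/slow distinction matters for the tokenization speed in `map`.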
I guess this is a slow one, let me explore the fast tokenizer. | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield samples and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 41 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield samples and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> Do you use a fast or a slow tokenizer from the `transformers` library @chiyuzhang94?
Hi @thomwolf ,
I use this:
```
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(args.model_name_or_path, cache_dir=args.cache_dir)
```
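For reference, a fast (Rust-backed) tokenizer can usually be requested explicitly. The snippet below is only a sketch and assumes the checkpoint behind `args.model_name_or_path` actually provides a fast tokenizer:
```python
from transformers import AutoTokenizer

# use_fast=True asks for the Rust-backed implementation when one is available
# for this checkpoint (assumption: this checkpoint ships one).
tokenizer = AutoTokenizer.from_pretrained(
    args.model_name_or_path, cache_dir=args.cache_dir, use_fast=True
)
print(tokenizer.is_fast)  # True means the fast tokenizer was actually loaded
```
Batched encoding with a fast tokenizer is typically much faster than with the pure-Python one, which is why the fast/slow distinction matters for the tokenization speed in `map`.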
I guess this is a slow one, let me explore the fast tokenizer. | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | > Hi !
>
> As far as I can tell, there could be several reasons for your processes to have different speeds:
>
> * some parts of your dataset have short passages while some have longer passages that take more time to be processed
> * OR there are other processes running that prevent some of them from running at full speed
> * OR the value of `num_proc` is higher than the number of actual processes that you can run in parallel at full speed.
>
> So I'd suggest you check that you have nothing else running in parallel to your processing job, and also maybe take a look at the slow parts of the dataset.
> When doing multiprocessing, the dataset is sharded into `num_proc` contiguous parts that are processed individually in each process. If you want to take a look at the dataset processed in the 12th shard out of 16, for example, you can do:
>
> ```python
> my_shard = dataset.shard(num_shards=16, index=12, contiguous=True)
> ```
>
> Hope this helps, let me know if you find what is causing this slow down.
Hi @lhoestq ,
Thanks for your suggestions.
I don't think my problem is due to any one of these reasons.
1. I have 1,123,870,657 lines in total in the path. I split the large file into 440 small files, each with 2,560,000 lines (the last file is a little smaller, but they are all similar). I randomly shuffled all 1,123,870,657 lines, so the sequences should also be similar across all the files.
2. I run this script on the entire node. I requested all the resources on the node (40 CPUs, 384GB memory), so there were no other processes running.
3. As I said, the node has 40 CPUs, but I set num_proc = 16. This should not be a problem.
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield samples and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 312 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a Roberta from scratch using transformers. But I got OOM issues with loading a large text file.
According to the suggestion from @thomwolf , I tried to implement `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield samples and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
> Hi !
>
> As far as I can tell, there could be several reasons for your processes to have different speeds:
>
> * some parts of your dataset have short passages while some have longer passages that take more time to be processed
> * OR there are other processes running that prevent some of them from running at full speed
> * OR the value of `num_proc` is higher than the number of actual processes that you can run in parallel at full speed.
>
> So I'd suggest you check that you have nothing else running in parallel to your processing job, and also maybe take a look at the slow parts of the dataset.
> When doing multiprocessing, the dataset is sharded into `num_proc` contiguous parts that are processed individually in each process. If you want to take a look at the dataset processed in the 12th shard out of 16, for example, you can do:
>
> ```python
> my_shard = dataset.shard(num_shards=16, index=12, contiguous=True)
> ```
>
> Hope this helps, let me know if you find what is causing this slow down.
Hi @lhoestq ,
Thanks for your suggestions.
I don't think my problem is due to any one of these reasons.
1. I have 1,123,870,657 lines in total in the path. I split the large file into 440 small files, each with 2,560,000 lines (the last file is a little smaller, but they are all similar). I randomly shuffled all 1,123,870,657 lines, so the sequences should also be similar across all the files.
2. I run this script on the entire node. I requested all the resources on the node (40 CPUs, 384GB memory), so there were no other processes running.
3. As I said, the node has 40 CPUs, but I set num_proc = 16. This should not be a problem.
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/610 | Load text file for RoBERTa pre-training. | Hi @thomwolf
I am using `RobertaTokenizerFast` now.
But the speed is still imbalanced: some processes are still slow.
Here is part of the log. #0 is always much faster than the lower-rank processes.
```
#15: 3%|▎ | 115/3513 [3:18:36<98:01:33, 103.85s/ba]
#2: 24%|██▍ | 847/3513 [3:20:43<11:06:49, 15.01s/ba]
#1: 37%|███▋ | 1287/3513 [3:20:52<6:19:02, 10.22s/ba]
#0: 72%|███████▏ | 2546/3513 [3:20:52<1:51:03, 6.89s/ba]
#3: 18%|█▊ | 617/3513 [3:20:36<15:50:30, 19.69s/ba]
#0: 73%|███████▎ | 2547/3513 [3:20:59<1:50:25, 6.86s/ba]
#1: 37%|███▋ | 1288/3513 [3:21:02<6:21:13, 10.28s/ba]
#7: 7%|▋ | 252/3513 [3:20:09<44:09:03, 48.74s/ba]
#12: 4%|▍ | 144/3513 [3:19:19<78:00:54, 83.36s/ba]
#4: 14%|█▍ | 494/3513 [3:20:37<20:46:06, 24.77s/ba]
#0: 73%|███████▎ | 2548/3513 [3:21:06<1:49:26, 6.80s/ba]
#2: 24%|██▍ | 848/3513 [3:20:58<11:06:17, 15.00s/ba]
```
Here is my script related to the datasets processing,
```
from pathlib import Path
from datasets import load_dataset
from transformers import RobertaTokenizerFast

tokenizer = RobertaTokenizerFast.from_pretrained(args.model_name_or_path, cache_dir=args.cache_dir)
def token_encode(examples):
tokenizer_out = tokenizer(examples['text'], truncation=True, padding="max_length", add_special_tokens=True, max_length=128)
return tokenizer_out
def HG_Datasets(tokenizer, file_path, args):
path = Path(file_path)
files = sorted(path.glob('*'))
 dataset = load_dataset('./text.py', data_files=files, cache_dir="./", split="train")
dataset = dataset.map(token_encode, batched=True, batch_size = 20000, num_proc = 16)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
return dataset
```
I have 1,123,870,657 lines in total in the path. I split the large file into 440 small files. Each file has 2,560,000 lines.
Could you please give any suggestion? Thanks very much!! | I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa from scratch using transformers, but I got OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
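One pattern that avoids the `KeyError` is to request the `train` split directly, so that a `Dataset` (not a `DatasetDict`) is handed to the `DataLoader`, and to tokenize with fixed-length padding so the default collate function can stack the batch. This is only a sketch: it assumes `test.txt` exists locally and that the `bert-base-uncased` tokenizer can be downloaded.
```python
import torch
from datasets import load_dataset
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# split="train" returns a Dataset, so integer row indexing works inside the DataLoader
dataset = load_dataset("text", data_files="test.txt", cache_dir="./", split="train")
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)
dataset.set_format(type="torch", columns=["input_ids"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
batch = next(iter(dataloader))  # {"input_ids": tensor of shape (8, 128)}
```
Because every example is padded to the same length, no custom `collate_fn` is needed.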
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1 | 198 | Load text file for RoBERTa pre-training.
I migrate my question from https://github.com/huggingface/transformers/pull/4009#issuecomment-690039444
I tried to train a RoBERTa from scratch using transformers, but I got OOM issues when loading a large text file.
Following the suggestion from @thomwolf, I tried to use `datasets` to load my text file. This test.txt is a simple sample where each line is a sentence.
```
from datasets import load_dataset
dataset = load_dataset('text', data_files='test.txt',cache_dir="./")
dataset.set_format(type='torch',columns=["text"])
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
next(iter(dataloader))
```
But the dataloader cannot yield a sample and the error is:
```
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-12-388aca337e2f> in <module>
----> 1 next(iter(dataloader))
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
361
362 def __next__(self):
--> 363 data = self._next_data()
364 self._num_yielded += 1
365 if self._dataset_kind == _DatasetKind.Iterable and \
/Library/Python/3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
401 def _next_data(self):
402 index = self._next_index() # may raise StopIteration
--> 403 data = self._dataset_fetcher.fetch(index) # may raise StopIteration
404 if self._pin_memory:
405 data = _utils.pin_memory.pin_memory(data)
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
/Library/Python/3.7/site-packages/torch/utils/data/_utils/fetch.py in <listcomp>(.0)
42 def fetch(self, possibly_batched_index):
43 if self.auto_collation:
---> 44 data = [self.dataset[idx] for idx in possibly_batched_index]
45 else:
46 data = self.dataset[possibly_batched_index]
KeyError: 0
```
`dataset.set_format(type='torch',columns=["text"])` returns a log that says:
```
Set __getitem__(key) output type to torch for ['text'] columns (when key is int or slice) and don't output other (un-formatted) columns.
```
I noticed the dataset is `DatasetDict({'train': Dataset(features: {'text': Value(dtype='string', id=None)}, num_rows: 44)})`.
Each sample can be accessed by `dataset["train"]["text"]` instead of `dataset["text"]`.
Could you please give me any suggestions on how to modify this code to load the text file?
Versions:
Python version 3.7.3
PyTorch version 1.6.0
TensorFlow version 2.3.0
datasets version: 1.0.1
Hi @thomwolf
I am using `RobertaTokenizerFast` now.
But the speed is still imbalanced: some processes are still slow.
Here is part of the log. #0 is always much faster than the lower-rank processes.
```
#15: 3%|▎ | 115/3513 [3:18:36<98:01:33, 103.85s/ba]
#2: 24%|██▍ | 847/3513 [3:20:43<11:06:49, 15.01s/ba]
#1: 37%|███▋ | 1287/3513 [3:20:52<6:19:02, 10.22s/ba]
#0: 72%|███████▏ | 2546/3513 [3:20:52<1:51:03, 6.89s/ba]
#3: 18%|█▊ | 617/3513 [3:20:36<15:50:30, 19.69s/ba]
#0: 73%|███████▎ | 2547/3513 [3:20:59<1:50:25, 6.86s/ba]
#1: 37%|███▋ | 1288/3513 [3:21:02<6:21:13, 10.28s/ba]
#7: 7%|▋ | 252/3513 [3:20:09<44:09:03, 48.74s/ba]
#12: 4%|▍ | 144/3513 [3:19:19<78:00:54, 83.36s/ba]
#4: 14%|█▍ | 494/3513 [3:20:37<20:46:06, 24.77s/ba]
#0: 73%|███████▎ | 2548/3513 [3:21:06<1:49:26, 6.80s/ba]
#2: 24%|██▍ | 848/3513 [3:20:58<11:06:17, 15.00s/ba]
```
Here is my script related to the datasets processing,
```
from pathlib import Path
from datasets import load_dataset
from transformers import RobertaTokenizerFast

tokenizer = RobertaTokenizerFast.from_pretrained(args.model_name_or_path, cache_dir=args.cache_dir)
def token_encode(examples):
tokenizer_out = tokenizer(examples['text'], truncation=True, padding="max_length", add_special_tokens=True, max_length=128)
return tokenizer_out
def HG_Datasets(tokenizer, file_path, args):
path = Path(file_path)
files = sorted(path.glob('*'))
 dataset = load_dataset('./text.py', data_files=files, cache_dir="./", split="train")
dataset = dataset.map(token_encode, batched=True, batch_size = 20000, num_proc = 16)
dataset.set_format(type='torch', columns=['input_ids', 'attention_mask'])
return dataset
```
I have 1,123,870,657 lines in total in the path. I split the large file into 440 small files. Each file has 2,560,000 lines.
Could you please give any suggestion? Thanks very much!! | [
-0.235164091,
-0.2028223872,
-0.0119608492,
0.4084470868,
0.3831170797,
-0.1324438006,
0.513992548,
0.4229525626,
-0.1767199636,
0.0487691164,
-0.2432832569,
0.1370508373,
-0.144555822,
-0.389242053,
-0.0388360843,
-0.3020623922,
-0.1219131723,
0.192405805,
-0.4711350501,
-0.0367562622,
0.1829232574,
0.0783706829,
-0.1375161558,
0.1973747611,
-0.6427965164,
0.1069569662,
0.2162095755,
0.2214323729,
-0.0749238655,
-0.3002364635,
0.206228748,
0.0434493273,
0.5629115701,
0.795384407,
-0.0001255157,
0.1221124679,
0.2487566471,
-0.2410899699,
-0.3123826385,
-0.1114151031,
0.4535880983,
-0.2304579318,
0.1303865314,
-0.356226325,
0.0759143308,
-0.1436130702,
0.0986076966,
-0.0940504447,
0.5390228033,
0.3707354665,
0.0733966827,
0.4406837225,
-0.0109578297,
-0.0613044836,
0.3599252105,
0.1098704264,
-0.0299898908,
0.0779796094,
0.3455818892,
-0.2738673687,
-0.4754480422,
0.1371811181,
-0.157726258,
0.0799184963,
-0.0569340475,
0.127282083,
0.1293709427,
-0.0320418328,
0.043305859,
0.3101655543,
0.5464714766,
-0.1133197546,
-0.3787394166,
-0.3641909361,
-0.1538962722,
-0.1453787088,
0.3427858949,
0.0990867913,
-0.2295721471,
0.1603961885,
-0.1188361794,
0.0387963541,
-0.4248598218,
0.0132333189,
-0.1664131284,
0.3269482255,
-0.1371060908,
0.0592870116,
0.3322789371,
-0.0902750641,
0.2790077031,
0.0059214793,
0.1468863189,
0.3295744956,
-0.3641241193,
-0.0306546465,
-0.1202295721,
-0.3970665932,
0.2847501934,
0.0596283153,
0.2434103787,
0.15489842,
-0.1783423424,
-0.0348096155,
0.2422807366,
0.1821571738,
-0.1800769567,
0.2752803564,
0.2457242012,
0.3165403903,
-0.13992697,
-0.1407175809,
-0.424334079,
-0.2287805974,
0.0969795138,
-0.135422647,
0.1783900857,
-0.1036363915,
0.0023349598,
0.0298273452,
-0.0968156159,
-0.1359276921,
0.153466925,
0.3965757191,
-0.1335891187,
0.3401689827,
0.1560482383,
0.0891145915,
-0.2729741335,
-0.2191157043,
-0.0154706035,
-0.230926469,
-0.2201964259,
0.1377539933,
0.2425260544,
-0.0458628312,
0.2399179637,
-0.3079231381,
0.2493962944,
-0.1582968384,
-0.1100424677,
-0.3570912182,
0.013569003,
0.2636014819,
-0.1435480267,
0.2023341656,
0.1805707961,
-0.0803245232,
-0.0784903318,
0.3265269399,
-0.1811525971,
-0.4468551278,
0.1354889721,
0.061707411,
-0.1038748622,
-0.0352104641,
-0.0966455787,
0.0569573119,
0.4921214581,
-0.3197454512,
-0.0241777226,
-0.1626605988,
-0.1030525193,
0.0550208092,
0.2268253267,
0.5365383625,
-0.1438212395,
0.0203773826,
-0.0253133401,
-0.0516013131,
0.1682344377,
0.6376837492,
-0.2747703493,
0.5127100348,
-0.1360619962,
0.1061270088,
0.2240156531,
-0.0344659798,
-0.3717760742,
0.3262197375,
-0.0751039833,
0.0406258926,
0.2201386541,
-0.0307626724,
0.0520550311,
-0.0443849377,
0.2690665126,
0.2103518546,
0.1113787144,
0.1413727403,
-0.1208721548,
-0.1993711591,
-0.0637963414,
0.5253328085,
0.2333913445,
0.0096205138,
-0.2704215944,
0.0675344914,
0.2778607011,
-0.2674022615,
0.1534748822,
0.1924354732,
-0.204886511,
0.2004882991,
0.1048068032,
-0.1451075375,
-0.1432373077,
0.0395660996,
0.0148978382,
-0.0170756765,
-0.1478386223,
0.12225122,
-0.2059445828,
0.0733122975,
-0.1966184676,
0.0283276178,
-0.0339355767,
-0.0356451422,
-0.0964286774,
-0.1215628758,
-0.2722819149,
0.2359144688,
-0.1684349775,
0.181261003,
-0.2048250139,
0.1474656016,
-0.0961617976,
-0.2596965432,
-0.0846057758,
0.1086730957,
-0.1160763055,
-0.0557808094,
-0.1087905914,
0.1681587398,
-0.0896202251,
-0.1042776853,
-0.185377568,
0.0577389002,
0.1356100738,
-0.2802478075,
0.1246787906,
0.2198599875,
0.1722251922,
0.0123172179,
-0.1304284632,
0.2397198081,
-0.0239512883,
0.1762231141,
0.1257118881,
-0.1457847655,
0.2596860826,
-0.1022093669,
-0.1274694502,
0.1805508733,
0.4158623815,
0.2873033285,
0.2258877009,
0.0142893177,
-0.1222362891,
-0.5006822348,
0.2749260962,
-0.1510629058,
0.0672701299,
0.2920909524,
-0.377589196,
0.0398273915,
-0.2509223521,
-0.254042685,
0.3311232626,
0.0367436931,
-0.0564248934,
-0.0796816871,
0.0952708125,
-0.1589732468,
0.1148996055,
0.2903904915,
0.0871189386,
0.4159712791,
0.0195511356,
-0.0128613953,
-0.2850964665,
-0.2793557048,
-0.0324505493,
0.0772640407,
-0.2523701191,
0.3918678761,
0.1817437559,
0.1658735871,
-0.3981607556,
-0.066521138,
-0.0594438165,
0.1600899994,
-0.0328795835,
0.22458902,
-0.1531230807,
0.1375097334,
0.2761564255,
0.2386559844,
0.5310093164,
-0.3450524807,
-0.0228328258,
-0.2657009661,
-0.1711075008,
-0.0281732585,
0.1564719528,
-0.3800227046,
-0.022734087,
-0.159381032,
0.04143776,
-0.0410516858,
-0.2128794342,
0.1461290568,
0.0950182974,
-0.0657052025,
-0.0873158723,
0.3188766539,
0.2235689461,
-0.2584021986,
0.1652871817,
-0.3514512777,
-0.0836748183,
0.1345466226,
-0.0258615538,
-0.0039211996,
0.1339136064,
-0.4912371635,
-0.0777524039,
-0.4011792839,
0.2066591978,
0.25366202,
0.119074814,
0.2927313745,
0.3777470291,
0.2591687143,
-0.1536991894,
0.2036134005,
0.0069755688,
-0.273950994,
0.6016746163,
0.0372794531,
-0.3855608106,
-0.2735379636,
-0.1876646578,
-0.0212984048,
0.1292123795,
-0.5017557144,
0.2092531323,
-0.1055912524,
-0.0392209068,
-0.2687578797,
0.2106822133,
0.3517276049,
-0.0582845509,
0.02200168,
0.0811397806,
-0.1444354355,
0.0589839257,
0.0612705946,
0.2923365831,
-0.2327817529,
0.0906196386,
-0.1289010197,
0.7936115265,
0.0342537835,
-0.0950503349,
0.1772329211,
0.0900983587,
0.1697514355,
-0.1435039937,
-0.1156518757,
-0.2848764658,
-0.334931314,
-0.0750080645,
0.3109187186,
0.0489612967,
-0.2725129724,
-0.1710483134,
0.0355695039,
0.1201005653,
-0.3826603889,
0.5341356397,
0.0394278765,
0.1441117078,
-0.1620216668,
0.0803807378,
-0.2024175227,
-0.0343527496,
-0.1732663959,
0.2794303596,
0.1588231921,
-0.1944966465,
-0.3677432537,
0.2286982834,
-0.1797412485,
-0.0017903298,
0.1497698575,
0.2188694924,
0.1418789178,
-0.4290395379,
0.0551722385,
-0.1176696867,
0.5121042728,
0.4852637947,
-0.1927080154,
-0.0360224657,
-0.1814630926,
-0.3889468312,
-0.1276861131,
-0.1543959677,
0.1139599085,
0.2410914302,
0.3773937225,
-0.2511387467,
-0.0662890375,
0.2503162324,
0.1687660515,
-0.1369205862,
-0.3428622484,
-0.2601179481,
-0.121140331,
-0.556635499,
0.0615773425,
0.0695771426,
0.234084785,
-0.1890151501,
0.1321267784,
0.0574261099,
0.0725402683,
0.1174270809,
0.0908756256,
0.1282272637,
-0.2521229088,
0.1319790632,
0.39874807,
-0.0123186558,
0.0352785662,
0.4964231849,
0.1588191092,
-0.3698640168,
0.0866977274,
0.0357627161,
0.1273486763,
0.3553582132,
-0.017152749,
0.0441157557,
0.2130617946,
0.1465010047,
-0.29508394,
0.2786145806,
0.0300536305,
-0.0124506429,
-0.4403938651,
-0.3575324416,
0.431155175,
-0.2055300027,
0.1844107062,
-0.1050694138,
-0.1353553534,
-0.0892197713,
0.4647260308,
-0.1015501618,
1.112431407,
-0.3223901689,
0.598732233,
-0.0135057494,
0.443911314,
0.2239234447,
-0.4978277981,
0.124543041,
-0.4053941071,
-0.1702316999,
0.0305953771,
-0.1809031367,
0.0319688469,
0.225775972,
-0.2543109655,
0.1726709306,
0.2434325963,
0.1995403767,
0.113698788,
0.0503353477,
-0.129824847,
-0.4798261225,
-0.5485133529,
-0.0774982721,
-0.0896054506,
0.3039266765,
-0.0064559914,
-0.0907664746,
-0.1625552922,
-0.2407624125,
-0.1279715598,
0.0543007925,
-0.1000749841,
0.1193539128,
0.2212516367,
-0.2080655992,
0.2713834643,
0.4447632134,
0.0335665978,
0.2296934724,
-0.1550180912,
0.0855644047,
-0.3620640635,
-0.291254729,
-0.1490074545,
-0.0249444172,
0.3309187293,
0.1130103022,
-0.203044638,
0.0722929537,
-0.0686617866,
-0.2512927651,
0.1684598178,
-0.0013507567,
0.3717736602,
-0.5535948277,
-0.0597359166,
-0.0040133819,
-0.24792476,
-0.0405745879,
0.0385404155,
0.2612252831,
-0.2834444344,
0.1107983887,
0.3466925323,
-0.2172759473,
-0.0156598128,
0.641054213,
-0.025419388,
-0.0714917034,
0.6801123023,
0.3254792392,
-0.1144691184,
-0.1269175112,
0.0653539374,
0.1271480173,
-0.8447716832,
0.1113623679,
-0.1274240464,
-0.1004358903,
-0.075664714,
0.0183763523,
0.2501240969,
-0.3937137127,
-0.1317996085,
-0.34710899,
-0.8210462332,
0.17250067,
-0.0540784597,
0.058346726,
0.1969920397,
0.0994357765,
-0.0679673702,
0.2009987682,
-0.1649546921,
0.2213949859,
-0.2336619496,
0.2557602823,
0.4548302889,
-0.4371103942,
0.1852939725,
0.1128204167,
0.016250629,
0.2322521806,
-0.0405182056,
-0.1479801834,
-0.2513855398,
0.1854471117,
0.1747974008,
-0.3302827179,
-0.0397496335,
-0.1643349379,
-0.0223630648,
-0.3079045713,
0.154282555,
0.00383313,
0.0557833835,
0.1906658709,
0.1657610536,
0.2118629068,
-0.5373832583,
0.1133679599,
-0.2964809537,
0.2011528909,
0.2831172645,
-0.0739611983,
-0.086749129,
-0.0841353238,
-0.2059237361,
0.2081244588,
0.7083231211,
0.0150300488,
0.3763869107,
-0.356171757,
0.1149774641,
-0.0224564373,
0.1344816685,
0.5138251781,
-0.3078081608,
0.0999429449,
0.3407734036,
0.1181183159,
-0.0901403502,
-0.0774589479,
0.020808164,
-0.4548584521,
0.0252232328,
0.2397968471,
-0.1746070236,
0.1946329921,
0.1334869266,
0.2194306701,
0.1392873079,
0.0803818852,
0.0127500184,
0.4389441609,
-0.056777969,
0.0051424117,
0.3662574589,
0.1114441082,
0.2706190348,
0.6254374385,
-0.2381745279,
0.2555935681,
-0.2969623506,
0.2216149271,
-0.0832466632,
-0.5557147264,
-0.1067689806,
0.2992468774,
0.2346815169,
-0.0115801021,
-0.1024337262,
0.2693866193,
-0.0783937946,
0.4590186179,
-0.241999954,
-0.0667977184,
-0.1246633828,
-0.3170199692,
0.0169499889,
-0.1958135068,
-0.0729701743,
0.079974547,
-0.2137754858,
0.0734718293,
0.0290788617,
0.2068384439,
-0.0088843592,
-0.3853548765,
-0.0500595719,
0.200718224,
-0.0043408386,
-0.3296884298,
0.2033613473,
0.3960947692,
0.0450604446,
0.2269716114,
0.1732404083,
0.4248793721,
0.2449598908,
-0.2228326052,
0.2395239472,
-0.259383738,
-0.1199422628,
-0.0664997771,
0.3457727134,
0.225268364,
0.2683987916,
0.173575893,
0.0053323917,
-0.1643179953,
0.2192570567,
-0.0700118095,
-0.1345787346,
-0.1802101135,
0.1774027944,
0.1630973518,
-0.3449828029,
-0.2759724259,
-0.0461292937,
-0.3151616454,
0.2501930296,
0.4753974378,
-0.0478397869,
0.1418462992,
0.1386897266,
-0.0118243191,
0.0551332571,
0.6322230697,
0.4257953167,
0.2179271281,
-0.3433358967,
0.1284048855,
-0.3904693127,
0.064446561,
0.0279970244,
0.2212459743,
0.0685960203,
0.123556152,
0.1988861561,
0.2944617867,
0.0243466683,
-0.6205517054,
0.2357998192,
0.1450714767,
-0.1094392687,
-0.2260715365,
0.242035687,
0.262899071,
-0.0525476336,
-0.554404676,
0.0243349336,
0.1460000873,
-0.0373799726,
-0.2378274202,
0.081512928,
0.05539985,
0.0964973718,
0.1544814408,
-0.1375415623,
0.5417481661,
-0.0090197995,
-0.0410703309,
-0.2688593566,
-0.1199020445,
-0.0654937848,
-0.0751952454,
0.4655779898,
0.5816169977,
-0.2886755764,
-0.0158441793,
-0.3237223625,
-0.0191356484,
-0.0375782587,
-0.2409425974,
-0.369329989,
-0.0854670778,
-0.0077547431,
0.0852635652,
-0.0350229852,
0.3017436564,
-0.1090993881,
-0.0184358507,
-0.4023271501,
-0.219107151,
0.4515859485,
-0.3525936306,
-0.3459249139,
0.0132851154,
-0.1101942882,
0.1301021725,
0.2072258145,
-0.7693994641,
0.4182785153,
0.2454118729,
-0.1145419702,
-0.5120353103,
0.2250450552,
0.0633467883,
-0.0521319434,
-0.2652809918,
0.6368250251,
0.0585999936,
-0.1705970168,
-0.2721942663,
-0.296387881
] |
https://github.com/huggingface/datasets/issues/600 | Pickling error when loading dataset | I wasn't able to reproduce on google colab (python 3.6.9 as well) with
pickle==4.0
dill=0.3.2
transformers==3.1.0
datasets=1.0.1 (also tried nlp 0.4.0)
If I try
```python
from datasets import load_dataset # or from nlp
from transformers import BertTokenizer
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
dataset = load_dataset("text", data_files=file_path, split="train")
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True,
truncation=True, max_length=512), batched=True)
dataset.set_format(type='torch', columns=['input_ids'])
```
It runs without error | Hi,
I modified line 136 in the original [run_language_modeling.py](https://github.com/huggingface/transformers/blob/master/examples/language-modeling/run_language_modeling.py) as:
```
# line 136: return LineByLineTextDataset(tokenizer=tokenizer, file_path=file_path, block_size=args.block_size)
dataset = load_dataset("text", data_files=file_path, split="train")
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True,
truncation=True, max_length=args.block_size), batched=True)
dataset.set_format(type='torch', columns=['input_ids'])
return dataset
```
When I run this with transformers (3.1.0) and nlp (0.4.0), I get the following error:
```
Traceback (most recent call last):
File "src/run_language_modeling.py", line 319, in <module>
main()
File "src/run_language_modeling.py", line 248, in main
get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None
File "src/run_language_modeling.py", line 139, in get_dataset
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True, truncation=True, max_length=args.block_size), batched=True)
File "/data/nlp/src/nlp/arrow_dataset.py", line 1136, in map
new_fingerprint=new_fingerprint,
File "/data/nlp/src/nlp/fingerprint.py", line 158, in wrapper
self._fingerprint, transform, kwargs_for_fingerprint
File "/data/nlp/src/nlp/fingerprint.py", line 105, in update_fingerprint
hasher.update(transform_args[key])
File "/data/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/data/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/data/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/data/nlp/src/nlp/utils/py_utils.py", line 362, in dumps
dump(obj, file)
File "/data/nlp/src/nlp/utils/py_utils.py", line 339, in dump
Pickler(file, recurse=True).dump(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 446, in dump
StockPickler.dump(self, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 409, in dump
self.save(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1438, in save_function
obj.__dict__, fkwdefaults), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1170, in save_cell
pickler.save_reduce(_create_cell, (f,), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 605, in save_reduce
save(cls)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1365, in save_type
obj.__bases__, _dict), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 507, in save
self.save_global(obj, rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 927, in save_global
(obj, module_name, name))
_pickle.PicklingError: Can't pickle typing.Union[str, NoneType]: it's not the same object as typing.Union
``` | 61 | Pickling error when loading dataset
Hi,
I modified line 136 in the original [run_language_modeling.py](https://github.com/huggingface/transformers/blob/master/examples/language-modeling/run_language_modeling.py) as:
```
# line 136: return LineByLineTextDataset(tokenizer=tokenizer, file_path=file_path, block_size=args.block_size)
dataset = load_dataset("text", data_files=file_path, split="train")
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True,
truncation=True, max_length=args.block_size), batched=True)
dataset.set_format(type='torch', columns=['input_ids'])
return dataset
```
When I run this with transformers (3.1.0) and nlp (0.4.0), I get the following error:
```
Traceback (most recent call last):
File "src/run_language_modeling.py", line 319, in <module>
main()
File "src/run_language_modeling.py", line 248, in main
get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None
File "src/run_language_modeling.py", line 139, in get_dataset
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True, truncation=True, max_length=args.block_size), batched=True)
File "/data/nlp/src/nlp/arrow_dataset.py", line 1136, in map
new_fingerprint=new_fingerprint,
File "/data/nlp/src/nlp/fingerprint.py", line 158, in wrapper
self._fingerprint, transform, kwargs_for_fingerprint
File "/data/nlp/src/nlp/fingerprint.py", line 105, in update_fingerprint
hasher.update(transform_args[key])
File "/data/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/data/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/data/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/data/nlp/src/nlp/utils/py_utils.py", line 362, in dumps
dump(obj, file)
File "/data/nlp/src/nlp/utils/py_utils.py", line 339, in dump
Pickler(file, recurse=True).dump(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 446, in dump
StockPickler.dump(self, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 409, in dump
self.save(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1438, in save_function
obj.__dict__, fkwdefaults), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1170, in save_cell
pickler.save_reduce(_create_cell, (f,), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 605, in save_reduce
save(cls)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1365, in save_type
obj.__bases__, _dict), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 507, in save
self.save_global(obj, rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 927, in save_global
(obj, module_name, name))
_pickle.PicklingError: Can't pickle typing.Union[str, NoneType]: it's not the same object as typing.Union
```
I wasn't able to reproduce on google colab (python 3.6.9 as well) with
pickle==4.0
dill=0.3.2
transformers==3.1.0
datasets=1.0.1 (also tried nlp 0.4.0)
If I try
```python
from datasets import load_dataset # or from nlp
from transformers import BertTokenizer
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
dataset = load_dataset("text", data_files=file_path, split="train")
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True,
truncation=True, max_length=512), batched=True)
dataset.set_format(type='torch', columns=['input_ids'])
```
It runs without error | [
-0.2029626071,
-0.191864118,
0.1510747075,
0.2650443912,
0.1417570561,
-0.1736163646,
0.2733697891,
0.3465306461,
0.0597779304,
-0.0787870735,
0.0942318216,
0.3698832989,
-0.2363301367,
0.0360411257,
0.086170882,
-0.4138215482,
-0.0133401863,
0.1228953525,
-0.1728637516,
0.0111280903,
-0.1992152929,
0.3199379444,
-0.22430031,
0.1471234262,
-0.5054330826,
-0.0055766646,
0.0756162629,
0.1886182874,
-0.0382537693,
-0.3906204104,
0.3578637838,
-0.0553876683,
0.2542708516,
0.4166390002,
-0.0001185715,
0.1009834036,
0.4450069368,
-0.1613718569,
-0.2186081707,
-0.5004020333,
0.3419600129,
-0.1968301237,
0.195206359,
-0.1194227338,
-0.1389637142,
0.0235720389,
0.0966474935,
0.0984589905,
0.3546353877,
0.2370630205,
0.2198282182,
0.7399454713,
0.0196056366,
-0.0040007085,
-0.1397821754,
0.3455083966,
-0.0702358857,
0.1682085395,
0.0740607232,
0.0049475208,
-0.2505332232,
0.2688039839,
0.0097402558,
-0.1106546968,
0.0340494365,
0.04309351,
-0.0089142062,
-0.0071582589,
0.2315975279,
0.1808645427,
0.3163923025,
-0.1332210302,
-0.4018166959,
-0.4381491244,
-0.0321019664,
-0.253823638,
0.3569170237,
0.0049060769,
-0.1719965041,
-0.0149834994,
-0.1687417626,
-0.2313759625,
0.0281378329,
0.3235932887,
0.1266870648,
0.6207985282,
-0.0286826808,
0.2285000831,
0.3370735049,
-0.1880075186,
0.1313725859,
-0.027758766,
0.0349621661,
0.4023726285,
-0.2388503999,
-0.1197173521,
0.0664065853,
-0.1022803336,
0.0406201109,
0.0130148735,
-0.3214133978,
0.0603421368,
-0.2768704295,
-0.0223204345,
0.2950416803,
0.2889094055,
0.0581619292,
0.5016821027,
-0.0262935013,
-0.1388158202,
0.2159737647,
0.0168191716,
-0.1241563112,
-0.2818192542,
-0.1952975839,
0.097701773,
0.0427058637,
-0.252340734,
-0.0277268961,
-0.2434886396,
-0.0807856098,
0.0121761151,
0.1265030503,
0.4822337627,
0.0803426206,
0.1158536077,
0.0164869428,
0.1796440929,
-0.418432951,
-0.2533633709,
-0.1892675459,
0.0220431387,
-0.3372351527,
-0.1492153853,
0.2223520875,
0.1535899192,
0.3354985714,
-0.1102678925,
-0.0604241081,
0.1364388615,
-0.0179688558,
-0.1492114365,
-0.0685900748,
0.2175128758,
-0.2264652103,
0.1727864295,
0.2571264207,
-0.1251235455,
-0.2969125509,
0.1636128724,
-0.0111902989,
-0.253051728,
-0.1209304184,
0.0851274952,
-0.1824114621,
-0.2599700987,
-0.3591950238,
0.2333444953,
0.3632025719,
-0.2630691826,
-0.1501653939,
-0.3836105764,
-0.1859427094,
-0.1210466325,
0.1095167398,
0.6124694347,
-0.10491658,
-0.4261672795,
-0.1083564162,
-0.1307387054,
0.3863958716,
0.4650295377,
-0.4532152116,
0.1536078155,
-0.2186063528,
0.569476068,
0.3836513162,
-0.1736367345,
-0.2988277376,
0.2628950179,
-0.2244314849,
0.2072925568,
-0.0707562864,
-0.1237978488,
-0.1637936831,
0.057843633,
-0.1647841781,
0.3448472917,
0.1585468203,
0.2666155696,
-0.2812249362,
-0.292591691,
0.4360115528,
0.292026341,
0.2161482722,
0.1050297618,
-0.1254454404,
0.3208657503,
0.2130340487,
-0.2217202038,
0.1912668645,
0.0267761014,
0.022275107,
-0.157406196,
-0.0328533649,
0.0872847885,
-0.6515650153,
0.1748920232,
-0.2471688539,
0.3482393622,
-0.3800008893,
0.122223109,
-0.0670935288,
-0.1169425249,
-0.3418496549,
-0.2124997526,
0.0444623828,
0.1835246235,
-0.0876025707,
-0.0400481112,
-0.0541979223,
0.0558040291,
0.0123566762,
0.1081895754,
-0.4298700988,
0.1635246575,
-0.1746212989,
-0.3540353775,
-0.05868515,
0.2108430713,
0.2123283595,
-0.2704419494,
-0.2660780549,
0.2112216055,
0.0572889075,
0.2215669751,
0.0190084111,
-0.0050374269,
0.1972182095,
-0.1928326786,
0.0177531186,
0.355889231,
0.0182692222,
0.0179571956,
0.084220767,
0.3661317825,
0.0188052338,
0.0989182964,
0.1062776893,
0.0629076138,
0.1784607172,
-0.0767169297,
0.163967669,
-0.1629997939,
0.3003992736,
0.0798102021,
0.4254424572,
0.0218043607,
-0.3364064097,
-0.2099475563,
0.4499590397,
0.0788154602,
0.1665170193,
0.3088003993,
-0.0811552405,
-0.1771025956,
-0.0583941266,
0.0550889671,
0.2792567611,
0.0804544017,
-0.0513549112,
0.0372812599,
-0.0395420305,
-0.1005214006,
0.177628085,
0.101741001,
-0.0386785306,
0.3335557282,
-0.0069131404,
0.0972454622,
-0.43229267,
-0.284920603,
0.1184079647,
0.3154611588,
-0.213056922,
0.1573488116,
-0.1570317149,
-0.0666514039,
-0.4013565183,
-0.2381339222,
-0.476208061,
-0.1907107532,
-0.2355119139,
0.1614568383,
-0.1278425157,
0.2946344018,
-0.0560880415,
0.0913452879,
0.0177858844,
-0.0834000409,
-0.1864182651,
-0.2087881714,
-0.4619051814,
-0.0359910801,
0.2449777722,
0.0210265964,
0.1043825224,
0.1237275004,
-0.0020177364,
0.1463042647,
-0.35873878,
0.2713181376,
0.0121463519,
0.187656939,
0.1870480031,
0.0780556649,
-0.1918314695,
-0.4377752542,
0.2089123428,
-0.0625456721,
-0.2025258541,
0.0551075116,
0.096565038,
-0.0389923677,
-0.1555089056,
-0.3513686061,
-0.3187610507,
-0.4107225239,
0.1541297883,
-0.106872201,
0.1926856637,
0.6526252031,
-0.040500775,
0.3030757904,
-0.2392678112,
0.0254839398,
-0.1506413519,
0.1499269009,
0.0620459355,
-0.1284029335,
-0.3454410136,
-0.3472177684,
-0.0850818008,
0.2574237883,
-0.0885444805,
-0.2285955548,
0.3524973989,
-0.1272809803,
-0.0153183416,
-0.0792653561,
-0.0165646784,
0.437337935,
0.1797024608,
0.0240605772,
-0.0108515471,
-0.025229305,
0.2277864218,
-0.0137910219,
0.39828372,
-0.0115690213,
0.1218837649,
0.0862019956,
0.8433318138,
0.1893254369,
-0.3235137463,
0.3949733377,
0.3149034977,
-0.0733483881,
-0.2566161454,
-0.3480522931,
-0.0821665227,
-0.0401235148,
0.173765257,
0.2232932746,
-0.0486485772,
-0.371719569,
-0.0359345861,
-0.1308296919,
-0.1652548909,
-0.3224000335,
0.2523367405,
-0.1592558622,
0.2457666695,
0.0005499423,
0.0926777795,
-0.2677296102,
0.014488861,
0.0368716121,
0.2266471237,
0.1431091279,
0.0896597058,
-0.5136069059,
-0.0663284287,
-0.4781810045,
0.2796217203,
0.1847024858,
0.5642687678,
-0.1529384702,
-0.1680026203,
0.1045318246,
-0.0186035782,
0.7028647661,
0.0230170507,
-0.0897497609,
0.1123476028,
-0.2468472123,
-0.3765422106,
0.0356232896,
-0.0458275452,
0.1709310114,
0.3891172111,
0.5012795329,
-0.4591731429,
-0.3562015891,
0.0085629541,
-0.0293876864,
-0.0819564313,
-0.201849103,
-0.1864292324,
0.0265048221,
-0.2668230236,
0.1215892881,
-0.0974667966,
0.3548803627,
0.0250715241,
0.29824844,
-0.0434096083,
-0.134286955,
-0.1364928782,
0.1130338758,
0.1991305053,
-0.0183562301,
-0.0332276635,
0.3735995293,
0.1696553379,
-0.2558123469,
0.2703851759,
0.0274355933,
-0.2911649942,
0.0742890537,
0.2565268874,
0.1025480479,
0.2148577273,
0.1469195634,
0.2874315977,
0.113722235,
0.0473640263,
0.0363877751,
0.2269978672,
0.2227708697,
0.1510109603,
-0.3832802773,
-0.2247339934,
0.4746956825,
-0.2865242362,
0.014801085,
0.2330477834,
-0.1075508744,
-0.331222415,
0.6692475677,
-0.0216855183,
1.0043594837,
0.1279560924,
0.2349966168,
0.3570193052,
0.0152031705,
0.4015215635,
0.0753976032,
0.0922101289,
-0.4636112452,
0.1191377938,
-0.0363175049,
-0.0786610544,
0.1102758721,
0.2392286956,
-0.1544450521,
0.1322178394,
-0.0604647323,
-0.2274058461,
0.1711774468,
0.4276745617,
0.1397118866,
-0.4334683418,
-0.2043322772,
0.0076697506,
-0.2661411166,
0.5549463034,
-0.0314721502,
-0.2316811532,
-0.0094626993,
-0.3618115187,
-0.2293411493,
0.3019568622,
-0.099499397,
0.3744978905,
0.6007330418,
-0.1288868636,
0.4571171403,
0.1026241481,
0.2113720775,
0.0299901366,
-0.2630378306,
0.192215085,
-0.0337363407,
-0.0982527658,
0.0589330681,
0.0264071897,
0.3566519916,
0.0146896988,
-0.2336517572,
-0.0800438747,
-0.1291652173,
-0.2312930822,
-0.0259568244,
-0.1611710489,
0.0190383084,
-0.1472083479,
-0.3155694902,
0.0277753994,
0.0255548507,
-0.164984107,
0.0687859356,
0.0484747514,
-0.244903475,
-0.2078677714,
0.3131468594,
-0.2475469112,
-0.0739669353,
0.481036067,
-0.0992322415,
-0.0194899663,
0.6444161534,
0.3653109074,
-0.2062574923,
-0.1943456829,
-0.08366175,
0.1650458127,
-0.8099957108,
-0.0325144939,
-0.0016930513,
-0.2046626508,
-0.0655460656,
0.5908413529,
0.2419276386,
0.0629567951,
-0.1193099916,
-0.3331794143,
-0.3890790641,
0.2196121663,
-0.2222794145,
0.1357289255,
0.0955775231,
0.3911069334,
-0.034744788,
0.1781406701,
-0.2182644904,
-0.0300162248,
-0.2331593335,
0.2521229982,
0.2451466024,
-0.4850664437,
0.225737676,
0.0978111103,
0.0273079947,
0.1323978305,
-0.313706696,
-0.1233846918,
-0.4189224541,
0.1245739684,
0.2815956175,
-0.286385715,
-0.09428408,
0.1666616797,
0.1116884649,
-0.1594537795,
0.0924318209,
-0.0615468919,
-0.1313706636,
0.2736980915,
0.1938244104,
0.053395085,
-0.076833427,
0.0873869583,
-0.1187308431,
-0.3134465218,
0.2363166809,
0.2007079422,
-0.196581617,
0.0685269758,
-0.2107043564,
0.0794221237,
0.312517643,
0.1303778291,
0.0038125012,
-0.3625320494,
0.1793458909,
-0.036080651,
0.076100111,
0.0311516132,
-0.1425474137,
-0.2864572406,
0.4078353047,
0.1651120037,
-0.2894611657,
-0.2251745909,
0.0795920193,
0.0420676693,
-0.0174696818,
0.0502239019,
0.0443149656,
0.0742177814,
0.0834892243,
0.0840105265,
0.4601435065,
-0.2134632766,
0.2142042965,
0.3518062532,
-0.1893409491,
0.1552313864,
0.464518398,
0.0193827413,
0.139503032,
0.2465682477,
-0.2924770415,
0.2774635255,
-0.2668266892,
0.2905974984,
0.0077865031,
-0.5513209105,
-0.1049559787,
0.2481088489,
0.1737881899,
0.1777733415,
-0.0665572733,
0.4451243579,
-0.1363946497,
-0.1493588388,
-0.048861526,
0.3371510208,
-0.271699369,
-0.3933607042,
-0.1717555225,
0.1165925413,
-0.3949106932,
-0.0003244951,
-0.1727863997,
-0.0516138263,
0.1566631049,
0.1176722944,
-0.1058414951,
-0.3330610394,
-0.051996097,
0.0413667113,
-0.1215797663,
-0.2785531282,
0.3481363356,
-0.0297524109,
0.0341185592,
-0.1454358697,
0.1331012547,
0.286752224,
0.2867029607,
-0.3025154471,
-0.1334830523,
-0.1729677469,
0.1225763708,
-0.0304383375,
0.349045366,
0.1919638515,
0.468568027,
0.3146243691,
0.0043296888,
-0.1563209593,
-0.1256594807,
0.1083557084,
0.1264603287,
-0.0214297473,
0.5825882554,
-0.1685423255,
-0.0980616957,
-0.2602687478,
0.1421923488,
-0.4087180495,
0.3401705623,
0.2832736075,
0.0236684456,
0.1059989557,
-0.0358007178,
0.0434373021,
0.1233691871,
0.5698879361,
0.3689297736,
0.1243643463,
-0.5338955522,
-0.0682754815,
-0.4477843344,
0.0827897191,
0.1939661652,
0.1000258327,
0.0117925256,
0.0065875128,
0.0200606864,
-0.0266445745,
-0.3352199495,
0.3204244375,
0.0809504017,
0.0154611086,
-0.0622801371,
0.1577233225,
-0.2189986855,
0.0669479817,
0.0925960243,
-0.466753602,
0.0259803142,
0.1531937569,
-0.0335599408,
-0.2191779464,
-0.1638471782,
0.0668926686,
0.0369469523,
-0.0208345167,
0.1326973438,
0.726259768,
-0.0420240089,
-0.1220181286,
-0.4175514579,
-0.4629764259,
-0.1109925285,
0.0634772927,
-0.1209396943,
0.6496317387,
-0.0621026605,
0.3009476364,
-0.4772536159,
0.1635540277,
-0.1076171473,
-0.1320490241,
-0.3146500587,
0.1489405632,
-0.245984301,
-0.0108385943,
0.1265087277,
0.3707411885,
0.122842133,
0.0308369026,
-0.3600006402,
-0.2014895231,
0.6447339654,
-0.7064893842,
-0.1418031305,
-0.0910699964,
0.0048359409,
0.2531349659,
-0.2619391382,
-0.7180588841,
-0.0282274783,
0.1826344132,
-0.0192401353,
-0.1775763333,
0.0684964806,
0.2308870703,
-0.0765094459,
-0.1325626224,
0.6567322612,
0.1076975167,
-0.220926851,
0.1161852181,
-0.1894452274
] |
https://github.com/huggingface/datasets/issues/600 | Pickling error when loading dataset | Closing since it looks like it's working on >= 3.6.9
Feel free to re-open if you have other questions :) | Hi,
I modified line 136 in the original [run_language_modeling.py](https://github.com/huggingface/transformers/blob/master/examples/language-modeling/run_language_modeling.py) as:
```
# line 136: return LineByLineTextDataset(tokenizer=tokenizer, file_path=file_path, block_size=args.block_size)
dataset = load_dataset("text", data_files=file_path, split="train")
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True,
truncation=True, max_length=args.block_size), batched=True)
dataset.set_format(type='torch', columns=['input_ids'])
return dataset
```
When I run this with transformers (3.1.0) and nlp (0.4.0), I get the following error:
```
Traceback (most recent call last):
File "src/run_language_modeling.py", line 319, in <module>
main()
File "src/run_language_modeling.py", line 248, in main
get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None
File "src/run_language_modeling.py", line 139, in get_dataset
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True, truncation=True, max_length=args.block_size), batched=True)
File "/data/nlp/src/nlp/arrow_dataset.py", line 1136, in map
new_fingerprint=new_fingerprint,
File "/data/nlp/src/nlp/fingerprint.py", line 158, in wrapper
self._fingerprint, transform, kwargs_for_fingerprint
File "/data/nlp/src/nlp/fingerprint.py", line 105, in update_fingerprint
hasher.update(transform_args[key])
File "/data/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/data/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/data/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/data/nlp/src/nlp/utils/py_utils.py", line 362, in dumps
dump(obj, file)
File "/data/nlp/src/nlp/utils/py_utils.py", line 339, in dump
Pickler(file, recurse=True).dump(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 446, in dump
StockPickler.dump(self, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 409, in dump
self.save(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1438, in save_function
obj.__dict__, fkwdefaults), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1170, in save_cell
pickler.save_reduce(_create_cell, (f,), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 605, in save_reduce
save(cls)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1365, in save_type
obj.__bases__, _dict), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 507, in save
self.save_global(obj, rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 927, in save_global
(obj, module_name, name))
_pickle.PicklingError: Can't pickle typing.Union[str, NoneType]: it's not the same object as typing.Union
``` | 20 | Pickling error when loading dataset
Hi,
I modified line 136 in the original [run_language_modeling.py](https://github.com/huggingface/transformers/blob/master/examples/language-modeling/run_language_modeling.py) as:
```
# line 136: return LineByLineTextDataset(tokenizer=tokenizer, file_path=file_path, block_size=args.block_size)
dataset = load_dataset("text", data_files=file_path, split="train")
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True,
truncation=True, max_length=args.block_size), batched=True)
dataset.set_format(type='torch', columns=['input_ids'])
return dataset
```
When I run this with transformers (3.1.0) and nlp (0.4.0), I get the following error:
```
Traceback (most recent call last):
File "src/run_language_modeling.py", line 319, in <module>
main()
File "src/run_language_modeling.py", line 248, in main
get_dataset(data_args, tokenizer=tokenizer, cache_dir=model_args.cache_dir) if training_args.do_train else None
File "src/run_language_modeling.py", line 139, in get_dataset
dataset = dataset.map(lambda ex: tokenizer(ex["text"], add_special_tokens=True, truncation=True, max_length=args.block_size), batched=True)
File "/data/nlp/src/nlp/arrow_dataset.py", line 1136, in map
new_fingerprint=new_fingerprint,
File "/data/nlp/src/nlp/fingerprint.py", line 158, in wrapper
self._fingerprint, transform, kwargs_for_fingerprint
File "/data/nlp/src/nlp/fingerprint.py", line 105, in update_fingerprint
hasher.update(transform_args[key])
File "/data/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/data/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/data/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/data/nlp/src/nlp/utils/py_utils.py", line 362, in dumps
dump(obj, file)
File "/data/nlp/src/nlp/utils/py_utils.py", line 339, in dump
Pickler(file, recurse=True).dump(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 446, in dump
StockPickler.dump(self, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 409, in dump
self.save(obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1438, in save_function
obj.__dict__, fkwdefaults), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1170, in save_cell
pickler.save_reduce(_create_cell, (f,), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 736, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 605, in save_reduce
save(cls)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 1365, in save_type
obj.__bases__, _dict), obj=obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 610, in save_reduce
save(args)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 751, in save_tuple
save(element)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "/root/miniconda3/envs/py3.6/lib/python3.6/site-packages/dill/_dill.py", line 933, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 847, in _batch_setitems
save(v)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 507, in save
self.save_global(obj, rv)
File "/root/miniconda3/envs/py3.6/lib/python3.6/pickle.py", line 927, in save_global
(obj, module_name, name))
_pickle.PicklingError: Can't pickle typing.Union[str, NoneType]: it's not the same object as typing.Union
```
Closing since it looks like it's working on Python >= 3.6.9
Feel free to re-open if you have other questions :) | [
-0.2029626071,
-0.191864118,
0.1510747075,
0.2650443912,
0.1417570561,
-0.1736163646,
0.2733697891,
0.3465306461,
0.0597779304,
-0.0787870735,
0.0942318216,
0.3698832989,
-0.2363301367,
0.0360411257,
0.086170882,
-0.4138215482,
-0.0133401863,
0.1228953525,
-0.1728637516,
0.0111280903,
-0.1992152929,
0.3199379444,
-0.22430031,
0.1471234262,
-0.5054330826,
-0.0055766646,
0.0756162629,
0.1886182874,
-0.0382537693,
-0.3906204104,
0.3578637838,
-0.0553876683,
0.2542708516,
0.4166390002,
-0.0001185715,
0.1009834036,
0.4450069368,
-0.1613718569,
-0.2186081707,
-0.5004020333,
0.3419600129,
-0.1968301237,
0.195206359,
-0.1194227338,
-0.1389637142,
0.0235720389,
0.0966474935,
0.0984589905,
0.3546353877,
0.2370630205,
0.2198282182,
0.7399454713,
0.0196056366,
-0.0040007085,
-0.1397821754,
0.3455083966,
-0.0702358857,
0.1682085395,
0.0740607232,
0.0049475208,
-0.2505332232,
0.2688039839,
0.0097402558,
-0.1106546968,
0.0340494365,
0.04309351,
-0.0089142062,
-0.0071582589,
0.2315975279,
0.1808645427,
0.3163923025,
-0.1332210302,
-0.4018166959,
-0.4381491244,
-0.0321019664,
-0.253823638,
0.3569170237,
0.0049060769,
-0.1719965041,
-0.0149834994,
-0.1687417626,
-0.2313759625,
0.0281378329,
0.3235932887,
0.1266870648,
0.6207985282,
-0.0286826808,
0.2285000831,
0.3370735049,
-0.1880075186,
0.1313725859,
-0.027758766,
0.0349621661,
0.4023726285,
-0.2388503999,
-0.1197173521,
0.0664065853,
-0.1022803336,
0.0406201109,
0.0130148735,
-0.3214133978,
0.0603421368,
-0.2768704295,
-0.0223204345,
0.2950416803,
0.2889094055,
0.0581619292,
0.5016821027,
-0.0262935013,
-0.1388158202,
0.2159737647,
0.0168191716,
-0.1241563112,
-0.2818192542,
-0.1952975839,
0.097701773,
0.0427058637,
-0.252340734,
-0.0277268961,
-0.2434886396,
-0.0807856098,
0.0121761151,
0.1265030503,
0.4822337627,
0.0803426206,
0.1158536077,
0.0164869428,
0.1796440929,
-0.418432951,
-0.2533633709,
-0.1892675459,
0.0220431387,
-0.3372351527,
-0.1492153853,
0.2223520875,
0.1535899192,
0.3354985714,
-0.1102678925,
-0.0604241081,
0.1364388615,
-0.0179688558,
-0.1492114365,
-0.0685900748,
0.2175128758,
-0.2264652103,
0.1727864295,
0.2571264207,
-0.1251235455,
-0.2969125509,
0.1636128724,
-0.0111902989,
-0.253051728,
-0.1209304184,
0.0851274952,
-0.1824114621,
-0.2599700987,
-0.3591950238,
0.2333444953,
0.3632025719,
-0.2630691826,
-0.1501653939,
-0.3836105764,
-0.1859427094,
-0.1210466325,
0.1095167398,
0.6124694347,
-0.10491658,
-0.4261672795,
-0.1083564162,
-0.1307387054,
0.3863958716,
0.4650295377,
-0.4532152116,
0.1536078155,
-0.2186063528,
0.569476068,
0.3836513162,
-0.1736367345,
-0.2988277376,
0.2628950179,
-0.2244314849,
0.2072925568,
-0.0707562864,
-0.1237978488,
-0.1637936831,
0.057843633,
-0.1647841781,
0.3448472917,
0.1585468203,
0.2666155696,
-0.2812249362,
-0.292591691,
0.4360115528,
0.292026341,
0.2161482722,
0.1050297618,
-0.1254454404,
0.3208657503,
0.2130340487,
-0.2217202038,
0.1912668645,
0.0267761014,
0.022275107,
-0.157406196,
-0.0328533649,
0.0872847885,
-0.6515650153,
0.1748920232,
-0.2471688539,
0.3482393622,
-0.3800008893,
0.122223109,
-0.0670935288,
-0.1169425249,
-0.3418496549,
-0.2124997526,
0.0444623828,
0.1835246235,
-0.0876025707,
-0.0400481112,
-0.0541979223,
0.0558040291,
0.0123566762,
0.1081895754,
-0.4298700988,
0.1635246575,
-0.1746212989,
-0.3540353775,
-0.05868515,
0.2108430713,
0.2123283595,
-0.2704419494,
-0.2660780549,
0.2112216055,
0.0572889075,
0.2215669751,
0.0190084111,
-0.0050374269,
0.1972182095,
-0.1928326786,
0.0177531186,
0.355889231,
0.0182692222,
0.0179571956,
0.084220767,
0.3661317825,
0.0188052338,
0.0989182964,
0.1062776893,
0.0629076138,
0.1784607172,
-0.0767169297,
0.163967669,
-0.1629997939,
0.3003992736,
0.0798102021,
0.4254424572,
0.0218043607,
-0.3364064097,
-0.2099475563,
0.4499590397,
0.0788154602,
0.1665170193,
0.3088003993,
-0.0811552405,
-0.1771025956,
-0.0583941266,
0.0550889671,
0.2792567611,
0.0804544017,
-0.0513549112,
0.0372812599,
-0.0395420305,
-0.1005214006,
0.177628085,
0.101741001,
-0.0386785306,
0.3335557282,
-0.0069131404,
0.0972454622,
-0.43229267,
-0.284920603,
0.1184079647,
0.3154611588,
-0.213056922,
0.1573488116,
-0.1570317149,
-0.0666514039,
-0.4013565183,
-0.2381339222,
-0.476208061,
-0.1907107532,
-0.2355119139,
0.1614568383,
-0.1278425157,
0.2946344018,
-0.0560880415,
0.0913452879,
0.0177858844,
-0.0834000409,
-0.1864182651,
-0.2087881714,
-0.4619051814,
-0.0359910801,
0.2449777722,
0.0210265964,
0.1043825224,
0.1237275004,
-0.0020177364,
0.1463042647,
-0.35873878,
0.2713181376,
0.0121463519,
0.187656939,
0.1870480031,
0.0780556649,
-0.1918314695,
-0.4377752542,
0.2089123428,
-0.0625456721,
-0.2025258541,
0.0551075116,
0.096565038,
-0.0389923677,
-0.1555089056,
-0.3513686061,
-0.3187610507,
-0.4107225239,
0.1541297883,
-0.106872201,
0.1926856637,
0.6526252031,
-0.040500775,
0.3030757904,
-0.2392678112,
0.0254839398,
-0.1506413519,
0.1499269009,
0.0620459355,
-0.1284029335,
-0.3454410136,
-0.3472177684,
-0.0850818008,
0.2574237883,
-0.0885444805,
-0.2285955548,
0.3524973989,
-0.1272809803,
-0.0153183416,
-0.0792653561,
-0.0165646784,
0.437337935,
0.1797024608,
0.0240605772,
-0.0108515471,
-0.025229305,
0.2277864218,
-0.0137910219,
0.39828372,
-0.0115690213,
0.1218837649,
0.0862019956,
0.8433318138,
0.1893254369,
-0.3235137463,
0.3949733377,
0.3149034977,
-0.0733483881,
-0.2566161454,
-0.3480522931,
-0.0821665227,
-0.0401235148,
0.173765257,
0.2232932746,
-0.0486485772,
-0.371719569,
-0.0359345861,
-0.1308296919,
-0.1652548909,
-0.3224000335,
0.2523367405,
-0.1592558622,
0.2457666695,
0.0005499423,
0.0926777795,
-0.2677296102,
0.014488861,
0.0368716121,
0.2266471237,
0.1431091279,
0.0896597058,
-0.5136069059,
-0.0663284287,
-0.4781810045,
0.2796217203,
0.1847024858,
0.5642687678,
-0.1529384702,
-0.1680026203,
0.1045318246,
-0.0186035782,
0.7028647661,
0.0230170507,
-0.0897497609,
0.1123476028,
-0.2468472123,
-0.3765422106,
0.0356232896,
-0.0458275452,
0.1709310114,
0.3891172111,
0.5012795329,
-0.4591731429,
-0.3562015891,
0.0085629541,
-0.0293876864,
-0.0819564313,
-0.201849103,
-0.1864292324,
0.0265048221,
-0.2668230236,
0.1215892881,
-0.0974667966,
0.3548803627,
0.0250715241,
0.29824844,
-0.0434096083,
-0.134286955,
-0.1364928782,
0.1130338758,
0.1991305053,
-0.0183562301,
-0.0332276635,
0.3735995293,
0.1696553379,
-0.2558123469,
0.2703851759,
0.0274355933,
-0.2911649942,
0.0742890537,
0.2565268874,
0.1025480479,
0.2148577273,
0.1469195634,
0.2874315977,
0.113722235,
0.0473640263,
0.0363877751,
0.2269978672,
0.2227708697,
0.1510109603,
-0.3832802773,
-0.2247339934,
0.4746956825,
-0.2865242362,
0.014801085,
0.2330477834,
-0.1075508744,
-0.331222415,
0.6692475677,
-0.0216855183,
1.0043594837,
0.1279560924,
0.2349966168,
0.3570193052,
0.0152031705,
0.4015215635,
0.0753976032,
0.0922101289,
-0.4636112452,
0.1191377938,
-0.0363175049,
-0.0786610544,
0.1102758721,
0.2392286956,
-0.1544450521,
0.1322178394,
-0.0604647323,
-0.2274058461,
0.1711774468,
0.4276745617,
0.1397118866,
-0.4334683418,
-0.2043322772,
0.0076697506,
-0.2661411166,
0.5549463034,
-0.0314721502,
-0.2316811532,
-0.0094626993,
-0.3618115187,
-0.2293411493,
0.3019568622,
-0.099499397,
0.3744978905,
0.6007330418,
-0.1288868636,
0.4571171403,
0.1026241481,
0.2113720775,
0.0299901366,
-0.2630378306,
0.192215085,
-0.0337363407,
-0.0982527658,
0.0589330681,
0.0264071897,
0.3566519916,
0.0146896988,
-0.2336517572,
-0.0800438747,
-0.1291652173,
-0.2312930822,
-0.0259568244,
-0.1611710489,
0.0190383084,
-0.1472083479,
-0.3155694902,
0.0277753994,
0.0255548507,
-0.164984107,
0.0687859356,
0.0484747514,
-0.244903475,
-0.2078677714,
0.3131468594,
-0.2475469112,
-0.0739669353,
0.481036067,
-0.0992322415,
-0.0194899663,
0.6444161534,
0.3653109074,
-0.2062574923,
-0.1943456829,
-0.08366175,
0.1650458127,
-0.8099957108,
-0.0325144939,
-0.0016930513,
-0.2046626508,
-0.0655460656,
0.5908413529,
0.2419276386,
0.0629567951,
-0.1193099916,
-0.3331794143,
-0.3890790641,
0.2196121663,
-0.2222794145,
0.1357289255,
0.0955775231,
0.3911069334,
-0.034744788,
0.1781406701,
-0.2182644904,
-0.0300162248,
-0.2331593335,
0.2521229982,
0.2451466024,
-0.4850664437,
0.225737676,
0.0978111103,
0.0273079947,
0.1323978305,
-0.313706696,
-0.1233846918,
-0.4189224541,
0.1245739684,
0.2815956175,
-0.286385715,
-0.09428408,
0.1666616797,
0.1116884649,
-0.1594537795,
0.0924318209,
-0.0615468919,
-0.1313706636,
0.2736980915,
0.1938244104,
0.053395085,
-0.076833427,
0.0873869583,
-0.1187308431,
-0.3134465218,
0.2363166809,
0.2007079422,
-0.196581617,
0.0685269758,
-0.2107043564,
0.0794221237,
0.312517643,
0.1303778291,
0.0038125012,
-0.3625320494,
0.1793458909,
-0.036080651,
0.076100111,
0.0311516132,
-0.1425474137,
-0.2864572406,
0.4078353047,
0.1651120037,
-0.2894611657,
-0.2251745909,
0.0795920193,
0.0420676693,
-0.0174696818,
0.0502239019,
0.0443149656,
0.0742177814,
0.0834892243,
0.0840105265,
0.4601435065,
-0.2134632766,
0.2142042965,
0.3518062532,
-0.1893409491,
0.1552313864,
0.464518398,
0.0193827413,
0.139503032,
0.2465682477,
-0.2924770415,
0.2774635255,
-0.2668266892,
0.2905974984,
0.0077865031,
-0.5513209105,
-0.1049559787,
0.2481088489,
0.1737881899,
0.1777733415,
-0.0665572733,
0.4451243579,
-0.1363946497,
-0.1493588388,
-0.048861526,
0.3371510208,
-0.271699369,
-0.3933607042,
-0.1717555225,
0.1165925413,
-0.3949106932,
-0.0003244951,
-0.1727863997,
-0.0516138263,
0.1566631049,
0.1176722944,
-0.1058414951,
-0.3330610394,
-0.051996097,
0.0413667113,
-0.1215797663,
-0.2785531282,
0.3481363356,
-0.0297524109,
0.0341185592,
-0.1454358697,
0.1331012547,
0.286752224,
0.2867029607,
-0.3025154471,
-0.1334830523,
-0.1729677469,
0.1225763708,
-0.0304383375,
0.349045366,
0.1919638515,
0.468568027,
0.3146243691,
0.0043296888,
-0.1563209593,
-0.1256594807,
0.1083557084,
0.1264603287,
-0.0214297473,
0.5825882554,
-0.1685423255,
-0.0980616957,
-0.2602687478,
0.1421923488,
-0.4087180495,
0.3401705623,
0.2832736075,
0.0236684456,
0.1059989557,
-0.0358007178,
0.0434373021,
0.1233691871,
0.5698879361,
0.3689297736,
0.1243643463,
-0.5338955522,
-0.0682754815,
-0.4477843344,
0.0827897191,
0.1939661652,
0.1000258327,
0.0117925256,
0.0065875128,
0.0200606864,
-0.0266445745,
-0.3352199495,
0.3204244375,
0.0809504017,
0.0154611086,
-0.0622801371,
0.1577233225,
-0.2189986855,
0.0669479817,
0.0925960243,
-0.466753602,
0.0259803142,
0.1531937569,
-0.0335599408,
-0.2191779464,
-0.1638471782,
0.0668926686,
0.0369469523,
-0.0208345167,
0.1326973438,
0.726259768,
-0.0420240089,
-0.1220181286,
-0.4175514579,
-0.4629764259,
-0.1109925285,
0.0634772927,
-0.1209396943,
0.6496317387,
-0.0621026605,
0.3009476364,
-0.4772536159,
0.1635540277,
-0.1076171473,
-0.1320490241,
-0.3146500587,
0.1489405632,
-0.245984301,
-0.0108385943,
0.1265087277,
0.3707411885,
0.122842133,
0.0308369026,
-0.3600006402,
-0.2014895231,
0.6447339654,
-0.7064893842,
-0.1418031305,
-0.0910699964,
0.0048359409,
0.2531349659,
-0.2619391382,
-0.7180588841,
-0.0282274783,
0.1826344132,
-0.0192401353,
-0.1775763333,
0.0684964806,
0.2308870703,
-0.0765094459,
-0.1325626224,
0.6567322612,
0.1076975167,
-0.220926851,
0.1161852181,
-0.1894452274
] |
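A minimal sketch of the kind of guard that would have surfaced the root cause of the pickling error in the issue above more directly. It assumes, as the closing comment suggests, that the `typing.Union` pickling failure goes away on Python >= 3.6.9; the model name, data file and `max_length` below are placeholders, not values taken from the report.

```
import sys

from nlp import load_dataset          # nlp 0.4.0, as in the report
from transformers import AutoTokenizer

# The issue was closed with "working on >= 3.6.9", i.e. newer Python patch releases
# no longer hit the "Can't pickle typing.Union[...]" failure seen in the traceback.
if sys.version_info < (3, 6, 9):
    raise RuntimeError(
        "Python %s.%s.%s cannot reliably pickle typing.Union; upgrade to >= 3.6.9"
        % sys.version_info[:3]
    )

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model
dataset = load_dataset("text", data_files="train.txt", split="train")
dataset = dataset.map(
    # hashing this lambda (and the tokenizer it closes over) is the step that crashed
    lambda ex: tokenizer(ex["text"], add_special_tokens=True, truncation=True, max_length=512),
    batched=True,
)
dataset.set_format(type="torch", columns=["input_ids"])
```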
https://github.com/huggingface/datasets/issues/598 | The current version of the package on github has an error when loading dataset | Thanks for reporting!
Which version of transformers are you using?
It looks like it doesn't have the PreTrainedTokenizerBase class | Instead of downloading the package from pip, downloading the version from source will result in an error when loading dataset (the pip version is completely fine):
To recreate the error:
First, installing nlp directly from source:
```
git clone https://github.com/huggingface/nlp.git
cd nlp
pip install -e .
```
Then run:
```
from nlp import load_dataset
dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
```
will give error:
```
>>> dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
Checking /home/zeyuy/.cache/huggingface/datasets/84a754b488511b109e2904672d809c041008416ae74e38f9ee0c80a8dffa1383.2e21f48d63b5572d19c97e441fbb802257cf6a4c03fbc5ed8fae3d2c2273f59e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Found script file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.py
Found dataset infos file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/dataset_infos.json to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.json
Loading Dataset Infos from /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Overwrite dataset info from restored data version.
Loading Dataset info from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Reusing dataset wikitext (/home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d)
Constructing Dataset for split train, from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/load.py", line 600, in load_dataset
ds = builder_instance.as_dataset(split=split, ignore_verifications=ignore_verifications)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 611, in as_dataset
datasets = utils.map_nested(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 216, in map_nested
return function(data_struct)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 631, in _build_single_dataset
ds = self._as_dataset(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 704, in _as_dataset
return Dataset(**dataset_kwargs)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/arrow_dataset.py", line 188, in __init__
self._fingerprint = generate_fingerprint(self)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 91, in generate_fingerprint
hasher.update(key)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 361, in dumps
with _no_cache_fields(obj):
File "/home/zeyuy/miniconda3/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 348, in _no_cache_fields
if isinstance(obj, tr.PreTrainedTokenizerBase) and hasattr(obj, "cache") and isinstance(obj.cache, dict):
AttributeError: module 'transformers' has no attribute 'PreTrainedTokenizerBase'
```
| 21 | The current version of the package on github has an error when loading dataset
Instead of downloading the package from pip, downloading the version from source will result in an error when loading dataset (the pip version is completely fine):
To recreate the error:
First, installing nlp directly from source:
```
git clone https://github.com/huggingface/nlp.git
cd nlp
pip install -e .
```
Then run:
```
from nlp import load_dataset
dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
```
will give error:
```
>>> dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
Checking /home/zeyuy/.cache/huggingface/datasets/84a754b488511b109e2904672d809c041008416ae74e38f9ee0c80a8dffa1383.2e21f48d63b5572d19c97e441fbb802257cf6a4c03fbc5ed8fae3d2c2273f59e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Found script file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.py
Found dataset infos file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/dataset_infos.json to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.json
Loading Dataset Infos from /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Overwrite dataset info from restored data version.
Loading Dataset info from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Reusing dataset wikitext (/home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d)
Constructing Dataset for split train, from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/load.py", line 600, in load_dataset
ds = builder_instance.as_dataset(split=split, ignore_verifications=ignore_verifications)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 611, in as_dataset
datasets = utils.map_nested(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 216, in map_nested
return function(data_struct)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 631, in _build_single_dataset
ds = self._as_dataset(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 704, in _as_dataset
return Dataset(**dataset_kwargs)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/arrow_dataset.py", line 188, in __init__
self._fingerprint = generate_fingerprint(self)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 91, in generate_fingerprint
hasher.update(key)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 361, in dumps
with _no_cache_fields(obj):
File "/home/zeyuy/miniconda3/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 348, in _no_cache_fields
if isinstance(obj, tr.PreTrainedTokenizerBase) and hasattr(obj, "cache") and isinstance(obj.cache, dict):
AttributeError: module 'transformers' has no attribute 'PreTrainedTokenizerBase'
```
Thanks for reporting!
Which version of transformers are you using?
It looks like it doesn't have the PreTrainedTokenizerBase class | [
-0.2237435281,
-0.2012972683,
-0.0425651111,
0.0305602606,
0.0355135016,
-0.0190251619,
-0.1124797463,
0.245201692,
-0.0017795414,
-0.2171768695,
0.2775868773,
0.3542348444,
-0.2464633286,
0.2480498254,
0.3240588605,
-0.2881515622,
0.0670980141,
0.3065454066,
-0.2375207841,
0.0527713001,
-0.0257974789,
0.4857281744,
-0.2306239903,
0.3760010302,
-0.0834266543,
0.0187895242,
0.0690762997,
0.1313542128,
-0.2909764647,
-0.5385345221,
0.6074132323,
-0.0616440214,
0.0506869964,
0.1241365448,
-0.0001099683,
0.0639229566,
0.4524998367,
0.0468346104,
-0.2115154266,
-0.5344375968,
-0.0258459598,
-0.4982025325,
0.2533907592,
-0.0385188013,
-0.033229176,
-0.1539615244,
0.123736158,
0.2145824581,
0.5887582302,
0.2153976709,
0.2569970191,
0.3765674829,
0.3491863906,
-0.0690732598,
-0.0353674963,
-0.2311027199,
-0.057744287,
0.1733410358,
0.0412600413,
-0.0806533247,
0.2716606557,
0.1918047965,
0.059941411,
-0.0913043916,
0.4098451138,
-0.1614539325,
-0.1533887684,
-0.1004510373,
-0.1068111956,
0.2216579616,
0.2371094972,
-0.3243215382,
-0.407365799,
-0.3286473751,
0.1875767708,
-0.2094339132,
0.4702660441,
0.0190893486,
-0.1861206293,
0.1364383847,
-0.2939094603,
-0.3428579867,
-0.1923739612,
0.2609533966,
0.1397569776,
0.3448454738,
-0.1708794087,
0.1244882196,
0.2748578191,
-0.1763909459,
-0.1969624907,
0.1317259967,
-0.1033228412,
0.3190253079,
-0.1158775613,
-0.0795880333,
-0.1095688939,
0.1889614463,
0.0918164998,
0.0788193196,
-0.197187081,
-0.1108214557,
-0.0314430296,
0.1409584284,
0.2720893621,
0.271759063,
0.177199617,
-0.1189833283,
-0.003281977,
0.1774202883,
0.3424130082,
0.0457537211,
-0.0453366712,
-0.3309017718,
-0.4002842605,
-0.1914319098,
0.1130491346,
-0.1737481505,
-0.3322198093,
-0.0813275278,
0.0919059813,
0.0365610607,
0.1109027863,
0.0576234497,
-0.1283794791,
0.2926148474,
0.023491554,
0.19237867,
-0.3091998696,
0.0621639565,
-0.168789953,
0.1422374994,
-0.2686136663,
-0.1336474419,
0.1294742227,
-0.2735539675,
0.5659919977,
-0.0004659779,
-0.0992646664,
0.0367270485,
-0.2505792379,
0.2368857116,
-0.3236500621,
0.1910730153,
-0.0812206492,
0.2065938413,
-0.0171953142,
-0.0790163353,
-0.2939081788,
-0.1111740172,
-0.3565988541,
-0.2549516857,
-0.1689946949,
0.2119120359,
-0.2016154379,
-0.2702844739,
-0.2804157138,
0.0198177397,
-0.065805316,
-0.3608135283,
-0.0046904907,
-0.2430275679,
-0.1055392772,
-0.0930549204,
0.3117879033,
0.2317884713,
0.0926559567,
-0.3515031934,
-0.235999465,
-0.3532509208,
0.0857137442,
0.2320383191,
-0.2740489542,
0.2068705559,
-0.188013345,
0.1380043626,
0.623500824,
-0.5698124766,
-0.5247966647,
0.2389449328,
-0.1837095618,
0.1494829357,
-0.106520474,
-0.0558203645,
-0.2375169396,
-0.1302373409,
0.0816134661,
0.2938148677,
0.0514479578,
-0.1161219031,
-0.1398785263,
-0.4660893679,
0.2725194395,
0.0642340481,
-0.0643579215,
-0.0080759302,
-0.0887140632,
0.4918845296,
0.1930050105,
0.1258877218,
-0.0510848574,
0.1976207495,
0.0127383079,
0.0237860288,
-0.3512736559,
-0.261854738,
-0.4162874222,
0.153845191,
-0.2101593614,
-0.0290677473,
-0.1326702833,
-0.0577357598,
-0.307828784,
-0.0738362297,
-0.1671572179,
-0.2715991437,
0.1607210338,
0.004829362,
0.2955423892,
0.1357760727,
-0.1708002388,
0.3214162588,
0.0418823585,
0.1781066954,
-0.4866743088,
0.0572807491,
-0.1335321367,
-0.0460232757,
0.1729027331,
0.4731178582,
0.1276734173,
-0.2077902108,
0.0151711889,
0.4866742492,
-0.3806296289,
0.2787310779,
-0.0023202333,
-0.169099614,
0.1775315255,
-0.3081973195,
-0.1312316507,
0.0708649829,
-0.0935906321,
0.0851687565,
-0.0395265073,
0.0182777457,
-0.0922876596,
0.0961841941,
0.2280827314,
0.1834035218,
0.3326525092,
-0.0641416982,
-0.024161268,
-0.1820687354,
0.2553634346,
-0.064981021,
0.1488991976,
-0.1461169124,
-0.0508807525,
-0.0003215298,
0.4923802018,
0.1942919195,
0.0343223512,
0.2010838836,
0.0021547731,
-0.0955142453,
0.0481096059,
0.2638821006,
0.2911491692,
0.2004128098,
-0.0828937292,
0.1889246404,
-0.2043617368,
-0.1494051665,
0.2577416897,
0.1075861827,
-0.0307696518,
0.1811572313,
-0.1337857395,
0.0114307664,
-0.2375520617,
0.0948072374,
-0.0222692303,
0.2971347868,
-0.0851934776,
-0.0681525692,
-0.4806862772,
-0.3239580095,
-0.3728137314,
-0.2951775491,
-0.4344967604,
-0.1589564681,
-0.0362029672,
0.1149726361,
0.0485618114,
0.1649898589,
-0.0553436875,
-0.0526931286,
-0.2266287804,
-0.0447862558,
-0.0267590508,
-0.0484158769,
-0.3857757151,
0.0981594324,
0.1967357248,
0.2448917925,
0.4569404125,
-0.2492744029,
-0.1252939254,
-0.0644191578,
-0.2730084658,
0.1938120723,
-0.2926282883,
0.2592027187,
0.3556086719,
-0.051629059,
-0.2703352869,
-0.335970521,
0.3509817421,
-0.4124707282,
-0.1979118884,
-0.0850105658,
0.0744216889,
-0.0316738449,
-0.2016800195,
-0.308390826,
-0.2766190171,
-0.350523293,
0.3111944199,
0.0806485638,
0.0984397754,
0.4184325039,
-0.0125569142,
0.2687785923,
-0.1211046576,
0.0469820872,
-0.1270651519,
-0.2123742998,
0.0569492877,
-0.3140260279,
-0.3949982226,
0.0235567912,
0.228068158,
0.3154692054,
0.0841655731,
-0.4360539615,
-0.2520711124,
0.0119623151,
0.0624023974,
0.0597274639,
0.3059400618,
0.3308090866,
-0.0383230038,
-0.0712784156,
-0.1615496129,
-0.0299326852,
0.0604028702,
-0.1182853431,
0.3523582518,
0.1799402833,
0.4678027034,
-0.0238170736,
0.6063984036,
0.497233808,
0.0376835614,
0.3265111148,
-0.055383496,
0.3800788522,
-0.1730224043,
-0.2814177275,
0.1348730326,
0.0858372301,
0.5628988743,
0.2523514032,
0.1252412945,
0.1085705608,
-0.2215528637,
-0.1356803477,
-0.2715798616,
-0.1474338174,
-0.1767601222,
0.0646157563,
0.2577385008,
0.1145812944,
-0.0049042404,
-0.0982686281,
-0.2549662292,
0.1954673529,
0.2483880371,
-0.070077464,
0.256475538,
-0.3562459946,
-0.2916221023,
-0.6304401755,
0.0574147999,
0.1644299775,
0.5169091225,
0.0251790434,
0.1565308869,
0.1662593484,
-0.1031454057,
0.3516109884,
0.000263501,
0.0848304629,
0.0737705529,
0.1700430959,
-0.5438057184,
-0.0358664393,
-0.0565551184,
0.3793545365,
0.4606743455,
0.0374551378,
-0.489433527,
-0.398358494,
0.3138809204,
0.1002968773,
-0.1521900147,
-0.0328312479,
-0.1738866866,
-0.2557459474,
-0.2024968565,
-0.202711314,
-0.0166214481,
0.3583608568,
0.0794929862,
0.2841626406,
-0.0602860525,
-0.0682128444,
-0.0234563835,
0.018086914,
0.3587937653,
0.087280564,
0.0495739505,
0.172274068,
0.1227549016,
-0.1981004179,
0.740023911,
0.0505525544,
-0.0480799079,
-0.1474428624,
0.2945863307,
0.1420303285,
0.1440688074,
-0.1842709184,
-0.0590411909,
-0.0059903655,
0.1297979653,
-0.0680280626,
0.111773476,
0.5068563223,
0.0243970901,
-0.0904783756,
-0.2858388722,
0.4987759292,
0.0972945243,
-0.0226619914,
0.2015022337,
0.436865747,
-0.2149178982,
-0.069213286,
-0.1267644763,
0.6320810318,
0.2031876892,
0.1952811778,
0.2906044722,
-0.0395302922,
0.7745856643,
-0.0314161181,
0.0939441174,
-0.2058900446,
0.1828344315,
0.0607477576,
-0.0090649202,
0.0709610805,
0.0198443085,
-0.235997811,
0.3186233938,
0.284871161,
0.0379711017,
0.0395125523,
0.4954657257,
-0.1353969276,
-0.1887604296,
-0.4930030107,
0.1709432602,
-0.1057603732,
0.4444634616,
-0.2025645822,
-0.1389370263,
0.0988434404,
-0.3229314089,
-0.3750477135,
0.1910779178,
0.0030197091,
0.3070126772,
0.257869184,
-0.1358373463,
0.3146024048,
0.1584175527,
0.3754768372,
0.0336968191,
-0.3622068763,
0.0785574317,
-0.3673883975,
-0.3306551874,
0.0866527259,
0.3278809786,
0.206379205,
-0.2947438359,
-0.1042837873,
-0.0982756615,
-0.0447292179,
-0.1725845933,
-0.0250438899,
-0.065838486,
0.0086646359,
-0.0605948456,
-0.226276055,
0.1407764256,
0.1795839667,
-0.2310498953,
0.1475729644,
0.1545288265,
-0.1493361592,
0.0412915647,
0.1740120947,
0.0057265162,
0.0376492925,
0.4061413407,
0.0011240887,
-0.1326887757,
0.5396637321,
-0.0424131081,
-0.0977003574,
-0.2147789001,
-0.1864569485,
0.0674130544,
-0.4528859556,
-0.0735907257,
0.1367860883,
-0.0382697433,
0.106043607,
0.398059994,
0.215078935,
-0.001826724,
0.0810617507,
-0.3938004375,
-0.1874093562,
0.3194370866,
0.0465021878,
0.0337695144,
0.1690133661,
0.2051026374,
0.2865498364,
0.0706636235,
-0.2923620343,
0.0620421991,
-0.0556401238,
0.0586600415,
0.2769286036,
-0.0889651775,
0.1965152621,
-0.0237098522,
0.0793572292,
0.0398357846,
-0.3567084372,
-0.1860346645,
-0.1730071008,
0.0893268064,
0.1075414643,
-0.316268146,
-0.0202369466,
-0.1181699261,
-0.0508418195,
0.0308044814,
0.1071536094,
-0.0029970855,
0.016107142,
0.1494887471,
0.1483575404,
0.229403168,
-0.0439707339,
0.07121104,
-0.0725781769,
0.2434770316,
0.0977991968,
0.0999879092,
0.0849314034,
-0.0207597911,
-0.1256195009,
-0.0588641241,
0.0226384345,
0.1156223193,
0.0954878032,
-0.4526156187,
0.1014504358,
0.0920292735,
0.322809875,
0.4633260369,
-0.1390681416,
-0.0120361671,
0.5312453508,
0.2185305059,
-0.4312899113,
-0.1404995918,
0.2825393379,
0.0634501651,
-0.0981673971,
-0.0459190235,
0.2540197372,
-0.2216083705,
0.1616757512,
0.0993236452,
0.2928044796,
-0.1385904104,
0.3533585668,
0.2495935857,
0.0416502655,
-0.2093860209,
-0.0784087405,
0.1208258122,
0.153065145,
0.2093197107,
-0.2409857661,
0.0943253636,
-0.2601040304,
0.2324688584,
0.0799042284,
-0.1778873503,
-0.2385255694,
0.106000416,
0.2365899831,
0.1821846813,
0.0217417758,
0.517604053,
-0.3201268613,
-0.2440443784,
-0.2050428689,
0.4098782837,
-0.0548686199,
-0.3036976457,
-0.0718474165,
0.0295324475,
-0.1447573453,
-0.0243564136,
-0.0471462235,
-0.2858161032,
0.1595942974,
0.1387341619,
-0.183124736,
-0.400831908,
-0.186846137,
0.1542173922,
0.1049751788,
0.0005532131,
0.4406987429,
0.0576149672,
-0.1394476295,
-0.0410799719,
0.5153030157,
0.4115841389,
0.2033143938,
0.0158383101,
-0.1346998811,
-0.0550060347,
0.1704123914,
-0.1088862866,
0.2642564178,
0.1135404482,
0.2260671109,
0.2382230759,
0.12211667,
-0.1751365364,
-0.0902134553,
-0.0171185266,
-0.027576752,
-0.2278971076,
0.5078482032,
-0.1254353225,
0.0195975378,
-0.2496705353,
0.3484073877,
-0.4911294281,
0.2872688174,
0.0201305337,
0.1537851691,
-0.0028903671,
-0.3591858745,
0.1024304554,
0.0927830338,
0.6194454432,
0.2813882232,
-0.0012828233,
0.0550917052,
-0.4740493894,
-0.8101555109,
-0.0658306479,
-0.2427225262,
-0.1460844278,
0.2658504248,
-0.0475194715,
-0.4228573143,
0.2343582809,
0.0174474642,
0.1221749038,
-0.0555880293,
-0.2802508771,
-0.1547044516,
0.0316941328,
0.0309368633,
-0.032714285,
0.1080193371,
-0.1543243527,
0.0754258931,
-0.2737427354,
-0.0130390041,
0.0053770114,
0.3522350192,
-0.2582992613,
-0.2497132868,
0.2114717066,
0.2341265678,
0.0735752285,
0.0139635801,
-0.1919139028,
-0.4351012409,
-0.2454767227,
-0.3566770554,
0.3163404465,
-0.1414400935,
0.3165802956,
-0.2512360513,
-0.0709539205,
-0.3026756644,
0.0841599256,
0.0305440575,
0.0684218407,
-0.1361842453,
0.088837795,
-0.1519756615,
0.107005775,
0.2659898996,
0.4927130044,
-0.1406166852,
0.137858212,
-0.3419420719,
-0.3935499489,
0.4931778014,
-0.5361912847,
-0.0375506543,
0.0461832397,
0.2863284349,
0.1845116913,
-0.3383825719,
-0.6405608654,
0.0665392578,
0.1711781621,
0.1276187599,
0.0320477635,
0.2320661247,
-0.1947587132,
-0.1220138595,
0.062313199,
0.3027657568,
0.0421601646,
-0.3858789504,
0.3996356726,
-0.175116986
] |
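Since the crash above comes from nlp's fingerprinting code probing `transformers.PreTrainedTokenizerBase`, which only exists in later transformers releases, a small compatibility check makes the mismatch obvious up front. This is an illustrative guard under that assumption, not part of nlp or transformers; the follow-up report below confirms that simply upgrading transformers resolves it.

```
import transformers

# nlp does `isinstance(obj, tr.PreTrainedTokenizerBase)`, so an old transformers install
# (e.g. 2.9) fails with the AttributeError shown in the traceback above.
if not hasattr(transformers, "PreTrainedTokenizerBase"):
    raise ImportError(
        f"transformers {transformers.__version__} predates PreTrainedTokenizerBase; "
        "run `pip install -U transformers` before using nlp from source"
    )

from nlp import load_dataset  # safe to use once the check passes

dataset = load_dataset("wikitext", "wikitext-2-v1", split="train")
```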
https://github.com/huggingface/datasets/issues/598 | The current version of the package on github has an error when loading dataset | I was using transformers 2.9. I switched to the latest transformers package and everything works just fine!
Thanks for helping! I should look more carefully next time. I didn't realize the data loading part requires using the tokenizer.
| Instead of downloading the package from pip, downloading the version from source will result in an error when loading dataset (the pip version is completely fine):
To recreate the error:
First, installing nlp directly from source:
```
git clone https://github.com/huggingface/nlp.git
cd nlp
pip install -e .
```
Then run:
```
from nlp import load_dataset
dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
```
will give error:
```
>>> dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
Checking /home/zeyuy/.cache/huggingface/datasets/84a754b488511b109e2904672d809c041008416ae74e38f9ee0c80a8dffa1383.2e21f48d63b5572d19c97e441fbb802257cf6a4c03fbc5ed8fae3d2c2273f59e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Found script file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.py
Found dataset infos file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/dataset_infos.json to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.json
Loading Dataset Infos from /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Overwrite dataset info from restored data version.
Loading Dataset info from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Reusing dataset wikitext (/home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d)
Constructing Dataset for split train, from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/load.py", line 600, in load_dataset
ds = builder_instance.as_dataset(split=split, ignore_verifications=ignore_verifications)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 611, in as_dataset
datasets = utils.map_nested(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 216, in map_nested
return function(data_struct)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 631, in _build_single_dataset
ds = self._as_dataset(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 704, in _as_dataset
return Dataset(**dataset_kwargs)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/arrow_dataset.py", line 188, in __init__
self._fingerprint = generate_fingerprint(self)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 91, in generate_fingerprint
hasher.update(key)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 361, in dumps
with _no_cache_fields(obj):
File "/home/zeyuy/miniconda3/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 348, in _no_cache_fields
if isinstance(obj, tr.PreTrainedTokenizerBase) and hasattr(obj, "cache") and isinstance(obj.cache, dict):
AttributeError: module 'transformers' has no attribute 'PreTrainedTokenizerBase'
```
| 36 | The current version of the package on github has an error when loading dataset
Instead of downloading the package from pip, downloading the version from source will result in an error when loading dataset (the pip version is completely fine):
To recreate the error:
First, installing nlp directly from source:
```
git clone https://github.com/huggingface/nlp.git
cd nlp
pip install -e .
```
Then run:
```
from nlp import load_dataset
dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
```
will give error:
```
>>> dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
Checking /home/zeyuy/.cache/huggingface/datasets/84a754b488511b109e2904672d809c041008416ae74e38f9ee0c80a8dffa1383.2e21f48d63b5572d19c97e441fbb802257cf6a4c03fbc5ed8fae3d2c2273f59e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Found script file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.py
Found dataset infos file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/dataset_infos.json to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.json
Loading Dataset Infos from /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Overwrite dataset info from restored data version.
Loading Dataset info from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Reusing dataset wikitext (/home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d)
Constructing Dataset for split train, from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/load.py", line 600, in load_dataset
ds = builder_instance.as_dataset(split=split, ignore_verifications=ignore_verifications)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 611, in as_dataset
datasets = utils.map_nested(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 216, in map_nested
return function(data_struct)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 631, in _build_single_dataset
ds = self._as_dataset(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 704, in _as_dataset
return Dataset(**dataset_kwargs)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/arrow_dataset.py", line 188, in __init__
self._fingerprint = generate_fingerprint(self)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 91, in generate_fingerprint
hasher.update(key)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 361, in dumps
with _no_cache_fields(obj):
File "/home/zeyuy/miniconda3/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 348, in _no_cache_fields
if isinstance(obj, tr.PreTrainedTokenizerBase) and hasattr(obj, "cache") and isinstance(obj.cache, dict):
AttributeError: module 'transformers' has no attribute 'PreTrainedTokenizerBase'
```
I was using transformers 2.9. I switched to the latest transformers package and everything works just fine!
Thanks for helping! I should look more carefully next time. I didn't realize the data loading part requires using the tokenizer.
| [
-0.2237435281,
-0.2012972683,
-0.0425651111,
0.0305602606,
0.0355135016,
-0.0190251619,
-0.1124797463,
0.245201692,
-0.0017795414,
-0.2171768695,
0.2775868773,
0.3542348444,
-0.2464633286,
0.2480498254,
0.3240588605,
-0.2881515622,
0.0670980141,
0.3065454066,
-0.2375207841,
0.0527713001,
-0.0257974789,
0.4857281744,
-0.2306239903,
0.3760010302,
-0.0834266543,
0.0187895242,
0.0690762997,
0.1313542128,
-0.2909764647,
-0.5385345221,
0.6074132323,
-0.0616440214,
0.0506869964,
0.1241365448,
-0.0001099683,
0.0639229566,
0.4524998367,
0.0468346104,
-0.2115154266,
-0.5344375968,
-0.0258459598,
-0.4982025325,
0.2533907592,
-0.0385188013,
-0.033229176,
-0.1539615244,
0.123736158,
0.2145824581,
0.5887582302,
0.2153976709,
0.2569970191,
0.3765674829,
0.3491863906,
-0.0690732598,
-0.0353674963,
-0.2311027199,
-0.057744287,
0.1733410358,
0.0412600413,
-0.0806533247,
0.2716606557,
0.1918047965,
0.059941411,
-0.0913043916,
0.4098451138,
-0.1614539325,
-0.1533887684,
-0.1004510373,
-0.1068111956,
0.2216579616,
0.2371094972,
-0.3243215382,
-0.407365799,
-0.3286473751,
0.1875767708,
-0.2094339132,
0.4702660441,
0.0190893486,
-0.1861206293,
0.1364383847,
-0.2939094603,
-0.3428579867,
-0.1923739612,
0.2609533966,
0.1397569776,
0.3448454738,
-0.1708794087,
0.1244882196,
0.2748578191,
-0.1763909459,
-0.1969624907,
0.1317259967,
-0.1033228412,
0.3190253079,
-0.1158775613,
-0.0795880333,
-0.1095688939,
0.1889614463,
0.0918164998,
0.0788193196,
-0.197187081,
-0.1108214557,
-0.0314430296,
0.1409584284,
0.2720893621,
0.271759063,
0.177199617,
-0.1189833283,
-0.003281977,
0.1774202883,
0.3424130082,
0.0457537211,
-0.0453366712,
-0.3309017718,
-0.4002842605,
-0.1914319098,
0.1130491346,
-0.1737481505,
-0.3322198093,
-0.0813275278,
0.0919059813,
0.0365610607,
0.1109027863,
0.0576234497,
-0.1283794791,
0.2926148474,
0.023491554,
0.19237867,
-0.3091998696,
0.0621639565,
-0.168789953,
0.1422374994,
-0.2686136663,
-0.1336474419,
0.1294742227,
-0.2735539675,
0.5659919977,
-0.0004659779,
-0.0992646664,
0.0367270485,
-0.2505792379,
0.2368857116,
-0.3236500621,
0.1910730153,
-0.0812206492,
0.2065938413,
-0.0171953142,
-0.0790163353,
-0.2939081788,
-0.1111740172,
-0.3565988541,
-0.2549516857,
-0.1689946949,
0.2119120359,
-0.2016154379,
-0.2702844739,
-0.2804157138,
0.0198177397,
-0.065805316,
-0.3608135283,
-0.0046904907,
-0.2430275679,
-0.1055392772,
-0.0930549204,
0.3117879033,
0.2317884713,
0.0926559567,
-0.3515031934,
-0.235999465,
-0.3532509208,
0.0857137442,
0.2320383191,
-0.2740489542,
0.2068705559,
-0.188013345,
0.1380043626,
0.623500824,
-0.5698124766,
-0.5247966647,
0.2389449328,
-0.1837095618,
0.1494829357,
-0.106520474,
-0.0558203645,
-0.2375169396,
-0.1302373409,
0.0816134661,
0.2938148677,
0.0514479578,
-0.1161219031,
-0.1398785263,
-0.4660893679,
0.2725194395,
0.0642340481,
-0.0643579215,
-0.0080759302,
-0.0887140632,
0.4918845296,
0.1930050105,
0.1258877218,
-0.0510848574,
0.1976207495,
0.0127383079,
0.0237860288,
-0.3512736559,
-0.261854738,
-0.4162874222,
0.153845191,
-0.2101593614,
-0.0290677473,
-0.1326702833,
-0.0577357598,
-0.307828784,
-0.0738362297,
-0.1671572179,
-0.2715991437,
0.1607210338,
0.004829362,
0.2955423892,
0.1357760727,
-0.1708002388,
0.3214162588,
0.0418823585,
0.1781066954,
-0.4866743088,
0.0572807491,
-0.1335321367,
-0.0460232757,
0.1729027331,
0.4731178582,
0.1276734173,
-0.2077902108,
0.0151711889,
0.4866742492,
-0.3806296289,
0.2787310779,
-0.0023202333,
-0.169099614,
0.1775315255,
-0.3081973195,
-0.1312316507,
0.0708649829,
-0.0935906321,
0.0851687565,
-0.0395265073,
0.0182777457,
-0.0922876596,
0.0961841941,
0.2280827314,
0.1834035218,
0.3326525092,
-0.0641416982,
-0.024161268,
-0.1820687354,
0.2553634346,
-0.064981021,
0.1488991976,
-0.1461169124,
-0.0508807525,
-0.0003215298,
0.4923802018,
0.1942919195,
0.0343223512,
0.2010838836,
0.0021547731,
-0.0955142453,
0.0481096059,
0.2638821006,
0.2911491692,
0.2004128098,
-0.0828937292,
0.1889246404,
-0.2043617368,
-0.1494051665,
0.2577416897,
0.1075861827,
-0.0307696518,
0.1811572313,
-0.1337857395,
0.0114307664,
-0.2375520617,
0.0948072374,
-0.0222692303,
0.2971347868,
-0.0851934776,
-0.0681525692,
-0.4806862772,
-0.3239580095,
-0.3728137314,
-0.2951775491,
-0.4344967604,
-0.1589564681,
-0.0362029672,
0.1149726361,
0.0485618114,
0.1649898589,
-0.0553436875,
-0.0526931286,
-0.2266287804,
-0.0447862558,
-0.0267590508,
-0.0484158769,
-0.3857757151,
0.0981594324,
0.1967357248,
0.2448917925,
0.4569404125,
-0.2492744029,
-0.1252939254,
-0.0644191578,
-0.2730084658,
0.1938120723,
-0.2926282883,
0.2592027187,
0.3556086719,
-0.051629059,
-0.2703352869,
-0.335970521,
0.3509817421,
-0.4124707282,
-0.1979118884,
-0.0850105658,
0.0744216889,
-0.0316738449,
-0.2016800195,
-0.308390826,
-0.2766190171,
-0.350523293,
0.3111944199,
0.0806485638,
0.0984397754,
0.4184325039,
-0.0125569142,
0.2687785923,
-0.1211046576,
0.0469820872,
-0.1270651519,
-0.2123742998,
0.0569492877,
-0.3140260279,
-0.3949982226,
0.0235567912,
0.228068158,
0.3154692054,
0.0841655731,
-0.4360539615,
-0.2520711124,
0.0119623151,
0.0624023974,
0.0597274639,
0.3059400618,
0.3308090866,
-0.0383230038,
-0.0712784156,
-0.1615496129,
-0.0299326852,
0.0604028702,
-0.1182853431,
0.3523582518,
0.1799402833,
0.4678027034,
-0.0238170736,
0.6063984036,
0.497233808,
0.0376835614,
0.3265111148,
-0.055383496,
0.3800788522,
-0.1730224043,
-0.2814177275,
0.1348730326,
0.0858372301,
0.5628988743,
0.2523514032,
0.1252412945,
0.1085705608,
-0.2215528637,
-0.1356803477,
-0.2715798616,
-0.1474338174,
-0.1767601222,
0.0646157563,
0.2577385008,
0.1145812944,
-0.0049042404,
-0.0982686281,
-0.2549662292,
0.1954673529,
0.2483880371,
-0.070077464,
0.256475538,
-0.3562459946,
-0.2916221023,
-0.6304401755,
0.0574147999,
0.1644299775,
0.5169091225,
0.0251790434,
0.1565308869,
0.1662593484,
-0.1031454057,
0.3516109884,
0.000263501,
0.0848304629,
0.0737705529,
0.1700430959,
-0.5438057184,
-0.0358664393,
-0.0565551184,
0.3793545365,
0.4606743455,
0.0374551378,
-0.489433527,
-0.398358494,
0.3138809204,
0.1002968773,
-0.1521900147,
-0.0328312479,
-0.1738866866,
-0.2557459474,
-0.2024968565,
-0.202711314,
-0.0166214481,
0.3583608568,
0.0794929862,
0.2841626406,
-0.0602860525,
-0.0682128444,
-0.0234563835,
0.018086914,
0.3587937653,
0.087280564,
0.0495739505,
0.172274068,
0.1227549016,
-0.1981004179,
0.740023911,
0.0505525544,
-0.0480799079,
-0.1474428624,
0.2945863307,
0.1420303285,
0.1440688074,
-0.1842709184,
-0.0590411909,
-0.0059903655,
0.1297979653,
-0.0680280626,
0.111773476,
0.5068563223,
0.0243970901,
-0.0904783756,
-0.2858388722,
0.4987759292,
0.0972945243,
-0.0226619914,
0.2015022337,
0.436865747,
-0.2149178982,
-0.069213286,
-0.1267644763,
0.6320810318,
0.2031876892,
0.1952811778,
0.2906044722,
-0.0395302922,
0.7745856643,
-0.0314161181,
0.0939441174,
-0.2058900446,
0.1828344315,
0.0607477576,
-0.0090649202,
0.0709610805,
0.0198443085,
-0.235997811,
0.3186233938,
0.284871161,
0.0379711017,
0.0395125523,
0.4954657257,
-0.1353969276,
-0.1887604296,
-0.4930030107,
0.1709432602,
-0.1057603732,
0.4444634616,
-0.2025645822,
-0.1389370263,
0.0988434404,
-0.3229314089,
-0.3750477135,
0.1910779178,
0.0030197091,
0.3070126772,
0.257869184,
-0.1358373463,
0.3146024048,
0.1584175527,
0.3754768372,
0.0336968191,
-0.3622068763,
0.0785574317,
-0.3673883975,
-0.3306551874,
0.0866527259,
0.3278809786,
0.206379205,
-0.2947438359,
-0.1042837873,
-0.0982756615,
-0.0447292179,
-0.1725845933,
-0.0250438899,
-0.065838486,
0.0086646359,
-0.0605948456,
-0.226276055,
0.1407764256,
0.1795839667,
-0.2310498953,
0.1475729644,
0.1545288265,
-0.1493361592,
0.0412915647,
0.1740120947,
0.0057265162,
0.0376492925,
0.4061413407,
0.0011240887,
-0.1326887757,
0.5396637321,
-0.0424131081,
-0.0977003574,
-0.2147789001,
-0.1864569485,
0.0674130544,
-0.4528859556,
-0.0735907257,
0.1367860883,
-0.0382697433,
0.106043607,
0.398059994,
0.215078935,
-0.001826724,
0.0810617507,
-0.3938004375,
-0.1874093562,
0.3194370866,
0.0465021878,
0.0337695144,
0.1690133661,
0.2051026374,
0.2865498364,
0.0706636235,
-0.2923620343,
0.0620421991,
-0.0556401238,
0.0586600415,
0.2769286036,
-0.0889651775,
0.1965152621,
-0.0237098522,
0.0793572292,
0.0398357846,
-0.3567084372,
-0.1860346645,
-0.1730071008,
0.0893268064,
0.1075414643,
-0.316268146,
-0.0202369466,
-0.1181699261,
-0.0508418195,
0.0308044814,
0.1071536094,
-0.0029970855,
0.016107142,
0.1494887471,
0.1483575404,
0.229403168,
-0.0439707339,
0.07121104,
-0.0725781769,
0.2434770316,
0.0977991968,
0.0999879092,
0.0849314034,
-0.0207597911,
-0.1256195009,
-0.0588641241,
0.0226384345,
0.1156223193,
0.0954878032,
-0.4526156187,
0.1014504358,
0.0920292735,
0.322809875,
0.4633260369,
-0.1390681416,
-0.0120361671,
0.5312453508,
0.2185305059,
-0.4312899113,
-0.1404995918,
0.2825393379,
0.0634501651,
-0.0981673971,
-0.0459190235,
0.2540197372,
-0.2216083705,
0.1616757512,
0.0993236452,
0.2928044796,
-0.1385904104,
0.3533585668,
0.2495935857,
0.0416502655,
-0.2093860209,
-0.0784087405,
0.1208258122,
0.153065145,
0.2093197107,
-0.2409857661,
0.0943253636,
-0.2601040304,
0.2324688584,
0.0799042284,
-0.1778873503,
-0.2385255694,
0.106000416,
0.2365899831,
0.1821846813,
0.0217417758,
0.517604053,
-0.3201268613,
-0.2440443784,
-0.2050428689,
0.4098782837,
-0.0548686199,
-0.3036976457,
-0.0718474165,
0.0295324475,
-0.1447573453,
-0.0243564136,
-0.0471462235,
-0.2858161032,
0.1595942974,
0.1387341619,
-0.183124736,
-0.400831908,
-0.186846137,
0.1542173922,
0.1049751788,
0.0005532131,
0.4406987429,
0.0576149672,
-0.1394476295,
-0.0410799719,
0.5153030157,
0.4115841389,
0.2033143938,
0.0158383101,
-0.1346998811,
-0.0550060347,
0.1704123914,
-0.1088862866,
0.2642564178,
0.1135404482,
0.2260671109,
0.2382230759,
0.12211667,
-0.1751365364,
-0.0902134553,
-0.0171185266,
-0.027576752,
-0.2278971076,
0.5078482032,
-0.1254353225,
0.0195975378,
-0.2496705353,
0.3484073877,
-0.4911294281,
0.2872688174,
0.0201305337,
0.1537851691,
-0.0028903671,
-0.3591858745,
0.1024304554,
0.0927830338,
0.6194454432,
0.2813882232,
-0.0012828233,
0.0550917052,
-0.4740493894,
-0.8101555109,
-0.0658306479,
-0.2427225262,
-0.1460844278,
0.2658504248,
-0.0475194715,
-0.4228573143,
0.2343582809,
0.0174474642,
0.1221749038,
-0.0555880293,
-0.2802508771,
-0.1547044516,
0.0316941328,
0.0309368633,
-0.032714285,
0.1080193371,
-0.1543243527,
0.0754258931,
-0.2737427354,
-0.0130390041,
0.0053770114,
0.3522350192,
-0.2582992613,
-0.2497132868,
0.2114717066,
0.2341265678,
0.0735752285,
0.0139635801,
-0.1919139028,
-0.4351012409,
-0.2454767227,
-0.3566770554,
0.3163404465,
-0.1414400935,
0.3165802956,
-0.2512360513,
-0.0709539205,
-0.3026756644,
0.0841599256,
0.0305440575,
0.0684218407,
-0.1361842453,
0.088837795,
-0.1519756615,
0.107005775,
0.2659898996,
0.4927130044,
-0.1406166852,
0.137858212,
-0.3419420719,
-0.3935499489,
0.4931778014,
-0.5361912847,
-0.0375506543,
0.0461832397,
0.2863284349,
0.1845116913,
-0.3383825719,
-0.6405608654,
0.0665392578,
0.1711781621,
0.1276187599,
0.0320477635,
0.2320661247,
-0.1947587132,
-0.1220138595,
0.062313199,
0.3027657568,
0.0421601646,
-0.3858789504,
0.3996356726,
-0.175116986
] |
https://github.com/huggingface/datasets/issues/598 | The current version of the package on github has an error when loading dataset | Yes, it shouldn't fail with older versions of transformers, since this is only a special feature to make caching more efficient when using transformers for tokenization.
We’ll update this. | Instead of downloading the package from pip, downloading the version from source will result in an error when loading dataset (the pip version is completely fine):
To recreate the error:
First, install nlp directly from source:
```
git clone https://github.com/huggingface/nlp.git
cd nlp
pip install -e .
```
Then run:
```
from nlp import load_dataset
dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
```
will give the following error:
```
>>> dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
Checking /home/zeyuy/.cache/huggingface/datasets/84a754b488511b109e2904672d809c041008416ae74e38f9ee0c80a8dffa1383.2e21f48d63b5572d19c97e441fbb802257cf6a4c03fbc5ed8fae3d2c2273f59e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Found script file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.py
Found dataset infos file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/dataset_infos.json to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.json
Loading Dataset Infos from /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Overwrite dataset info from restored data version.
Loading Dataset info from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Reusing dataset wikitext (/home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d)
Constructing Dataset for split train, from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/load.py", line 600, in load_dataset
ds = builder_instance.as_dataset(split=split, ignore_verifications=ignore_verifications)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 611, in as_dataset
datasets = utils.map_nested(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 216, in map_nested
return function(data_struct)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 631, in _build_single_dataset
ds = self._as_dataset(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 704, in _as_dataset
return Dataset(**dataset_kwargs)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/arrow_dataset.py", line 188, in __init__
self._fingerprint = generate_fingerprint(self)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 91, in generate_fingerprint
hasher.update(key)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 361, in dumps
with _no_cache_fields(obj):
File "/home/zeyuy/miniconda3/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 348, in _no_cache_fields
if isinstance(obj, tr.PreTrainedTokenizerBase) and hasattr(obj, "cache") and isinstance(obj.cache, dict):
AttributeError: module 'transformers' has no attribute 'PreTrainedTokenizerBase'
```
| 29 | The current version of the package on github has an error when loading dataset
Instead of downloading the package from pip, downloading the version from source will result in an error when loading a dataset (the pip version is completely fine):
To recreate the error:
First, install nlp directly from source:
```
git clone https://github.com/huggingface/nlp.git
cd nlp
pip install -e .
```
Then run:
```
from nlp import load_dataset
dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
```
will give the following error:
```
>>> dataset = load_dataset('wikitext', 'wikitext-2-v1',split = 'train')
Checking /home/zeyuy/.cache/huggingface/datasets/84a754b488511b109e2904672d809c041008416ae74e38f9ee0c80a8dffa1383.2e21f48d63b5572d19c97e441fbb802257cf6a4c03fbc5ed8fae3d2c2273f59e.py for additional imports.
Found main folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext
Found specific version folder for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Found script file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.py
Found dataset infos file from https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/dataset_infos.json to /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/dataset_infos.json
Found metadata file for dataset https://raw.githubusercontent.com/huggingface/nlp/0.4.0/datasets/wikitext/wikitext.py at /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d/wikitext.json
Loading Dataset Infos from /home/zeyuy/.cache/huggingface/modules/nlp_modules/datasets/wikitext/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Overwrite dataset info from restored data version.
Loading Dataset info from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Reusing dataset wikitext (/home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d)
Constructing Dataset for split train, from /home/zeyuy/.cache/huggingface/datasets/wikitext/wikitext-2-v1/1.0.0/5de6e79516446f747fcccc09aa2614fa159053b75909594d28d262395f72d89d
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/load.py", line 600, in load_dataset
ds = builder_instance.as_dataset(split=split, ignore_verifications=ignore_verifications)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 611, in as_dataset
datasets = utils.map_nested(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 216, in map_nested
return function(data_struct)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 631, in _build_single_dataset
ds = self._as_dataset(
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/builder.py", line 704, in _as_dataset
return Dataset(**dataset_kwargs)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/arrow_dataset.py", line 188, in __init__
self._fingerprint = generate_fingerprint(self)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 91, in generate_fingerprint
hasher.update(key)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 57, in update
self.m.update(self.hash(value).encode("utf-8"))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 53, in hash
return cls.hash_default(value)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/fingerprint.py", line 46, in hash_default
return cls.hash_bytes(dumps(value))
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 361, in dumps
with _no_cache_fields(obj):
File "/home/zeyuy/miniconda3/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/zeyuy/transformers/examples/language-modeling/nlp/src/nlp/utils/py_utils.py", line 348, in _no_cache_fields
if isinstance(obj, tr.PreTrainedTokenizerBase) and hasattr(obj, "cache") and isinstance(obj.cache, dict):
AttributeError: module 'transformers' has no attribute 'PreTrainedTokenizerBase'
```
Yes it shouldn’t fail with older version of transformers since this is only a special feature to make caching more efficient when using transformers for tokenization.
We’ll update this. | [
-0.2237435281,
-0.2012972683,
-0.0425651111,
0.0305602606,
0.0355135016,
-0.0190251619,
-0.1124797463,
0.245201692,
-0.0017795414,
-0.2171768695,
0.2775868773,
0.3542348444,
-0.2464633286,
0.2480498254,
0.3240588605,
-0.2881515622,
0.0670980141,
0.3065454066,
-0.2375207841,
0.0527713001,
-0.0257974789,
0.4857281744,
-0.2306239903,
0.3760010302,
-0.0834266543,
0.0187895242,
0.0690762997,
0.1313542128,
-0.2909764647,
-0.5385345221,
0.6074132323,
-0.0616440214,
0.0506869964,
0.1241365448,
-0.0001099683,
0.0639229566,
0.4524998367,
0.0468346104,
-0.2115154266,
-0.5344375968,
-0.0258459598,
-0.4982025325,
0.2533907592,
-0.0385188013,
-0.033229176,
-0.1539615244,
0.123736158,
0.2145824581,
0.5887582302,
0.2153976709,
0.2569970191,
0.3765674829,
0.3491863906,
-0.0690732598,
-0.0353674963,
-0.2311027199,
-0.057744287,
0.1733410358,
0.0412600413,
-0.0806533247,
0.2716606557,
0.1918047965,
0.059941411,
-0.0913043916,
0.4098451138,
-0.1614539325,
-0.1533887684,
-0.1004510373,
-0.1068111956,
0.2216579616,
0.2371094972,
-0.3243215382,
-0.407365799,
-0.3286473751,
0.1875767708,
-0.2094339132,
0.4702660441,
0.0190893486,
-0.1861206293,
0.1364383847,
-0.2939094603,
-0.3428579867,
-0.1923739612,
0.2609533966,
0.1397569776,
0.3448454738,
-0.1708794087,
0.1244882196,
0.2748578191,
-0.1763909459,
-0.1969624907,
0.1317259967,
-0.1033228412,
0.3190253079,
-0.1158775613,
-0.0795880333,
-0.1095688939,
0.1889614463,
0.0918164998,
0.0788193196,
-0.197187081,
-0.1108214557,
-0.0314430296,
0.1409584284,
0.2720893621,
0.271759063,
0.177199617,
-0.1189833283,
-0.003281977,
0.1774202883,
0.3424130082,
0.0457537211,
-0.0453366712,
-0.3309017718,
-0.4002842605,
-0.1914319098,
0.1130491346,
-0.1737481505,
-0.3322198093,
-0.0813275278,
0.0919059813,
0.0365610607,
0.1109027863,
0.0576234497,
-0.1283794791,
0.2926148474,
0.023491554,
0.19237867,
-0.3091998696,
0.0621639565,
-0.168789953,
0.1422374994,
-0.2686136663,
-0.1336474419,
0.1294742227,
-0.2735539675,
0.5659919977,
-0.0004659779,
-0.0992646664,
0.0367270485,
-0.2505792379,
0.2368857116,
-0.3236500621,
0.1910730153,
-0.0812206492,
0.2065938413,
-0.0171953142,
-0.0790163353,
-0.2939081788,
-0.1111740172,
-0.3565988541,
-0.2549516857,
-0.1689946949,
0.2119120359,
-0.2016154379,
-0.2702844739,
-0.2804157138,
0.0198177397,
-0.065805316,
-0.3608135283,
-0.0046904907,
-0.2430275679,
-0.1055392772,
-0.0930549204,
0.3117879033,
0.2317884713,
0.0926559567,
-0.3515031934,
-0.235999465,
-0.3532509208,
0.0857137442,
0.2320383191,
-0.2740489542,
0.2068705559,
-0.188013345,
0.1380043626,
0.623500824,
-0.5698124766,
-0.5247966647,
0.2389449328,
-0.1837095618,
0.1494829357,
-0.106520474,
-0.0558203645,
-0.2375169396,
-0.1302373409,
0.0816134661,
0.2938148677,
0.0514479578,
-0.1161219031,
-0.1398785263,
-0.4660893679,
0.2725194395,
0.0642340481,
-0.0643579215,
-0.0080759302,
-0.0887140632,
0.4918845296,
0.1930050105,
0.1258877218,
-0.0510848574,
0.1976207495,
0.0127383079,
0.0237860288,
-0.3512736559,
-0.261854738,
-0.4162874222,
0.153845191,
-0.2101593614,
-0.0290677473,
-0.1326702833,
-0.0577357598,
-0.307828784,
-0.0738362297,
-0.1671572179,
-0.2715991437,
0.1607210338,
0.004829362,
0.2955423892,
0.1357760727,
-0.1708002388,
0.3214162588,
0.0418823585,
0.1781066954,
-0.4866743088,
0.0572807491,
-0.1335321367,
-0.0460232757,
0.1729027331,
0.4731178582,
0.1276734173,
-0.2077902108,
0.0151711889,
0.4866742492,
-0.3806296289,
0.2787310779,
-0.0023202333,
-0.169099614,
0.1775315255,
-0.3081973195,
-0.1312316507,
0.0708649829,
-0.0935906321,
0.0851687565,
-0.0395265073,
0.0182777457,
-0.0922876596,
0.0961841941,
0.2280827314,
0.1834035218,
0.3326525092,
-0.0641416982,
-0.024161268,
-0.1820687354,
0.2553634346,
-0.064981021,
0.1488991976,
-0.1461169124,
-0.0508807525,
-0.0003215298,
0.4923802018,
0.1942919195,
0.0343223512,
0.2010838836,
0.0021547731,
-0.0955142453,
0.0481096059,
0.2638821006,
0.2911491692,
0.2004128098,
-0.0828937292,
0.1889246404,
-0.2043617368,
-0.1494051665,
0.2577416897,
0.1075861827,
-0.0307696518,
0.1811572313,
-0.1337857395,
0.0114307664,
-0.2375520617,
0.0948072374,
-0.0222692303,
0.2971347868,
-0.0851934776,
-0.0681525692,
-0.4806862772,
-0.3239580095,
-0.3728137314,
-0.2951775491,
-0.4344967604,
-0.1589564681,
-0.0362029672,
0.1149726361,
0.0485618114,
0.1649898589,
-0.0553436875,
-0.0526931286,
-0.2266287804,
-0.0447862558,
-0.0267590508,
-0.0484158769,
-0.3857757151,
0.0981594324,
0.1967357248,
0.2448917925,
0.4569404125,
-0.2492744029,
-0.1252939254,
-0.0644191578,
-0.2730084658,
0.1938120723,
-0.2926282883,
0.2592027187,
0.3556086719,
-0.051629059,
-0.2703352869,
-0.335970521,
0.3509817421,
-0.4124707282,
-0.1979118884,
-0.0850105658,
0.0744216889,
-0.0316738449,
-0.2016800195,
-0.308390826,
-0.2766190171,
-0.350523293,
0.3111944199,
0.0806485638,
0.0984397754,
0.4184325039,
-0.0125569142,
0.2687785923,
-0.1211046576,
0.0469820872,
-0.1270651519,
-0.2123742998,
0.0569492877,
-0.3140260279,
-0.3949982226,
0.0235567912,
0.228068158,
0.3154692054,
0.0841655731,
-0.4360539615,
-0.2520711124,
0.0119623151,
0.0624023974,
0.0597274639,
0.3059400618,
0.3308090866,
-0.0383230038,
-0.0712784156,
-0.1615496129,
-0.0299326852,
0.0604028702,
-0.1182853431,
0.3523582518,
0.1799402833,
0.4678027034,
-0.0238170736,
0.6063984036,
0.497233808,
0.0376835614,
0.3265111148,
-0.055383496,
0.3800788522,
-0.1730224043,
-0.2814177275,
0.1348730326,
0.0858372301,
0.5628988743,
0.2523514032,
0.1252412945,
0.1085705608,
-0.2215528637,
-0.1356803477,
-0.2715798616,
-0.1474338174,
-0.1767601222,
0.0646157563,
0.2577385008,
0.1145812944,
-0.0049042404,
-0.0982686281,
-0.2549662292,
0.1954673529,
0.2483880371,
-0.070077464,
0.256475538,
-0.3562459946,
-0.2916221023,
-0.6304401755,
0.0574147999,
0.1644299775,
0.5169091225,
0.0251790434,
0.1565308869,
0.1662593484,
-0.1031454057,
0.3516109884,
0.000263501,
0.0848304629,
0.0737705529,
0.1700430959,
-0.5438057184,
-0.0358664393,
-0.0565551184,
0.3793545365,
0.4606743455,
0.0374551378,
-0.489433527,
-0.398358494,
0.3138809204,
0.1002968773,
-0.1521900147,
-0.0328312479,
-0.1738866866,
-0.2557459474,
-0.2024968565,
-0.202711314,
-0.0166214481,
0.3583608568,
0.0794929862,
0.2841626406,
-0.0602860525,
-0.0682128444,
-0.0234563835,
0.018086914,
0.3587937653,
0.087280564,
0.0495739505,
0.172274068,
0.1227549016,
-0.1981004179,
0.740023911,
0.0505525544,
-0.0480799079,
-0.1474428624,
0.2945863307,
0.1420303285,
0.1440688074,
-0.1842709184,
-0.0590411909,
-0.0059903655,
0.1297979653,
-0.0680280626,
0.111773476,
0.5068563223,
0.0243970901,
-0.0904783756,
-0.2858388722,
0.4987759292,
0.0972945243,
-0.0226619914,
0.2015022337,
0.436865747,
-0.2149178982,
-0.069213286,
-0.1267644763,
0.6320810318,
0.2031876892,
0.1952811778,
0.2906044722,
-0.0395302922,
0.7745856643,
-0.0314161181,
0.0939441174,
-0.2058900446,
0.1828344315,
0.0607477576,
-0.0090649202,
0.0709610805,
0.0198443085,
-0.235997811,
0.3186233938,
0.284871161,
0.0379711017,
0.0395125523,
0.4954657257,
-0.1353969276,
-0.1887604296,
-0.4930030107,
0.1709432602,
-0.1057603732,
0.4444634616,
-0.2025645822,
-0.1389370263,
0.0988434404,
-0.3229314089,
-0.3750477135,
0.1910779178,
0.0030197091,
0.3070126772,
0.257869184,
-0.1358373463,
0.3146024048,
0.1584175527,
0.3754768372,
0.0336968191,
-0.3622068763,
0.0785574317,
-0.3673883975,
-0.3306551874,
0.0866527259,
0.3278809786,
0.206379205,
-0.2947438359,
-0.1042837873,
-0.0982756615,
-0.0447292179,
-0.1725845933,
-0.0250438899,
-0.065838486,
0.0086646359,
-0.0605948456,
-0.226276055,
0.1407764256,
0.1795839667,
-0.2310498953,
0.1475729644,
0.1545288265,
-0.1493361592,
0.0412915647,
0.1740120947,
0.0057265162,
0.0376492925,
0.4061413407,
0.0011240887,
-0.1326887757,
0.5396637321,
-0.0424131081,
-0.0977003574,
-0.2147789001,
-0.1864569485,
0.0674130544,
-0.4528859556,
-0.0735907257,
0.1367860883,
-0.0382697433,
0.106043607,
0.398059994,
0.215078935,
-0.001826724,
0.0810617507,
-0.3938004375,
-0.1874093562,
0.3194370866,
0.0465021878,
0.0337695144,
0.1690133661,
0.2051026374,
0.2865498364,
0.0706636235,
-0.2923620343,
0.0620421991,
-0.0556401238,
0.0586600415,
0.2769286036,
-0.0889651775,
0.1965152621,
-0.0237098522,
0.0793572292,
0.0398357846,
-0.3567084372,
-0.1860346645,
-0.1730071008,
0.0893268064,
0.1075414643,
-0.316268146,
-0.0202369466,
-0.1181699261,
-0.0508418195,
0.0308044814,
0.1071536094,
-0.0029970855,
0.016107142,
0.1494887471,
0.1483575404,
0.229403168,
-0.0439707339,
0.07121104,
-0.0725781769,
0.2434770316,
0.0977991968,
0.0999879092,
0.0849314034,
-0.0207597911,
-0.1256195009,
-0.0588641241,
0.0226384345,
0.1156223193,
0.0954878032,
-0.4526156187,
0.1014504358,
0.0920292735,
0.322809875,
0.4633260369,
-0.1390681416,
-0.0120361671,
0.5312453508,
0.2185305059,
-0.4312899113,
-0.1404995918,
0.2825393379,
0.0634501651,
-0.0981673971,
-0.0459190235,
0.2540197372,
-0.2216083705,
0.1616757512,
0.0993236452,
0.2928044796,
-0.1385904104,
0.3533585668,
0.2495935857,
0.0416502655,
-0.2093860209,
-0.0784087405,
0.1208258122,
0.153065145,
0.2093197107,
-0.2409857661,
0.0943253636,
-0.2601040304,
0.2324688584,
0.0799042284,
-0.1778873503,
-0.2385255694,
0.106000416,
0.2365899831,
0.1821846813,
0.0217417758,
0.517604053,
-0.3201268613,
-0.2440443784,
-0.2050428689,
0.4098782837,
-0.0548686199,
-0.3036976457,
-0.0718474165,
0.0295324475,
-0.1447573453,
-0.0243564136,
-0.0471462235,
-0.2858161032,
0.1595942974,
0.1387341619,
-0.183124736,
-0.400831908,
-0.186846137,
0.1542173922,
0.1049751788,
0.0005532131,
0.4406987429,
0.0576149672,
-0.1394476295,
-0.0410799719,
0.5153030157,
0.4115841389,
0.2033143938,
0.0158383101,
-0.1346998811,
-0.0550060347,
0.1704123914,
-0.1088862866,
0.2642564178,
0.1135404482,
0.2260671109,
0.2382230759,
0.12211667,
-0.1751365364,
-0.0902134553,
-0.0171185266,
-0.027576752,
-0.2278971076,
0.5078482032,
-0.1254353225,
0.0195975378,
-0.2496705353,
0.3484073877,
-0.4911294281,
0.2872688174,
0.0201305337,
0.1537851691,
-0.0028903671,
-0.3591858745,
0.1024304554,
0.0927830338,
0.6194454432,
0.2813882232,
-0.0012828233,
0.0550917052,
-0.4740493894,
-0.8101555109,
-0.0658306479,
-0.2427225262,
-0.1460844278,
0.2658504248,
-0.0475194715,
-0.4228573143,
0.2343582809,
0.0174474642,
0.1221749038,
-0.0555880293,
-0.2802508771,
-0.1547044516,
0.0316941328,
0.0309368633,
-0.032714285,
0.1080193371,
-0.1543243527,
0.0754258931,
-0.2737427354,
-0.0130390041,
0.0053770114,
0.3522350192,
-0.2582992613,
-0.2497132868,
0.2114717066,
0.2341265678,
0.0735752285,
0.0139635801,
-0.1919139028,
-0.4351012409,
-0.2454767227,
-0.3566770554,
0.3163404465,
-0.1414400935,
0.3165802956,
-0.2512360513,
-0.0709539205,
-0.3026756644,
0.0841599256,
0.0305440575,
0.0684218407,
-0.1361842453,
0.088837795,
-0.1519756615,
0.107005775,
0.2659898996,
0.4927130044,
-0.1406166852,
0.137858212,
-0.3419420719,
-0.3935499489,
0.4931778014,
-0.5361912847,
-0.0375506543,
0.0461832397,
0.2863284349,
0.1845116913,
-0.3383825719,
-0.6405608654,
0.0665392578,
0.1711781621,
0.1276187599,
0.0320477635,
0.2320661247,
-0.1947587132,
-0.1220138595,
0.062313199,
0.3027657568,
0.0421601646,
-0.3858789504,
0.3996356726,
-0.175116986
] |
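The comment on issue 598 above points at a version guard: the fingerprinting helper only needs to special-case transformers tokenizers when `PreTrainedTokenizerBase` actually exists. Below is a minimal, hypothetical sketch of such a guard, assuming transformers is installed — the name `_no_cache_fields` mirrors the traceback, but this is not the real nlp/datasets source, and the `cache` handling is an assumption for illustration.
```python
# Hypothetical sketch: degrade gracefully when the installed transformers release
# is old enough that PreTrainedTokenizerBase does not exist yet.
from contextlib import contextmanager

import transformers as tr


@contextmanager
def _no_cache_fields(obj):
    # getattr returns None on older transformers releases instead of raising AttributeError
    tokenizer_base = getattr(tr, "PreTrainedTokenizerBase", None)
    if (
        tokenizer_base is not None
        and isinstance(obj, tokenizer_base)
        and isinstance(getattr(obj, "cache", None), dict)
    ):
        original_cache = obj.cache
        obj.cache = {}  # keep the tokenizer cache out of the fingerprint hash
        try:
            yield
        finally:
            obj.cache = original_cache
    else:
        yield
```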
https://github.com/huggingface/datasets/issues/597 | Indices incorrect with multiprocessing | I fixed a bug that could cause this issue earlier today. Could you pull the latest version and try again ? | When `num_proc` > 1, the indices argument passed to the map function is incorrect:
```python
d = load_dataset('imdb', split='test[:1%]')
def fn(x, inds):
print(inds)
return x
d.select(range(10)).map(fn, with_indices=True, batched=True)
# [0, 1]
# [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
d.select(range(10)).map(fn, with_indices=True, batched=True, num_proc=2)
# [0, 1]
# [0, 1]
# [0, 1, 2, 3, 4]
# [0, 1, 2, 3, 4]
```
As you can see, the subset passed to each thread is indexed from 0 to N, which doesn't reflect their positions in `d`. | 21 | Indices incorrect with multiprocessing
When `num_proc` > 1, the indices argument passed to the map function is incorrect:
```python
d = load_dataset('imdb', split='test[:1%]')
def fn(x, inds):
print(inds)
return x
d.select(range(10)).map(fn, with_indices=True, batched=True)
# [0, 1]
# [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
d.select(range(10)).map(fn, with_indices=True, batched=True, num_proc=2)
# [0, 1]
# [0, 1]
# [0, 1, 2, 3, 4]
# [0, 1, 2, 3, 4]
```
As you can see, the subset passed to each thread is indexed from 0 to N, which doesn't reflect their positions in `d`.
I fixed a bug that could cause this issue earlier today. Could you pull the latest version and try again ? | [
-0.431897223,
-0.3361008763,
-0.1858977526,
0.2841145396,
-0.2439695597,
-0.0441067144,
0.4962971807,
0.0556305423,
0.2259727567,
0.3695326149,
-0.0362008996,
0.3468486369,
0.1309848577,
0.1321682334,
-0.2639610171,
0.1251573116,
-0.042551145,
0.0501094908,
-0.1806093454,
-0.1643417776,
-0.3834482133,
0.2268319577,
-0.494538486,
-0.1573975235,
-0.4757075608,
-0.1956424415,
-0.075905472,
0.1099577695,
0.3210601807,
-0.1605068594,
0.0099580735,
0.022106519,
-0.0914489925,
0.7085776329,
-0.0001031148,
-0.1593159586,
0.0751708895,
-0.1524420083,
0.1996047646,
-0.0002129227,
-0.0909555778,
0.1350020617,
0.2246319056,
-0.2731077075,
0.0537286848,
0.0532296188,
-0.0010817414,
-0.536521554,
0.1882873923,
0.114147231,
0.2736754417,
0.2315299511,
-0.1976777464,
0.1363343447,
-0.0576328896,
0.0317329206,
0.0174952522,
0.1988241076,
0.5325115919,
-0.0010721274,
-0.0761029571,
0.2661800981,
-0.2675153017,
0.1340355426,
-0.0550686941,
0.242673859,
0.191054225,
-0.5292802453,
0.0899394304,
0.0624272861,
0.2397210151,
-0.0146803837,
-0.3417313099,
-0.167034179,
-0.0660365373,
-0.2750048339,
-0.1892736405,
0.130120635,
0.0398403928,
-0.1010044068,
-0.3078898489,
0.2415544838,
0.1040982753,
-0.0423950404,
0.1155257672,
0.2264727056,
0.1334076226,
0.0960729718,
0.1119296253,
0.0600201823,
0.0332671963,
0.0170927905,
0.2272538841,
0.0101898089,
-0.222809732,
0.0659201443,
0.1626020074,
-0.1847362369,
-0.2487387061,
-0.0327616334,
-0.0716660395,
0.1557405591,
0.0264387205,
0.1709656268,
0.2026148587,
-0.1219341978,
-0.0715442598,
0.0419614911,
0.3141615391,
-0.1366680264,
-0.2969742715,
-0.0143849421,
0.2374505848,
-0.2750280499,
0.1658841074,
0.2358230501,
-0.1792281866,
0.0275975466,
-0.341396302,
0.0850764215,
-0.0593764968,
-0.104961127,
0.0169432238,
0.1543861032,
-0.0021363385,
0.2345192581,
0.1239192635,
0.4017364979,
-0.4142127335,
0.1561332345,
-0.3718008995,
-0.108310312,
-0.0729490817,
0.1471422017,
0.1215671599,
0.0886530131,
0.237752974,
0.2813609242,
0.2871544063,
-0.2365334481,
0.0768127218,
-0.2243104875,
0.2108921558,
0.1928569824,
0.2893157601,
0.2574093938,
0.1592211425,
-0.0874204934,
-0.1630048454,
0.2510742843,
-0.2757888436,
-0.0819020942,
-0.1048828438,
0.2451404482,
0.3177550435,
0.0950774699,
-0.3571496606,
0.0302920267,
0.3217177391,
0.0474877954,
-0.0547760539,
-0.2276132703,
-0.2643951476,
-0.2511070073,
0.136369437,
0.2460227162,
-0.1555966735,
0.131892994,
0.0226904899,
0.1686577797,
0.2406099141,
0.3759451807,
0.0269460082,
-0.0760059059,
-0.1439496577,
0.4667577446,
-0.1113359854,
-0.4037055075,
-0.1221789494,
0.3217079937,
-0.1662311256,
-0.0430529602,
0.1146674454,
0.1156927422,
0.391544193,
0.0218701474,
0.5178160667,
0.1550717354,
-0.0904083401,
0.3614135683,
-0.2356897593,
0.0310661197,
-0.0882616341,
0.2377883494,
0.1266496629,
-0.1160870567,
-0.1164999157,
-0.7145777941,
0.5741791129,
0.0097649749,
0.0703198984,
0.2118833214,
-0.0618281364,
0.3727455437,
-0.0360420682,
-0.3215183318,
0.0664665401,
0.191087082,
0.1904695332,
0.1615993083,
0.0930097252,
-0.2018200755,
-0.0757165775,
0.0433043018,
-0.0996578634,
-0.0145301763,
0.2032821029,
0.0761815012,
-0.2731062174,
-0.0606172457,
-0.2873535156,
0.1573040783,
0.2398827374,
-0.1844335198,
-0.3069195449,
0.2067868263,
-0.2794632018,
-0.4271340966,
0.0233320147,
0.2178395689,
0.3047003746,
-0.0238304138,
0.1050536036,
0.3498241007,
0.1846111566,
-0.0830979645,
0.0375300162,
0.1117563322,
0.1906623244,
-0.2784384787,
0.1273005456,
0.0858667195,
-0.0252231732,
-0.1873189956,
0.0788249969,
0.3482651114,
-0.5426025391,
0.2951743007,
-0.0744439811,
-0.0522121303,
0.1970886886,
0.0442084745,
0.1586543918,
-0.2298736274,
-0.0192228854,
0.1169385985,
-0.0048678145,
-0.0748981908,
-0.4469332695,
0.3034744263,
0.1718004346,
0.2147852182,
0.0117490105,
0.0673648268,
-0.1435045451,
0.0583521575,
-0.0029325318,
-0.1886031479,
0.1613482833,
0.3425672948,
0.0278087147,
-0.1712355912,
0.0145634674,
-0.19940795,
0.0625372007,
-0.0080838613,
0.2212474793,
0.2412867248,
-0.0798237324,
-0.136154741,
-0.1645223349,
-0.1607880741,
-0.0919125974,
-0.0036567729,
-0.3200867176,
-0.0344344862,
-0.2061437964,
0.1468796432,
-0.080696173,
-0.3674124777,
0.1646693945,
-0.2895932198,
0.063726455,
0.4193496108,
-0.0656612068,
0.1587557942,
-0.0810861588,
-0.0964360833,
-0.1005368158,
0.097990267,
0.063027069,
-0.1917269081,
-0.1016530097,
0.1198315844,
0.2083981782,
0.2010071278,
0.1114134565,
-0.0428648964,
-0.2291793674,
0.064368993,
0.1898298264,
-0.2369539738,
-0.1981433183,
0.2363655716,
-0.2259236425,
0.2451671958,
-0.0284199528,
-0.4693145156,
0.085132435,
-0.1445761472,
-0.0585235953,
0.0074146315,
0.0539937839,
-0.2731428742,
-0.1177037954,
-0.2989676893,
-0.3862083554,
-0.1980377883,
0.2820742428,
0.052722156,
0.2734356225,
-0.0726148412,
0.0076206252,
-0.3040594459,
0.4685652852,
0.1552219391,
-0.3286645412,
-0.0194328576,
0.1838422716,
-0.2203139067,
-0.1887142956,
-0.0422975197,
-0.0904347077,
-0.0526838116,
-0.1176268309,
-0.213793844,
0.0841790661,
-0.1675971448,
0.1001663655,
0.0828329399,
0.3064702451,
0.3374304771,
0.1993238032,
-0.2635639012,
-0.0856797993,
-0.3538396657,
0.0745084211,
0.128910616,
-0.1440955102,
0.0223595053,
0.3712634146,
-0.223084867,
0.2971280515,
0.4055191875,
0.0535829812,
0.3756560087,
-0.1485210061,
-0.1126590818,
-0.1953378618,
-0.3681216836,
-0.0093422905,
-0.1500665694,
-0.2315350175,
0.2093336284,
-0.0445222184,
-0.4889574945,
-0.0024670362,
0.2656967044,
-0.8140268326,
-0.210798502,
0.1814646125,
-0.1368274987,
0.0374779254,
-0.0845679194,
0.0923027396,
-0.2396162301,
-0.1102340668,
-0.0795103312,
-0.298556596,
0.2515532374,
-0.3308256865,
0.0888075233,
-0.1440134943,
-0.3974335492,
0.2931147814,
0.0634498745,
0.2471881062,
-0.0068139806,
-0.1064688265,
0.0457208455,
-0.0695266724,
0.6664329767,
-0.3575477898,
-0.0807214677,
0.3131142557,
-0.3457058668,
-0.3597399592,
-0.1085786447,
-0.2446024269,
0.4223020673,
0.2490155697,
0.1651693881,
-0.1902460903,
0.0813659653,
0.3031410575,
-0.0155176986,
-0.088672772,
-0.1500975639,
-0.3474946022,
0.0120633319,
-0.1366433352,
0.10322427,
0.1718510985,
0.1410548538,
-0.2319597304,
-0.2071466148,
0.1057456434,
-0.0937179849,
0.0874393284,
0.1244039685,
0.3212255538,
0.2401575595,
0.1785145551,
0.2920943797,
-0.1577495188,
0.3126630485,
0.222532481,
-0.2178101689,
0.3429151475,
0.1411984563,
-0.4355605841,
0.260258317,
0.2999693155,
0.2798712254,
-0.1091568694,
0.0233675055,
0.303096205,
-0.3033765554,
-0.3331573606,
0.379098624,
-0.0484519154,
0.1102814004,
-0.335311383,
0.3447390199,
0.1321624517,
-0.05871889,
0.5077547431,
-0.0573801994,
-0.1065008268,
0.4608895183,
0.2137581259,
0.7910373807,
-0.187899068,
-0.0348219797,
-0.1420484185,
0.0561312363,
-0.1971439421,
0.0901201665,
0.2725309134,
-0.546135366,
-0.4393996596,
0.0830270424,
-0.1587143093,
-0.4350051582,
0.2416203171,
-0.168701306,
0.1569957882,
-0.1435948908,
-0.405359745,
-0.332461834,
0.0233392045,
0.1497025192,
-0.1628325135,
0.1608373374,
0.2037874758,
0.2574537694,
-0.1328362077,
-0.0363435186,
0.1462357044,
-0.2572633624,
0.0861457363,
-0.512928009,
-0.1046797633,
-0.2424634993,
0.4165170789,
-0.0738144368,
-0.0363195688,
-0.2102034986,
0.2446127236,
0.0141951442,
-0.1521611214,
0.0803565755,
-0.0889952034,
0.3105019033,
0.2046588808,
0.2126338184,
-0.0884663761,
0.2994721532,
-0.0039054304,
-0.4743807614,
-0.1928718388,
0.1596225053,
-0.3719164729,
-0.0219207257,
0.3371341825,
-0.1180268526,
0.0194585584,
-0.2655510306,
0.0873411745,
0.0585139915,
-0.1877588332,
0.200604856,
0.0517515838,
-0.0782903433,
0.7023659945,
-0.1270115972,
-0.2312024683,
-0.0480042659,
0.4856058061,
0.235204652,
-0.0405893251,
0.1515126228,
0.3260003328,
-0.3293744326,
-0.2541390061,
0.0042923987,
0.0762187243,
0.0773218349,
0.1749635041,
-0.1473270357,
-0.1165868491,
0.2391573042,
0.2125855833,
0.0134926615,
0.3130643964,
-0.2659535408,
-0.2406715751,
-0.1655675769,
0.2730347216,
0.0203963909,
0.1812419742,
-0.0341895968,
-0.0327734984,
0.0266868845,
0.1719718128,
-0.4117129445,
0.0941282585,
-0.339902848,
0.1016599089,
0.0688778609,
-0.1732462943,
0.0709695891,
0.0608964376,
0.23256661,
0.1283654124,
0.041657906,
-0.2969101667,
0.0495160818,
0.0432571471,
-0.1128756553,
-0.1817081571,
-0.0096664988,
-0.0501001403,
-0.30248487,
-0.3113754392,
0.0393275172,
0.2253237367,
0.0662995279,
0.0374972336,
0.4365173578,
-0.0279011391,
0.037096329,
0.0683793277,
-0.15656735,
0.2994941473,
-0.0888125449,
0.0638458431,
0.1475528926,
-0.121896781,
0.0224345066,
-0.0766596198,
0.0941207111,
0.0960460901,
0.1418559253,
-0.3297685683,
-0.1255578995,
0.3771895468,
0.6326012611,
0.1965839565,
0.0646506622,
0.0106701329,
-0.2082909495,
0.3497163951,
-0.1520094573,
-0.0465256907,
0.3921127915,
0.084954381,
0.089450866,
0.2458792627,
0.0966356769,
0.1112937033,
-0.025035318,
0.1211650372,
0.1282775253,
-0.4195346832,
0.0072612278,
-0.2902779877,
0.110340178,
0.0754345059,
0.1623440087,
0.0146115217,
0.2888442278,
0.451672256,
0.1128158569,
0.287920177,
0.0965843797,
-0.243103072,
-0.0861928314,
-0.3390609026,
0.0753160864,
-0.1352416128,
0.0778426901,
0.1244156212,
0.208372727,
0.0993632302,
-0.4265169799,
-0.1266115755,
-0.2195393443,
0.2401001751,
-0.470462501,
-0.0629105568,
-0.0076146424,
-0.0582301319,
-0.0885784924,
-0.1483084559,
-0.067690298,
0.1363839656,
0.4371733069,
0.2658163607,
-0.0693714917,
-0.2387351394,
-0.2568921447,
-0.2201755196,
0.4944888353,
-0.1979592741,
0.0106242131,
0.3356946111,
0.2972984612,
-0.0094713951,
0.4283318222,
0.4837701023,
0.2592352629,
-0.5519337058,
0.0509172045,
0.03970927,
-0.1065052003,
-0.1573694348,
0.0114661437,
-0.0606249012,
0.374830246,
0.1408799589,
0.3006328344,
-0.2232874036,
0.0280752145,
0.3141002953,
0.2852794826,
0.2101731449,
-0.1244816929,
-0.1865978837,
-0.3570147157,
-0.0979556814,
-0.2014215589,
-0.2138443887,
0.0823479742,
0.2129534036,
-0.4234229028,
0.0401819311,
0.0786066353,
0.1550251544,
0.1217678636,
0.2978002131,
0.0791388303,
0.3265963197,
-0.3561429381,
-0.1397107244,
-0.4843373299,
0.1872499883,
0.0585229956,
-0.0798806101,
-0.0666164532,
0.5402232409,
0.0532770678,
0.0000192542,
0.1980646104,
-0.1719198078,
0.2607089579,
0.7007671595,
-0.3734978437,
0.2200196683,
-0.2280752808,
-0.1617433578,
0.1120652184,
-0.1107743755,
0.0504587702,
-0.3848444223,
0.2862800658,
-0.0855060518,
-0.063539058,
0.0800916702,
0.455542922,
0.3495668769,
-0.0937796161,
0.4544937909,
-0.3589241803,
0.006167151,
0.0604164116,
-0.1165306717,
-0.2515925765,
0.0916561931,
0.0739609003,
0.4379524887,
-0.068891339,
-0.0576680042,
-0.1749538481,
0.0739881545,
-0.0019079074,
-0.0922779664,
-0.253375411,
0.0331605226,
0.0801899806,
-0.111574173,
0.0184371844,
0.320723474,
-0.0257729553,
0.3453641534,
-0.0797881931,
-0.4359317124,
0.313523531,
-0.1023563594,
-0.3047312498,
-0.4569964409,
0.3395486772,
-0.1360757798,
0.1617170721,
-0.0694947541,
0.0883441567,
0.1941782236,
0.0357638709,
-0.3385346532,
-0.0157030877,
-0.0470939353,
-0.1415555626,
-0.1656773984,
0.0031906907,
0.1155248433,
-0.2539276481,
-0.2057241797,
-0.204240337
] |
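The bug report in the record above (issue 597) shows each worker receiving indices that restart at 0. A minimal sketch of the offset idea — assuming contiguous, evenly split shards, which is a simplification of how the library actually shards — illustrates how local batch indices could be mapped back to global row positions:
```python
# Illustrative only: map per-shard (local) indices back to global dataset positions.
from typing import List


def shard_offsets(total_rows: int, num_proc: int) -> List[int]:
    """Start position of each contiguous shard under a simple equal split."""
    base, remainder = divmod(total_rows, num_proc)
    sizes = [base + (1 if rank < remainder else 0) for rank in range(num_proc)]
    offsets, start = [], 0
    for size in sizes:
        offsets.append(start)
        start += size
    return offsets


def to_global_indices(local_indices: List[int], rank: int, offsets: List[int]) -> List[int]:
    """Shift a worker's local batch indices by the number of rows before its shard."""
    return [offsets[rank] + i for i in local_indices]


# With 10 rows and num_proc=2, the second worker's local batch [0, 1, 2, 3, 4]
# corresponds to global rows [5, 6, 7, 8, 9].
assert to_global_indices([0, 1, 2, 3, 4], rank=1, offsets=shard_offsets(10, 2)) == [5, 6, 7, 8, 9]
```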
https://github.com/huggingface/datasets/issues/597 | Indices incorrect with multiprocessing | Still the case on master.
I guess we should have an offset in the multi-procs indeed (hopefully it's enough).
Also, as a side note, we should add some logging before the "test" to say we are testing the function; otherwise it's confusing for the user to see two outputs, I think. Proposal (see the "Testing the mapped function outputs:" lines):
```
>>> d.select(range(10)).map(fn, with_indices=True, batched=True, num_proc=2)
Done writing 10 indices in 80 bytes .
Done writing 5 indices in 41 bytes .
Done writing 5 indices in 41 bytes .
Spawning 2 processes
Testing the mapped function outputs:
inds: [0, 1]
inds: [0, 1]
Testing finished, running the mapped function on the dataset:
#0: 0%| | 0/1 [00:00<?, ?ba/s]
inds: [0, 1, 2, 3, 4] inds: [0, 1, 2, 3, 4] | 0/1 [00:00<?, ?ba/s]
#0: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 1321.04ba/s]
#1: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 1841.22ba/s]
Concatenating 2 shards from multiprocessing
Dataset(features: {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=2, names=['neg', 'pos'], names_file=None, id=None)}, num_rows: 10)
``` | When `num_proc` > 1, the indices argument passed to the map function is incorrect:
```python
d = load_dataset('imdb', split='test[:1%]')
def fn(x, inds):
print(inds)
return x
d.select(range(10)).map(fn, with_indices=True, batched=True)
# [0, 1]
# [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
d.select(range(10)).map(fn, with_indices=True, batched=True, num_proc=2)
# [0, 1]
# [0, 1]
# [0, 1, 2, 3, 4]
# [0, 1, 2, 3, 4]
```
As you can see, the subset passed to each thread is indexed from 0 to N, which doesn't reflect their positions in `d`. | 163 | Indices incorrect with multiprocessing
When `num_proc` > 1, the indices argument passed to the map function is incorrect:
```python
d = load_dataset('imdb', split='test[:1%]')
def fn(x, inds):
print(inds)
return x
d.select(range(10)).map(fn, with_indices=True, batched=True)
# [0, 1]
# [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
d.select(range(10)).map(fn, with_indices=True, batched=True, num_proc=2)
# [0, 1]
# [0, 1]
# [0, 1, 2, 3, 4]
# [0, 1, 2, 3, 4]
```
As you can see, the subset passed to each thread is indexed from 0 to N, which doesn't reflect their positions in `d`.
Still the case on master.
I guess we should have an offset in the multi-procs indeed (hopefully it's enough).
Also, as a side note, we should add some logging before the "test" to say we are testing the function; otherwise it's confusing for the user to see two outputs, I think. Proposal (see the "Testing the mapped function outputs:" lines):
```
>>> d.select(range(10)).map(fn, with_indices=True, batched=True, num_proc=2)
Done writing 10 indices in 80 bytes .
Done writing 5 indices in 41 bytes .
Done writing 5 indices in 41 bytes .
Spawning 2 processes
Testing the mapped function outputs:
inds: [0, 1]
inds: [0, 1]
Testing finished, running the mapped function on the dataset:
#0: 0%| | 0/1 [00:00<?, ?ba/s]
inds: [0, 1, 2, 3, 4] inds: [0, 1, 2, 3, 4] | 0/1 [00:00<?, ?ba/s]
#0: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 1321.04ba/s]
#1: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 1841.22ba/s]
Concatenating 2 shards from multiprocessing
Dataset(features: {'text': Value(dtype='string', id=None), 'label': ClassLabel(num_classes=2, names=['neg', 'pos'], names_file=None, id=None)}, num_rows: 10)
``` | [
-0.3910767138,
-0.4112409949,
-0.1836056411,
0.2711613178,
-0.3301049471,
-0.1160224304,
0.5005038977,
0.1166042238,
0.106039539,
0.4203521609,
0.0210540965,
0.3124268651,
0.0579330511,
0.1175344214,
-0.2667939365,
0.2599062026,
-0.0655997097,
0.0601688921,
-0.3335292339,
-0.1833897084,
-0.3181263804,
0.2125172019,
-0.4629001021,
-0.1326190531,
-0.3110151589,
-0.2076472193,
-0.0664365217,
0.1109295338,
0.240282461,
-0.1356408149,
0.0628128573,
-0.0520632491,
-0.2454712987,
0.8725661039,
-0.0001055827,
-0.1686011553,
0.1327813715,
-0.1767239571,
0.2557115257,
-0.0440283865,
-0.1471694857,
0.1887272,
0.045694232,
-0.3676325679,
-0.0554585047,
0.0771234781,
-0.0901450068,
-0.5626958013,
0.071364522,
0.1461566985,
0.2426275313,
0.0921505839,
-0.2653032839,
0.1799373925,
-0.0514844768,
0.000001248,
0.0786735788,
0.1380770952,
0.3338857293,
0.0206264257,
-0.1120761558,
0.2574878931,
-0.1998975873,
0.1658311337,
0.0479152799,
0.1884691417,
0.4429796636,
-0.4548017085,
0.0140016563,
0.1714255214,
0.076501891,
-0.0286794566,
-0.2724148035,
-0.2252687663,
-0.2483910471,
-0.2456668764,
-0.0806914419,
0.0676275119,
-0.0155445263,
-0.1526167095,
-0.4177742302,
0.2804912925,
0.06227009,
0.0412675738,
-0.0223893449,
0.2801385224,
0.1688674986,
0.1247981489,
0.2155302316,
-0.0031882194,
-0.0951530635,
-0.052831918,
0.2153054178,
0.071430549,
-0.1718954444,
-0.0526898392,
0.2076423317,
-0.2216392308,
-0.2764875591,
0.0082332715,
-0.1167066842,
0.1178073287,
-0.0458208323,
0.1933469474,
0.2213090807,
0.0341466479,
-0.0692187026,
0.1834474504,
0.2160072476,
-0.0012567528,
-0.3653994203,
0.1079142094,
0.3758713603,
-0.292691797,
0.2313122451,
0.3381736279,
-0.3253426552,
-0.1112418249,
-0.2383358479,
0.1973326802,
-0.1021323055,
-0.1534311622,
0.0871515125,
0.1127256453,
-0.0100678615,
0.2547236383,
0.0523728803,
0.2664979994,
-0.4644142687,
0.2895244956,
-0.286221683,
-0.0766085684,
-0.2670092881,
0.1258991659,
0.0665819943,
0.0638547689,
0.1573703289,
0.2249408364,
0.4051100016,
-0.1377280802,
0.0269696936,
-0.2504668236,
0.4778962731,
0.1375292838,
0.1411996931,
0.1892671883,
0.0875437632,
-0.0034042895,
-0.2521339953,
0.2119445652,
-0.1446779519,
-0.0629124194,
0.0058159344,
0.2167234868,
0.1516548991,
0.3152198792,
-0.2772167921,
0.1823466271,
0.328409493,
-0.1004111618,
0.0519218296,
-0.2757974863,
-0.3509440422,
-0.2752366066,
0.2030744702,
0.1950678527,
-0.0062629431,
0.1461379826,
0.1258711666,
0.2690039277,
0.2144623399,
0.3467644155,
0.0196745396,
-0.1099931449,
-0.1407661438,
0.3324012458,
-0.1847784519,
-0.312807709,
-0.1643875837,
0.3284080923,
-0.1849719882,
0.0688833296,
0.1857280731,
-0.0957636684,
0.3820740283,
-0.0666146874,
0.4744655192,
0.1066798568,
-0.1397287101,
0.3166437447,
-0.3250190616,
0.0013278797,
-0.0399941728,
0.1066810638,
0.1451596618,
-0.2366894931,
-0.085972026,
-0.6445000768,
0.5220373869,
-0.0885258466,
0.0595650598,
0.1397807747,
-0.0939242691,
0.203567639,
-0.0551491305,
-0.3799473941,
0.1289733648,
0.2552981973,
0.0092976168,
0.2037032843,
0.0678793788,
-0.1114591807,
0.0459929891,
-0.045670405,
-0.0376254022,
-0.0912924856,
0.1730395854,
0.0572231896,
-0.2121398151,
-0.0726203099,
-0.1362882555,
0.129544735,
0.2848338187,
-0.2879863679,
-0.3537177145,
0.082953237,
-0.2009757459,
-0.2908988595,
-0.0033921674,
0.316167593,
0.3665757179,
-0.108189702,
0.0095773973,
0.2655246556,
0.2048650235,
-0.0258752666,
0.0107744187,
0.2131206393,
0.236767292,
-0.3246811628,
0.1559292078,
0.2504885793,
-0.0555101521,
-0.1659860313,
-0.0835675746,
0.4199189246,
-0.496835351,
0.2824420929,
-0.1163878813,
-0.1537791193,
0.0523726642,
0.0249148384,
-0.0073139258,
-0.202061981,
0.0603292994,
0.1109677106,
-0.1008908525,
-0.0723247603,
-0.2344713658,
0.2854291499,
0.1713830233,
0.0873316824,
-0.0693240389,
-0.0081383837,
-0.0807935819,
-0.0266586877,
0.1020047143,
-0.2638393641,
0.1642605215,
0.3359587491,
0.1982011795,
-0.1459436268,
-0.0341507159,
-0.0403983817,
0.1023054868,
-0.084420234,
0.2667042911,
0.204846248,
-0.0976210833,
-0.1244318783,
-0.0333174914,
-0.1147603169,
-0.0911622718,
-0.0124762878,
-0.3146418333,
-0.0698380023,
-0.1532041132,
0.2369306833,
-0.0572202429,
-0.3602384329,
0.1795164645,
-0.3023766279,
-0.0132760899,
0.4150727689,
-0.0780607015,
0.1847171187,
-0.0846397728,
-0.1025813669,
-0.1331315637,
-0.0831396654,
0.0670248568,
-0.1013769209,
-0.0591904335,
0.1163099557,
0.1773074269,
0.2144095153,
0.1332990229,
0.0940610096,
-0.1922878325,
0.1254297793,
0.0153618753,
-0.2188110352,
-0.271021992,
0.1883388311,
-0.1709579229,
0.0261897035,
-0.0233629644,
-0.328813225,
0.1242882386,
-0.2972333133,
-0.0278342031,
-0.1320940554,
-0.0146278869,
-0.2380352914,
-0.1538008749,
-0.1921209991,
-0.2937822938,
-0.1396251619,
0.3042711616,
0.1084468812,
0.3475037217,
-0.1840419322,
0.0001091808,
-0.2826233506,
0.3912386298,
0.0776480585,
-0.4431109428,
-0.1414066553,
0.1247887388,
-0.2343485355,
-0.1426166892,
-0.1725438833,
-0.1278420836,
-0.1158871576,
-0.0639463812,
-0.2264843434,
0.0080261156,
-0.1173777878,
0.1169522777,
0.0998290777,
0.3103835583,
0.3031775951,
0.2281727791,
-0.244468689,
-0.0598861277,
-0.4934155941,
0.0906018019,
0.2810758054,
-0.1371043921,
0.0189824179,
0.240350455,
-0.1708364785,
0.3510895371,
0.533847332,
-0.05047784,
0.1747779995,
-0.2027626783,
-0.0765641481,
-0.0659674853,
-0.2839648724,
0.1603351384,
-0.0037911609,
-0.2403053045,
0.1769824624,
-0.0580699109,
-0.5434803963,
0.0354698226,
0.2663355172,
-0.712656796,
-0.3004767895,
0.2838755846,
-0.3026226759,
-0.0811359212,
-0.1143256873,
0.0966879129,
-0.3498641253,
-0.0278235804,
-0.0748817846,
-0.4129069448,
0.2545217276,
-0.3162635267,
0.2005376518,
-0.1877774149,
-0.4116970599,
0.3155353367,
0.2320747674,
0.145292446,
0.0548223555,
-0.0290686525,
-0.035631422,
-0.0862537473,
0.6001464725,
-0.5158305764,
-0.0338332914,
0.3574091196,
-0.3718165159,
-0.3988596201,
-0.0688084364,
-0.1937642992,
0.4010442495,
0.101646401,
0.1860633641,
-0.2378630936,
0.0766691566,
0.2436867803,
-0.0697648302,
-0.0611627772,
-0.1438084096,
-0.2677904367,
0.0517565012,
-0.0405487977,
0.2658188045,
0.0788354576,
0.1630345136,
-0.1532446444,
-0.1279493123,
0.1677361578,
0.0161563382,
0.1190768182,
0.181191504,
0.2898450196,
0.2418311983,
0.0977224261,
0.3397428691,
-0.1506508887,
0.1656903774,
0.2621914446,
-0.2122151405,
0.5130614042,
0.2099908739,
-0.341871053,
0.2145705521,
0.304305315,
0.2497878373,
-0.2347815335,
-0.0134933684,
0.3976300657,
-0.3324230015,
-0.3621890843,
0.2209202051,
0.0979547352,
0.0677654147,
-0.3353710175,
0.1729924828,
-0.0006702542,
-0.1629149318,
0.5282339454,
-0.1989284456,
-0.1802894175,
0.4125847518,
0.3348653615,
0.7906063795,
-0.2270302176,
-0.0294164792,
-0.1285952181,
0.1680883765,
-0.2251195014,
-0.0527224354,
0.2099867463,
-0.3576921225,
-0.3432036638,
0.0697159171,
-0.1616351753,
-0.4296001196,
0.1389190406,
-0.1863939911,
0.1940373182,
0.0250120908,
-0.2598969936,
-0.3387441635,
0.005012691,
0.4230814278,
-0.1667318046,
0.1342788637,
0.2063210607,
0.4948374331,
-0.1253346503,
0.0594343878,
0.1302827299,
-0.3505457044,
0.1049799323,
-0.4203339219,
-0.0742863193,
-0.3331217468,
0.2314802855,
-0.2039553672,
-0.0209639296,
-0.1774665862,
0.1904472113,
-0.0350682959,
-0.1221125722,
-0.0045948215,
-0.1546929032,
0.3599348664,
0.3342029154,
0.1538673639,
-0.1454567313,
0.305801928,
0.0406037122,
-0.4359683394,
-0.1341867,
0.134648174,
-0.3190567195,
0.1046993583,
0.2870687246,
-0.2308183014,
0.023336526,
-0.1520987451,
0.1466378272,
-0.0560920089,
-0.0413319431,
0.1845853925,
0.0453629307,
-0.2868357301,
0.8450875282,
-0.1319475472,
-0.4104746878,
-0.0337706134,
0.4405320287,
0.2351766676,
0.0902180076,
0.0502428971,
0.1424870044,
-0.2120416313,
-0.2564302683,
-0.0198031217,
0.1026874632,
0.1887370944,
0.1092246771,
-0.2817034423,
-0.1780261397,
0.2748166919,
0.1344035566,
0.2005607784,
0.413420707,
-0.2734939456,
-0.1706083268,
-0.174300909,
0.084912315,
-0.0985691398,
0.1442487538,
-0.0243016817,
0.0028557442,
0.0284102336,
0.18627958,
-0.424574554,
0.1236379519,
-0.377528131,
0.1838731468,
0.0018024296,
-0.1312130541,
0.1206212044,
-0.0559810176,
0.2369421273,
0.1135866344,
-0.1563913375,
-0.2942230999,
0.019092299,
0.066120103,
-0.0740264729,
-0.1426884532,
-0.0690258741,
0.0293126553,
-0.2474026382,
-0.3549549878,
0.0772439837,
0.128171593,
-0.0082228258,
0.0159808658,
0.4201313257,
-0.1592501104,
-0.1846595705,
0.0934565291,
-0.1307509243,
0.2019438595,
-0.2125239372,
0.1375515461,
0.0694646984,
-0.1816459298,
0.1447717845,
0.0715746135,
-0.021270059,
0.2221684605,
0.2003692985,
-0.1095196158,
-0.0639295205,
0.2514489889,
0.6699185371,
0.1478268355,
-0.0035350956,
0.0018873215,
-0.2060385346,
0.315081656,
-0.0202071927,
0.0604918301,
0.3824722171,
0.094636023,
0.2593531907,
0.3717329502,
0.2219747901,
0.0257652849,
0.0981080458,
0.0108807087,
-0.0218814444,
-0.3711982667,
-0.0489232875,
-0.1159255356,
0.0799679309,
0.1274136007,
0.3494868577,
-0.0602412336,
0.3144952357,
0.429608047,
0.2094266117,
0.3505509496,
0.0553855971,
-0.2268168777,
-0.1631909013,
-0.2393542677,
0.199414432,
-0.0032795221,
0.0517141558,
0.2685457766,
0.2488978356,
0.1601737291,
-0.5418085456,
-0.1194698289,
-0.2211736292,
0.1755753756,
-0.3712430894,
-0.0089359134,
-0.0270309262,
-0.1240658537,
-0.1185610741,
-0.1623262465,
-0.1113406494,
0.1844201386,
0.5847885013,
0.2405045182,
-0.0501870215,
-0.0658197775,
-0.4881672263,
-0.2559198737,
0.557631731,
-0.1734601408,
0.0864133239,
0.3961077034,
0.2808992565,
-0.0493039563,
0.4639005065,
0.4753144085,
0.1773912162,
-0.6370457411,
0.0877153203,
0.1971678138,
-0.0477366596,
-0.2354273349,
0.0636429936,
-0.013459418,
0.2053562254,
0.1832565963,
0.2913366556,
-0.2434186041,
0.098301813,
0.269497335,
0.1083287895,
0.3435382247,
-0.1505184174,
-0.0212079231,
-0.2053918242,
0.0279432386,
-0.1470686346,
-0.1231898963,
0.0789561272,
0.1648572683,
-0.4540511966,
0.2122902125,
0.1642920971,
0.1567546427,
0.1446088105,
0.3897768259,
-0.0588468499,
0.2958665788,
-0.3047596514,
-0.1029703915,
-0.564363122,
0.2311453074,
-0.0094628949,
-0.1113326102,
0.0587620847,
0.4745411277,
0.2023712993,
0.0816899836,
0.2203580588,
-0.0922070816,
0.2018156201,
0.7049713731,
-0.4164031744,
0.2099650502,
-0.2217616439,
-0.1741589606,
0.1488357633,
-0.1768951118,
0.0444813669,
-0.372952193,
0.3136249483,
-0.0362443253,
0.0237342194,
-0.0098149627,
0.4138362408,
0.4208665788,
0.0215058457,
0.4189322293,
-0.4326873422,
-0.0012487918,
0.0622587763,
0.2253863364,
-0.3149060309,
0.0232640784,
0.0010218974,
0.3829726577,
-0.1051051021,
-0.1056711376,
-0.2199727595,
0.0568576902,
0.1212075129,
-0.0350116342,
-0.2539077997,
0.0556142703,
0.0466684252,
-0.0884669721,
-0.0444155484,
0.4599958658,
-0.0279925466,
0.3139515519,
-0.0548091978,
-0.2318792939,
0.2483797371,
-0.1875479072,
-0.2203114927,
-0.3956337571,
0.2779985666,
-0.2484604418,
0.1722340435,
-0.0953582674,
0.0872730613,
0.1940007806,
0.0580678172,
-0.4339936376,
-0.0281169321,
-0.1502596438,
-0.149198696,
-0.1352632791,
-0.0007942468,
0.1186171472,
-0.2021312416,
-0.1844309866,
-0.1668343246
] |
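The proposal quoted in the record above is about announcing the probe call that `map` makes on the first batch before the real pass, so users aren't surprised by seeing their function's output twice. Below is a hedged sketch of that logging idea — the wrapper name and log messages are assumptions for illustration, not the actual datasets code path:
```python
# Illustrative wrapper: log around the probe ("test") call that precedes the real run.
import logging

logger = logging.getLogger(__name__)


def run_with_test_notice(fn, first_batch, first_indices, num_proc):
    """Call `fn` once on the first batch to inspect its output, with explicit logging."""
    logger.info("Spawning %d processes", num_proc)
    logger.info("Testing the mapped function outputs:")
    test_output = fn(first_batch, first_indices)  # probe call; output is only inspected
    logger.info("Testing finished, running the mapped function on the dataset:")
    return test_output
```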
https://github.com/huggingface/datasets/issues/595 | `Dataset`/`DatasetDict` has no attribute 'save_to_disk' | `pip install git+https://github.com/huggingface/nlp.git` should have done the job.
Did you uninstall `nlp` before installing from github ? | Hi,
As the title indicates, both `Dataset` and `DatasetDict` classes don't seem to have the `save_to_disk` method. While the file [`arrow_dataset.py`](https://github.com/huggingface/nlp/blob/34bf0b03bfe03e7f77b8fec1cd48f5452c4fc7c1/src/nlp/arrow_dataset.py) in the repo here has the method, the file `arrow_dataset.py` which is saved after `pip install nlp -U` in my `conda` environment DOES NOT contain the `save_to_disk` method. I even tried `pip install git+https://github.com/huggingface/nlp.git ` and still no luck. Do I need to install the library in another way? | 17 | `Dataset`/`DatasetDict` has no attribute 'save_to_disk'
Hi,
As the title indicates, both `Dataset` and `DatasetDict` classes don't seem to have the `save_to_disk` method. While the file [`arrow_dataset.py`](https://github.com/huggingface/nlp/blob/34bf0b03bfe03e7f77b8fec1cd48f5452c4fc7c1/src/nlp/arrow_dataset.py) in the repo here has the method, the file `arrow_dataset.py` which is saved after `pip install nlp -U` in my `conda` environment DOES NOT contain the `save_to_disk` method. I even tried `pip install git+https://github.com/huggingface/nlp.git ` and still no luck. Do I need to install the library in another way?
`pip install git+https://github.com/huggingface/nlp.git` should have done the job.
Did you uninstall `nlp` before installing from github ? | [
-0.057685446,
0.3064988852,
-0.0244450569,
0.0952456743,
0.1892765462,
0.2053214461,
-0.0553884804,
0.1952709258,
-0.1948142946,
-0.0407714695,
0.1610011309,
0.6358865499,
-0.2991714776,
-0.0846177116,
0.4080446959,
0.0153641468,
0.2040480822,
0.4670259058,
0.072053358,
-0.2329987139,
-0.1809493899,
0.3781038225,
0.0341220796,
0.0287981685,
-0.1356096417,
-0.1028220579,
-0.2556762099,
-0.1338192225,
-0.0067098457,
-0.4790410995,
0.5973632932,
0.0164338015,
-0.1387767792,
0.1500871181,
-0.0001247489,
-0.1039687842,
0.1948116422,
-0.0901650786,
-0.5097925663,
-0.1813153476,
-0.0954520926,
-0.4069553018,
0.4231100678,
-0.3911998868,
0.0021167397,
-0.070933558,
0.3261788189,
-0.0896708146,
0.2776719928,
0.23945041,
0.1396990716,
0.0691262707,
0.2781967819,
-0.0320286155,
-0.0457660556,
0.3639846146,
-0.2870586514,
0.1687532514,
-0.0774379373,
-0.0247768164,
0.2729672194,
0.0327315889,
-0.0990405381,
-0.161460638,
0.3065212369,
0.1500529796,
-0.2726333141,
-0.1285836846,
-0.1275785565,
-0.1237386763,
0.4632945657,
-0.4910502434,
-0.3580474257,
-0.117212534,
0.1763522327,
-0.3220730424,
0.1518508792,
0.2760243118,
-0.2431337684,
0.0617488399,
-0.148297444,
-0.4829069674,
-0.2292246073,
0.3118004501,
0.1502443552,
-0.0112669952,
-0.2004211247,
0.096509099,
0.2891165018,
0.0886986703,
0.0511452332,
-0.0441863574,
0.1211479306,
0.0314179212,
0.1531010568,
-0.1660220027,
-0.0288151465,
-0.011626672,
-0.0508580618,
0.3493903279,
0.1532946825,
-0.2702507079,
-0.0151150376,
0.2592651546,
0.0581007712,
0.2494723946,
0.5383612514,
0.3887924552,
0.4372165501,
-0.1594177783,
0.2229382545,
-0.0370939448,
0.1210598499,
0.015488226,
0.1832455844,
-0.2957259715,
0.2924042344,
-0.3235225081,
-0.2845232785,
0.0290024355,
0.1464636624,
-0.0753198415,
-0.0601791069,
0.2803545296,
0.1070967466,
0.241839096,
0.1395725906,
0.3593184054,
0.0801759362,
-0.0368364155,
-0.1544836015,
0.0541917272,
0.0157965124,
0.1448022425,
0.591796577,
-0.068556726,
0.3948674798,
-0.0441053659,
-0.3212048709,
0.0737065971,
0.0234102458,
-0.1391796023,
0.0231915545,
0.2737835944,
0.010185577,
0.2499600798,
0.0163397808,
-0.2614889145,
-0.3220078945,
0.1296746731,
-0.0567859076,
-0.0743324161,
-0.0439535752,
0.0210303031,
0.1145610213,
-0.3268062472,
-0.2669492662,
0.1397688687,
0.3172677159,
-0.3323732913,
0.0321766473,
-0.1229315251,
0.0017136335,
-0.4182410836,
0.3721024394,
0.0798884481,
-0.1918759793,
-0.1784035861,
-0.0792948306,
-0.0416532978,
0.2084513307,
0.3483616114,
-0.0795305669,
0.1954371333,
-0.1444524974,
0.0965557247,
0.8378013372,
-0.5005511642,
-0.5021434426,
-0.0146165416,
0.2384307981,
-0.1392712891,
-0.1509952247,
0.2623469234,
0.0058819093,
-0.1259306073,
0.1444756985,
0.2695759535,
0.1040390879,
0.0157508403,
-0.3102186322,
-0.0621166117,
0.2738031149,
0.0696569234,
-0.1182744503,
0.0085188113,
-0.0438958257,
0.091457516,
0.2146678567,
0.033631295,
0.1904894114,
0.2619517744,
0.2920949161,
-0.0155355893,
-0.2956847548,
-0.1719348431,
-0.3159611821,
-0.0912840366,
0.061864689,
-0.2314924598,
-0.1292691827,
-0.0992315114,
-0.04523772,
-0.0977914184,
-0.1335942149,
-0.2724176943,
0.0092170052,
0.0959907547,
0.206470564,
0.3455794454,
-0.4039349258,
0.2961392403,
-0.021274101,
0.1568085551,
-0.2580246925,
0.2367766201,
-0.2979540527,
-0.3281769753,
-0.0922595486,
0.4159758687,
0.0844308063,
-0.0270584114,
-0.0332020558,
0.3201881945,
-0.3143808842,
-0.1482997984,
0.1487740129,
-0.0822434872,
0.2005811632,
-0.4163780808,
-0.0879671201,
0.0980148315,
0.1353492886,
0.0386446491,
-0.3379628956,
0.1252186745,
0.1524576396,
0.1615886688,
0.2309929132,
0.2231131047,
0.2986207008,
-0.0681876838,
-0.0921692178,
-0.0746713579,
0.2888092697,
-0.1447556913,
0.4032862782,
-0.265285939,
-0.3689488769,
0.1862707436,
0.5735304952,
0.1868014038,
0.5875403285,
0.2163975835,
-0.1049686521,
-0.0852609575,
-0.0449209586,
0.469073534,
0.6300860047,
0.1020104736,
0.031012293,
-0.1670398116,
-0.4748410881,
-0.2145324051,
0.2900677323,
-0.041559197,
0.1297398508,
0.1205123067,
0.1305898577,
0.0523799695,
-0.383654654,
-0.048610732,
-0.1777879745,
0.0709196329,
-0.0688032359,
-0.0428559706,
-0.396068275,
-0.192987591,
-0.1187580824,
-0.3759918511,
-0.2696288824,
-0.3636482954,
-0.0481493361,
0.167381227,
0.059665218,
-0.0209343806,
-0.3365979791,
0.3685641885,
-0.378170073,
-0.0989807919,
-0.3116407394,
-0.1323968768,
-0.2740914226,
-0.0800352246,
0.1156493649,
0.132353738,
0.483815372,
-0.1280394942,
0.0651177615,
-0.1661773324,
-0.1233084723,
0.042687621,
-0.1347209215,
0.2439866662,
0.061474137,
0.1860229373,
-0.3289710581,
-0.2357398868,
0.3705384135,
-0.177791357,
-0.1927278042,
0.0620924011,
0.0706932619,
-0.3187223673,
-0.2763974667,
-0.2736292779,
-0.4865005016,
-0.4724041522,
0.4291748405,
0.0022043884,
0.2098551393,
0.1633350253,
-0.1868856996,
0.3642728925,
-0.0969285443,
0.3241605759,
-0.0344821997,
-0.1116037667,
0.4596473575,
-0.457300663,
-0.5065929294,
0.0659338534,
0.1178519353,
0.3186714649,
-0.0784474984,
-0.4767915308,
-0.0632542819,
-0.3380720913,
0.1348124892,
-0.0090775564,
0.195604682,
0.3882960081,
0.0982561707,
0.0456519052,
-0.1662943661,
-0.400844276,
0.3753772974,
0.1130126193,
0.3920339346,
-0.1350151002,
0.1833524555,
-0.0676995814,
0.2905861735,
0.2983125746,
0.0744907111,
0.4115737379,
0.3516533077,
0.471950829,
-0.1254163086,
-0.5036278963,
0.0411286168,
0.0504853204,
0.1892405003,
0.0555854551,
0.0691165477,
-0.0055978801,
-0.1221469343,
-0.0195504129,
-0.2420723438,
-0.2064189613,
-0.0650286004,
-0.0861939117,
0.1582930237,
0.0334025621,
0.0539824069,
0.0585004538,
0.0301901661,
-0.071504578,
0.5122175813,
-0.0429130644,
0.2273360789,
-0.559230864,
-0.4526758194,
-0.7059276104,
0.1493567526,
-0.2508862615,
0.4395966232,
-0.1966254711,
0.1666858494,
0.1066768914,
-0.099489145,
0.3977797329,
0.0314256996,
-0.2973349094,
0.1747864634,
0.1483097225,
-0.6878679991,
0.0245518461,
-0.0301461369,
0.0808390826,
0.1477580667,
0.2117551267,
-0.3747902811,
-0.2322302759,
0.0214671455,
0.0772846341,
-0.1316453218,
0.0094013102,
-0.164786756,
-0.4430541098,
-0.2546954751,
-0.0852767453,
-0.0267473049,
0.1087700725,
0.0268113837,
-0.006970644,
-0.2921900451,
-0.1547454298,
-0.0299634524,
-0.1773817092,
0.4007045627,
-0.1765722036,
0.016297102,
-0.1245776415,
0.2342349589,
0.0951602608,
0.3477663398,
-0.2530573905,
-0.0623921156,
0.1805556118,
-0.0157290809,
0.4948551953,
0.0750762895,
-0.0007126033,
-0.1263860464,
-0.1417509913,
-0.1267602742,
-0.0193337761,
-0.1548911035,
0.167999804,
-0.1604671627,
-0.3305188119,
-0.1659292728,
0.4436076283,
0.1086933017,
0.0813200176,
0.3405787945,
0.2611694038,
-0.1473251581,
0.4154501557,
-0.0161064528,
0.8213797808,
0.1217929572,
0.6255540848,
0.1412681043,
-0.3488845825,
0.6622037888,
0.194334805,
0.3420632184,
-0.2074199617,
0.052882228,
0.0063351393,
-0.1722123623,
0.1727354676,
0.0399783216,
-0.5027008653,
0.3164103031,
0.1324864477,
-0.5379623175,
-0.0238861702,
0.1237436309,
-0.1276746392,
-0.2993038893,
-0.4431587458,
0.1207481697,
-0.2031993568,
0.3285003603,
-0.0563370995,
-0.057797119,
-0.3582286835,
-0.125717625,
-0.2703657746,
0.2586685419,
-0.1589597613,
0.6079384685,
0.0830380768,
-0.3375376761,
0.0971333161,
0.2447016537,
0.5418916345,
0.0794952884,
-0.3354736269,
-0.0393900909,
-0.2932342291,
-0.0141564719,
0.0426770262,
0.0584296249,
0.330401808,
-0.1745099127,
0.1848753244,
-0.22771281,
-0.2636814117,
-0.3943446875,
-0.0752213299,
0.0595058948,
0.020982191,
0.0394039191,
-0.2589335442,
0.1749897003,
-0.0322129652,
-0.3993843198,
0.0444162562,
0.3888376951,
0.1419640034,
0.3495689631,
0.0217036009,
-0.0110557005,
0.0761487857,
0.159962371,
-0.0485816523,
-0.136146605,
0.6374307871,
-0.2584862709,
-0.1061137244,
-0.2650417686,
0.0691266209,
-0.0883132815,
-0.353287369,
0.2797166407,
-0.0579955205,
-0.1244319379,
0.0427912213,
0.1967558563,
-0.0023864862,
-0.1989199668,
-0.11433056,
-0.3767807782,
-0.4714047909,
0.1940358132,
0.1989246309,
0.2183165848,
0.2220207006,
0.0166341066,
0.3549530804,
0.0191403534,
-0.1581321955,
-0.1340222955,
-0.23525168,
-0.0110495482,
0.310750097,
-0.0593306571,
0.1848316491,
-0.0213295519,
-0.0679535568,
0.0031086858,
-0.1043021828,
-0.1208003461,
-0.2125731111,
0.1969395131,
0.1923293173,
0.1092872396,
-0.1313859969,
-0.2397309393,
0.0439852513,
-0.1399282962,
-0.4426005483,
0.0799326524,
-0.0888808146,
0.3664726615,
0.0575380661,
0.2464977801,
0.1441571712,
0.0473046824,
-0.0154914148,
-0.0473273955,
0.0799998716,
0.082598649,
-0.0391017646,
-0.0425982028,
-0.4447648823,
-0.0654553473,
-0.1222841367,
0.1487651467,
0.0366331637,
-0.3802487254,
0.0044798106,
-0.0247701649,
0.1926672906,
0.3444615304,
-0.072133705,
0.1401771605,
0.3488788605,
0.0576967224,
-0.1214594543,
-0.1246096939,
0.2949682772,
0.0958723277,
-0.3309557438,
-0.0350717381,
-0.0882492214,
-0.1412103772,
-0.23118788,
0.0485367626,
0.2422637045,
-0.5197969079,
0.2657163143,
0.2405581772,
0.219642818,
0.1018093601,
0.192179352,
0.0511512421,
0.2464907914,
0.2721924186,
-0.2552720308,
0.3102764189,
0.2736406326,
-0.0309979171,
0.3919942379,
0.2052750587,
-0.1697523743,
-0.1164796874,
0.2805294693,
0.211879164,
0.0934592634,
0.485799998,
-0.5065949559,
-0.3788499236,
-0.0686055645,
0.4413185418,
-0.0941785797,
-0.0603896789,
-0.1129990295,
0.1150915921,
0.2628127337,
-0.1528171152,
-0.2369880527,
-0.1597887427,
0.1075637192,
0.2128525823,
0.0177870598,
-0.5339605808,
0.2798568606,
0.1498138458,
0.0744920969,
0.0285123587,
0.2995051444,
-0.1057926118,
-0.2620376945,
-0.4199062586,
0.6136074066,
0.3363558352,
0.4281280637,
0.1993325055,
0.098278299,
-0.0779117793,
0.0065819025,
-0.171457082,
0.2047611177,
0.4125984907,
-0.0030335179,
0.2666644454,
0.0383119732,
0.038198594,
0.0080966493,
0.0181995928,
0.1316436827,
-0.038886711,
0.7803915739,
-0.1914773583,
-0.5222327113,
0.0024653822,
0.3051823378,
-0.4095512629,
0.1250981688,
0.1826252043,
0.0405014977,
0.0143573321,
-0.2799426317,
0.0108900592,
0.1269238293,
0.395319432,
0.2955890894,
0.0432538092,
-0.1514936537,
-0.3134216666,
-0.5588613749,
-0.1181416884,
0.1788042337,
0.1157042459,
-0.0851336792,
0.2500915229,
-0.1083266884,
-0.0275568739,
0.1249396428,
0.2288713753,
0.1746482104,
-0.0397468358,
-0.3313184083,
0.1571528316,
0.1166600063,
0.1270062029,
0.1858628392,
-0.1724364161,
0.1059404388,
0.0519057587,
-0.0609419867,
0.0912807584,
0.2953475118,
0.0733893216,
0.2903581262,
0.3931968808,
0.0932852253,
0.3351486921,
-0.0508645475,
-0.3590811789,
0.1334626675,
-0.0794148445,
-0.1128136665,
0.1632490754,
-0.070265241,
0.3910036385,
-0.493611753,
0.0850492567,
-0.185579747,
0.1490805894,
-0.1309606284,
0.1026668847,
-0.0199990533,
0.1764272749,
0.0601875335,
0.1169021577,
0.2232002616,
0.1393833607,
-0.1337044984,
0.1963241398,
-0.6758764982,
-0.2496197224,
0.3895049095,
-0.2825265527,
-0.1214381754,
0.1087155491,
0.3584265113,
0.1816570014,
-0.0227926709,
-0.5878947973,
-0.39075315,
0.2333760113,
-0.0768754929,
-0.3335980475,
0.3162536621,
-0.3343215585,
-0.0192265064,
0.0485636294,
0.7234618664,
0.1110990047,
-0.3040367067,
0.1088187769,
-0.1862644851
] |
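The exchange in the record above points to a stale `nlp` install shadowing the newer source checkout. A minimal diagnostic sketch, assuming only that the package and its `Dataset`/`DatasetDict` classes are importable from the top level, might look like this:

```python
import nlp

# Show which installed copy of the library is actually being imported,
# and whether that copy already exposes `save_to_disk`.
print("nlp version:", nlp.__version__)
print("loaded from:", nlp.__file__)
print("Dataset.save_to_disk available:", hasattr(nlp.Dataset, "save_to_disk"))
print("DatasetDict.save_to_disk available:", hasattr(nlp.DatasetDict, "save_to_disk"))
```

If `loaded from` points at an old copy in `site-packages`, uninstalling that package before reinstalling from GitHub (as suggested above) is the likely fix, which matches the resolution reported in the next record.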
https://github.com/huggingface/datasets/issues/595 | `Dataset`/`DatasetDict` has no attribute 'save_to_disk' | > Did you uninstall `nlp` before installing from GitHub?
I did not. I created a new environment and installed `nlp` directly from `github` and it worked!
Thanks.
| Hi,
As the title indicates, both `Dataset` and `DatasetDict` classes don't seem to have the `save_to_disk` method. While the file [`arrow_dataset.py`](https://github.com/huggingface/nlp/blob/34bf0b03bfe03e7f77b8fec1cd48f5452c4fc7c1/src/nlp/arrow_dataset.py) in the repo here has the method, the file `arrow_dataset.py` which is saved after `pip install nlp -U` in my `conda` environment DOES NOT contain the `save_to_disk` method. I even tried `pip install git+https://github.com/huggingface/nlp.git ` and still no luck. Do I need to install the library in another way? | 28 | `Dataset`/`DatasetDict` has no attribute 'save_to_disk'
Hi,
As the title indicates, both `Dataset` and `DatasetDict` classes don't seem to have the `save_to_disk` method. While the file [`arrow_dataset.py`](https://github.com/huggingface/nlp/blob/34bf0b03bfe03e7f77b8fec1cd48f5452c4fc7c1/src/nlp/arrow_dataset.py) in the repo here has the method, the file `arrow_dataset.py` which is saved after `pip install nlp -U` in my `conda` environment DOES NOT contain the `save_to_disk` method. I even tried `pip install git+https://github.com/huggingface/nlp.git ` and still no luck. Do I need to install the library in another way?
> Did you uninstall `nlp` before installing from GitHub?
I did not. I created a new environment and installed `nlp` directly from `github` and it worked!
Thanks.
| [
-0.051175978,
0.3495247364,
-0.0176615305,
0.1092848778,
0.1668479741,
0.1748827845,
-0.0380231962,
0.1976731122,
-0.1885309964,
-0.0432849228,
0.1474729776,
0.6693126559,
-0.3041142821,
-0.0753773674,
0.4125317335,
0.0165581256,
0.2039824426,
0.4532090724,
0.0589580797,
-0.2329030782,
-0.1727201045,
0.3684437573,
0.0128147155,
0.0299811754,
-0.1168764755,
-0.0699490607,
-0.2774167359,
-0.1472544074,
0.0213423297,
-0.487875849,
0.6232536435,
0.0256579071,
-0.0995817333,
0.1401851922,
-0.0001252176,
-0.0947314501,
0.2191042155,
-0.0743310452,
-0.4842473567,
-0.1860629171,
-0.1118606925,
-0.4106759429,
0.4256404042,
-0.4007544518,
-0.0077745765,
-0.1093166843,
0.3234488964,
-0.0821148977,
0.2835480869,
0.2425791323,
0.1348460913,
0.0490632653,
0.2647362351,
-0.0296145752,
-0.0352694094,
0.3740504682,
-0.2868793607,
0.1654682606,
-0.0698284209,
-0.0520023331,
0.2786717415,
0.0281591713,
-0.1160390079,
-0.1483510137,
0.2933445573,
0.1480831951,
-0.2648942173,
-0.1451871395,
-0.1403986365,
-0.1053525656,
0.4972682595,
-0.4533621073,
-0.3359110355,
-0.0857649818,
0.1459581703,
-0.3395586014,
0.1564504504,
0.2751292586,
-0.2350641489,
0.0538977236,
-0.1645870954,
-0.4610550702,
-0.2480333596,
0.3096341789,
0.1412262917,
0.022556277,
-0.194624275,
0.10923291,
0.2540321946,
0.0773192495,
0.080960393,
-0.0396721996,
0.1217783093,
0.0294953026,
0.1341828406,
-0.1684677005,
-0.0206552222,
-0.0255479775,
-0.0454424247,
0.3416431248,
0.1140735373,
-0.2621801496,
-0.0336803459,
0.2551233768,
0.0709180608,
0.2513376474,
0.568415761,
0.3965989351,
0.4513339102,
-0.180692479,
0.1973210871,
-0.0392239317,
0.1051935107,
-0.011481747,
0.1841276735,
-0.2710449696,
0.2821147144,
-0.3572695255,
-0.2509008944,
0.0311011821,
0.1265180409,
-0.067160584,
-0.0512968078,
0.2768731415,
0.1140204668,
0.2726818919,
0.1592995077,
0.3604345322,
0.0709906965,
-0.0292261783,
-0.1366820186,
0.0375479273,
-0.0009317771,
0.1527215391,
0.5961232781,
-0.0289045293,
0.4048800468,
-0.0496451519,
-0.329702884,
0.0721498057,
0.0249027684,
-0.142515406,
0.0376487076,
0.2746223509,
-0.0059602633,
0.2406769693,
0.0008028401,
-0.2513995171,
-0.3149657845,
0.1724731326,
-0.0685224831,
-0.0784180388,
-0.0470561571,
0.0120061273,
0.1201018095,
-0.3300992846,
-0.2568698227,
0.1341208816,
0.3234740794,
-0.3195581734,
0.0261892751,
-0.1375034451,
-0.0047141388,
-0.4191740155,
0.3578745723,
0.0753014311,
-0.2224452347,
-0.1512532383,
-0.0840385705,
-0.0440829322,
0.2268166393,
0.3499016464,
-0.1063028947,
0.1825852245,
-0.1158607006,
0.0924045146,
0.8516136408,
-0.4896753132,
-0.4979109168,
-0.016196344,
0.2707596123,
-0.1674657464,
-0.1509116888,
0.2612157464,
0.0241099559,
-0.1258378178,
0.1311419308,
0.2887351215,
0.1377987713,
0.0192628726,
-0.3267796338,
-0.0591939986,
0.3038319647,
0.0636945665,
-0.1093654782,
0.020269582,
-0.048257336,
0.081566155,
0.2218415737,
0.0228145681,
0.190859586,
0.266526401,
0.3088767529,
-0.0222316459,
-0.2593230307,
-0.1783488691,
-0.2849612534,
-0.0821532831,
0.0372204855,
-0.2111757845,
-0.09523312,
-0.0784113929,
-0.0413033217,
-0.0949671268,
-0.1408418864,
-0.2969381809,
-0.0042202808,
0.0681392401,
0.2069658637,
0.3199115396,
-0.407087326,
0.2959429026,
-0.019058302,
0.1253739297,
-0.2283025235,
0.2535345554,
-0.301104039,
-0.3270966411,
-0.0948332548,
0.3961183131,
0.092349261,
-0.021085633,
-0.0583558381,
0.3021885753,
-0.2987954915,
-0.1604404449,
0.1566129327,
-0.1066599563,
0.211446479,
-0.4306337237,
-0.0665081367,
0.0895550549,
0.1336699575,
0.0361180529,
-0.3419331312,
0.1272632182,
0.1734530628,
0.1750141531,
0.2303905487,
0.2305628955,
0.3229707479,
-0.0843194872,
-0.0659547746,
-0.0693054795,
0.2749515474,
-0.1217547804,
0.4015532136,
-0.2916475236,
-0.3590726852,
0.2103358656,
0.5552312732,
0.2151247859,
0.5608871579,
0.2083572149,
-0.1154355407,
-0.1199426651,
-0.0386919454,
0.4832510352,
0.6548287868,
0.0990173817,
0.0429355092,
-0.1476527601,
-0.4620837569,
-0.2071963847,
0.3073486686,
-0.0605249554,
0.1345580816,
0.1047623232,
0.1706266552,
0.0544821322,
-0.3766707778,
-0.0536191948,
-0.1458651721,
0.0812435672,
-0.0552989654,
-0.0335778072,
-0.4216028154,
-0.2106355429,
-0.100279361,
-0.3712032139,
-0.2571747601,
-0.3521024287,
-0.0483966209,
0.1493589133,
0.0526545793,
-0.005059192,
-0.3272662163,
0.3506120443,
-0.3900288045,
-0.073400788,
-0.3050142527,
-0.1633828878,
-0.2843655944,
-0.0878531784,
0.1165301353,
0.1370328665,
0.4658263922,
-0.1015403047,
0.0480971932,
-0.1767475903,
-0.1433371305,
0.0440042131,
-0.106602639,
0.2553693354,
0.0543653071,
0.1757210344,
-0.3232786059,
-0.221773237,
0.3675333858,
-0.1886542588,
-0.1849095225,
0.0674370974,
0.0623549968,
-0.3191285133,
-0.2779294252,
-0.2873333097,
-0.5013769269,
-0.4498513043,
0.3742218614,
0.00613188,
0.2335650772,
0.1620811522,
-0.216931954,
0.3564762175,
-0.080360204,
0.2904996276,
-0.024606226,
-0.115003109,
0.4654689729,
-0.4859731793,
-0.5153086782,
0.0681332201,
0.134435311,
0.3015756607,
-0.0698587894,
-0.4759968519,
-0.0519193597,
-0.3486809731,
0.1344188601,
0.031137757,
0.1858063638,
0.406175524,
0.1082542092,
0.0428352021,
-0.1794449687,
-0.404648453,
0.3869391382,
0.1595668793,
0.4086625278,
-0.123414427,
0.1844182462,
-0.0775085986,
0.3114274442,
0.2893514633,
0.0900779068,
0.4319110215,
0.3623687923,
0.4333247244,
-0.1328257769,
-0.4879173636,
0.0329264477,
0.0417346023,
0.1875581443,
0.0343160927,
0.079615593,
-0.0590604767,
-0.1343891621,
-0.0549029708,
-0.2365453839,
-0.1970096827,
-0.073875308,
-0.0758690313,
0.1651832163,
0.0359007344,
0.0366296992,
0.0521710962,
0.0121738166,
-0.0941133797,
0.5079706907,
-0.0324251726,
0.2057264745,
-0.5988539457,
-0.4365366995,
-0.6885818243,
0.1605244875,
-0.280225426,
0.4618176818,
-0.1839819252,
0.1600233763,
0.1078511849,
-0.118913129,
0.3920047283,
0.0264735278,
-0.2874023616,
0.1963048279,
0.1640448421,
-0.6690200567,
0.0100665912,
-0.0221416503,
0.0774899721,
0.1449767351,
0.1754984856,
-0.3347236514,
-0.2417743355,
0.0204196572,
0.0821195543,
-0.1349853724,
0.0013742559,
-0.1567964256,
-0.4167925119,
-0.2565543354,
-0.0940179825,
-0.0602873228,
0.1025562584,
-0.0005181544,
-0.0125445575,
-0.2849374413,
-0.1403105557,
-0.0548362285,
-0.1625195891,
0.4008567035,
-0.2031633258,
0.0290596262,
-0.1423398852,
0.2161473483,
0.1099636853,
0.3306182325,
-0.2481498122,
-0.0624472909,
0.1909239143,
-0.0227070637,
0.5063280463,
0.0863354877,
-0.0102371834,
-0.1358548403,
-0.1407330632,
-0.1478158385,
-0.0070227291,
-0.1694612503,
0.1523396671,
-0.1671596766,
-0.3186496198,
-0.1829835773,
0.4536819756,
0.1151346937,
0.0693847388,
0.3714611232,
0.2221018225,
-0.1585941166,
0.4316369593,
-0.0246999711,
0.8286110759,
0.1389376819,
0.6248916388,
0.1214364469,
-0.3604223728,
0.666231215,
0.2118098438,
0.3587364852,
-0.2334712297,
0.0666394606,
-0.0030154586,
-0.1960454732,
0.1963263899,
0.0513781197,
-0.4806957245,
0.3287312686,
0.1060487479,
-0.5526171923,
-0.0421291962,
0.1051836908,
-0.0862590075,
-0.2951772809,
-0.4481682479,
0.1107850671,
-0.1988898516,
0.3293922544,
-0.0434379727,
-0.0704305023,
-0.3588097394,
-0.1231827736,
-0.2659943998,
0.2920845449,
-0.1017145291,
0.6084007621,
0.0909377337,
-0.3310433328,
0.1179754585,
0.2404783964,
0.5367373824,
0.0682953149,
-0.345130533,
-0.0286541414,
-0.2723816633,
0.0116611011,
0.0727144778,
0.0613140427,
0.3213415742,
-0.1756272018,
0.1509183049,
-0.2304439694,
-0.2583703399,
-0.3924551904,
-0.064520359,
0.0728891566,
0.0564482436,
0.0609992072,
-0.2666788995,
0.1987737119,
-0.0558424965,
-0.3915694058,
0.0442521833,
0.3813622296,
0.1286257207,
0.3738574386,
0.0004810712,
-0.0010865033,
0.0977271199,
0.1469852477,
-0.033375442,
-0.1118765473,
0.6497094035,
-0.2514716983,
-0.1003413647,
-0.2615315914,
0.0792423561,
-0.1083054468,
-0.3478524685,
0.2586362958,
-0.0585951582,
-0.0978538767,
0.0621515326,
0.1924932152,
-0.0522925109,
-0.182787478,
-0.1236052066,
-0.3982816041,
-0.4713502526,
0.1672694534,
0.1461134553,
0.2080513835,
0.2146082371,
-0.0032130033,
0.368601799,
0.0062743966,
-0.1483166367,
-0.1474383771,
-0.2444082946,
0.0028685089,
0.2821365595,
-0.0443984158,
0.1526606381,
-0.0413819738,
-0.0892751291,
-0.0028669108,
-0.1005370095,
-0.1137331501,
-0.2085820735,
0.2042340785,
0.1988558024,
0.0803738534,
-0.136183247,
-0.2762065828,
0.0311331265,
-0.1081658229,
-0.4312263131,
0.0681488141,
-0.0990016386,
0.3777640164,
0.0549806729,
0.2136467546,
0.133639276,
0.0711663589,
0.0015578493,
-0.0568110757,
0.0871938318,
0.0867157131,
-0.0195385534,
-0.0442281663,
-0.4699700177,
-0.0777489692,
-0.16609326,
0.1743428558,
0.0513617545,
-0.3840720057,
0.0083666593,
-0.0289267134,
0.1852201074,
0.368047297,
-0.048699595,
0.1466161609,
0.3838504255,
0.0455298498,
-0.1440125108,
-0.1348595619,
0.3171976209,
0.102792412,
-0.3474205136,
-0.0502429307,
-0.0638300627,
-0.1537362039,
-0.2240029871,
0.058465369,
0.2263320386,
-0.5590145588,
0.260012418,
0.2483329773,
0.2128689587,
0.1100931168,
0.2052619457,
0.0609769262,
0.2157881409,
0.2888850272,
-0.2264429331,
0.2971999645,
0.3332092762,
-0.0283938088,
0.3937421739,
0.210358724,
-0.1808765531,
-0.1364279687,
0.2831425667,
0.1936684996,
0.0886235088,
0.4431956708,
-0.4910352826,
-0.3830979168,
-0.0617896467,
0.4575688541,
-0.0753065199,
-0.0721779019,
-0.1289484501,
0.1098824292,
0.287389338,
-0.1455774009,
-0.2623183429,
-0.1291572154,
0.1120996922,
0.2135378718,
0.0100191049,
-0.5514191985,
0.2878577113,
0.1470239758,
0.0715339631,
0.0286221057,
0.3166797757,
-0.1223570108,
-0.243540585,
-0.4203832448,
0.6235865951,
0.3259894252,
0.4648932219,
0.1818103641,
0.0967924222,
-0.0837047398,
0.0049347952,
-0.163079232,
0.2107359469,
0.4334669411,
-0.0168519318,
0.2520664632,
0.0186229013,
0.0630383193,
0.0313764885,
0.0300563406,
0.1394675076,
-0.0340470746,
0.7894075513,
-0.1301582456,
-0.5200197697,
0.0189709291,
0.2907424271,
-0.3882833123,
0.1396140456,
0.2263671905,
0.0553826541,
0.0139355697,
-0.2838945985,
0.004855182,
0.1097204611,
0.4014545977,
0.2755794227,
0.0409507528,
-0.1465992182,
-0.2888377011,
-0.5644545555,
-0.1432637423,
0.171091795,
0.1132694855,
-0.0880269483,
0.2399015725,
-0.0980881751,
-0.0342175774,
0.1278227568,
0.23469989,
0.1664561927,
-0.0448133834,
-0.3491310477,
0.1687339395,
0.1297891438,
0.1550361812,
0.1488063335,
-0.1924626231,
0.0946165025,
0.0623644926,
-0.0870191604,
0.1077696681,
0.2719552517,
0.0959562808,
0.3114101291,
0.3811458051,
0.1047640219,
0.3348772824,
-0.0284330957,
-0.3347640634,
0.1108494252,
-0.0951380357,
-0.0980136171,
0.1521931887,
-0.0495535322,
0.3603441119,
-0.4953611791,
0.0990427881,
-0.1936008781,
0.1300022751,
-0.1606474817,
0.0955837145,
-0.0119375326,
0.1552876234,
0.0668631792,
0.1224891692,
0.2099285722,
0.1312452108,
-0.144149065,
0.1677515954,
-0.6890085936,
-0.2295209169,
0.3672011495,
-0.293967545,
-0.1553185731,
0.0923160315,
0.3587691188,
0.1638011038,
-0.0171200465,
-0.5786734223,
-0.4102854431,
0.2423867732,
-0.0847051591,
-0.3228685856,
0.2954092622,
-0.300037086,
0.0040376112,
0.0361745358,
0.6977638006,
0.1210366637,
-0.2942038774,
0.0681874976,
-0.1811478734
] |
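With a clean install from the master branch (the resolution reported above), the round trip through disk should work. The sketch below is illustrative only: it assumes the source install exposes `save_to_disk`, as the repository's `arrow_dataset.py` indicates, and a companion `Dataset.load_from_disk` loader, which is an assumption rather than something confirmed in this thread.

```python
# Assumed prerequisite, per the discussion above:
#   pip uninstall -y nlp
#   pip install git+https://github.com/huggingface/nlp.git
import nlp

# Round-trip a small split through a local directory.
dataset = nlp.load_dataset("squad", split="validation")
dataset.save_to_disk("./squad_validation")  # present on master, per the issue above

# `load_from_disk` is assumed here to be the matching loader on master.
reloaded = nlp.Dataset.load_from_disk("./squad_validation")
print(len(reloaded), reloaded.column_names)
```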
https://github.com/huggingface/datasets/issues/590 | The process cannot access the file because it is being used by another process (windows) | Hi, which version of `nlp` are you using?
By the way, we'll be releasing a significant update today that fixes many issues (but also includes a few breaking changes).
You can see more information here: #545 and try it by installing from source from the master branch. | Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
``` | 46 | The process cannot access the file because it is being used by another process (windows)
Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
```
Hi, which version of `nlp` are you using?
By the way, we'll be releasing a significant update today that fixes many issues (but also includes a few breaking changes).
You can see more information here: #545 and try it by installing from source from the master branch. | [
-0.1765668839,
0.170540601,
-0.0338352248,
0.1972524524,
0.2760625482,
0.0469865352,
0.2461638302,
0.3142569661,
0.1317558289,
0.1372043341,
-0.0612206236,
0.529302299,
-0.3115151823,
-0.0680440813,
0.0767635852,
0.0142780319,
-0.0274666771,
0.122323215,
0.0646730959,
0.1522337645,
-0.4235681295,
0.0898190066,
-0.3216854036,
0.3932021856,
-0.2467509508,
-0.2833741903,
-0.0073879808,
0.4002690315,
-0.1476263404,
-0.3065513074,
-0.1500898302,
-0.1016813666,
0.1635935903,
0.3397302032,
-0.0001092507,
0.1117592007,
0.1657466292,
0.0723256171,
-0.0584405363,
-0.1861774325,
-0.0081052557,
-0.4370205402,
-0.0492739603,
-0.0949191749,
0.250993371,
-0.0299175605,
0.2664937079,
-0.2296125889,
0.3667315543,
0.4969293475,
0.268247366,
0.2371562421,
0.1719655246,
-0.082258746,
0.4717099965,
-0.0912237763,
-0.0806376785,
0.5170612931,
0.2832084,
-0.3384491205,
-0.0735352784,
0.1981331408,
-0.1567199677,
-0.2741838098,
-0.0703399852,
-0.1646666229,
0.3293886185,
-0.5745836496,
0.3498038948,
-0.0944102556,
0.1831559092,
-0.0103390357,
-0.112781018,
-0.128883332,
-0.0901361555,
-0.1023519263,
0.2645033896,
0.4485737383,
-0.2464573085,
0.1231352612,
-0.1331924796,
-0.1977986395,
-0.2744648457,
0.0432029441,
0.4913314581,
0.1742408723,
-0.0321539119,
0.2832347751,
0.3321834803,
0.0047875941,
0.1627506763,
0.1906651556,
0.0822516233,
0.2991726398,
-0.2648767829,
0.2358038723,
-0.1145856753,
0.3475695252,
-0.2456607968,
0.2115575969,
-0.0407707617,
-0.2607629895,
0.2219273597,
0.3335287571,
-0.096291624,
0.3657916784,
-0.1115012169,
0.0800432563,
0.3007518351,
-0.0706380755,
-0.1508823037,
0.0309133343,
-0.512748003,
-0.7366865873,
-0.3002049923,
0.2035432309,
0.3316034973,
0.0195167139,
-0.2784527838,
-0.210852325,
-0.1304582804,
0.1237713844,
0.2527555227,
0.3760262132,
-0.0118372738,
0.1665928364,
0.2711791396,
0.2126280069,
-0.1514803469,
0.1475892216,
0.0255004168,
0.2315756083,
-0.8735946417,
-0.0115484707,
0.1191696227,
-0.0363753624,
0.0999687761,
-0.0447638519,
-0.2247332931,
-0.2866721749,
0.2228052318,
-0.1009205803,
-0.0311278738,
0.1471239626,
0.0334278494,
0.0902858526,
0.1703852713,
-0.2340902984,
0.0005170852,
0.2663550973,
-0.4707209468,
-0.3566275537,
-0.1959465891,
0.1416383684,
-0.0774975196,
-0.0170256849,
0.5023505092,
-0.042513717,
0.2484273165,
-0.3122110665,
-0.0222018994,
-0.006507617,
-0.3388593793,
-0.3757829964,
-0.175288111,
0.1712941229,
-0.4227337539,
0.0585827902,
-0.0057380721,
-0.1998028755,
0.2495298088,
0.424210459,
-0.2043908685,
0.3054889441,
-0.065872401,
0.0704582036,
0.8549775481,
-0.1990094185,
-0.608281374,
0.3079840541,
-0.5397827625,
-0.3694061637,
0.1779019237,
0.245678097,
0.2107002437,
-0.0261023361,
0.3786219954,
0.0901166499,
0.2324511856,
-0.0657922626,
-0.2402688414,
-0.1774431765,
0.2858873308,
0.0218282863,
-0.1026389375,
0.2008702159,
0.2385696322,
0.2402629256,
0.3049709797,
-0.10665676,
0.3304069638,
0.2181222886,
0.2690803111,
0.3724411726,
-0.1290143728,
-0.0760852098,
0.1096111238,
0.1873377264,
-0.2275868356,
0.0113489479,
0.0424693301,
-0.1055791304,
-0.1242936552,
0.1277855635,
-0.4303869307,
-0.3422091603,
0.173395738,
0.138312906,
-0.0282928981,
-0.074161604,
-0.0046197325,
0.0543020368,
-0.0665287375,
-0.0914481953,
-0.3181313872,
0.1229489818,
-0.4717281163,
-0.1494161338,
-0.1340329945,
0.3342448771,
0.1710793972,
0.1060620397,
-0.167899251,
0.1558743715,
0.0076920353,
0.0440262966,
0.1992293149,
-0.0338904932,
-0.0364172757,
-0.1528819799,
0.2365458459,
0.0526505448,
0.0528611243,
0.1382521391,
0.1897076815,
0.1646212339,
0.0175440703,
0.2983321548,
0.0722367316,
0.4031209648,
0.0345553756,
-0.0202593356,
0.0355993807,
0.1939071268,
0.3396205902,
0.0514371432,
-0.0347824022,
0.0088162534,
-0.0511058047,
-0.0156217963,
0.5277144909,
0.1092472523,
-0.0153417401,
0.0442286544,
-0.110706538,
-0.0175627545,
0.0518274754,
0.3424904943,
0.4299670756,
0.0601910278,
0.0533260778,
0.1188571528,
-0.0037310831,
-0.0990176201,
0.0872749165,
0.1116728336,
0.3432449102,
0.1183099076,
0.154489249,
0.2098273486,
-0.1425982714,
-0.3123350143,
0.1767476797,
0.4526072443,
-0.2081016302,
-0.0024069175,
-0.2591178417,
-0.3028214574,
-0.357477963,
0.0808269829,
-0.2272002399,
-0.0903565735,
0.0413532332,
0.2309692651,
0.0392376184,
0.2571948171,
-0.196036309,
0.0636121258,
-0.107097216,
-0.0212111603,
0.1065062135,
-0.2841465175,
-0.3487018347,
0.073253192,
0.4635370672,
-0.0886284485,
0.3015561402,
0.0722926557,
0.1173652411,
-0.1981844902,
-0.1850032806,
-0.1156250611,
-0.0971712247,
0.0256878939,
-0.0124273114,
0.1032664031,
0.0239476264,
-0.4096835256,
0.3573966622,
-0.360668242,
-0.2367312014,
0.0684053302,
-0.1406069249,
-0.3221733868,
-0.2491959929,
-0.3021600544,
-0.3738675416,
-0.5032863617,
0.4350881875,
-0.1762312949,
0.2365989685,
0.210287571,
-0.0092039518,
0.406984508,
0.0855108798,
0.1652593017,
-0.0485856049,
0.1821393967,
-0.0066835899,
-0.2967911959,
-0.3691343665,
0.1906415671,
0.185684666,
0.0145067945,
0.1041684374,
-0.2383522093,
-0.1529035866,
-0.3123940527,
-0.0752278864,
-0.0741516054,
-0.0384571701,
0.4369110167,
0.165457651,
-0.1200054884,
-0.0298600458,
-0.2360510677,
0.1802695096,
0.0876835585,
0.1502384841,
0.1737026572,
0.5097602606,
0.2883395255,
0.2158127427,
0.3241435289,
0.1112615019,
0.1727728099,
-0.2193649858,
0.1895135492,
0.0585500821,
-0.3553049564,
0.3438937068,
-0.095479317,
-0.2766894698,
0.2590983808,
-0.1013752595,
-0.2876359224,
-0.2063463479,
-0.0922213793,
-0.3289087713,
-0.1647433043,
0.0548559651,
0.1095769405,
0.2776191533,
0.0203297213,
-0.1122777089,
-0.1518609822,
-0.3898850083,
0.0563464575,
0.3294501901,
-0.111793533,
0.1413696408,
-0.3306562304,
-0.3258555532,
-0.4211723208,
0.4124718308,
0.1170854568,
0.5280156136,
-0.380548954,
-0.2255395949,
0.3277384043,
0.0289471187,
0.4310584068,
-0.0674411133,
0.0125314016,
0.0216138363,
0.2006272972,
-0.2590778768,
-0.0499866828,
-0.2014169395,
0.5041893125,
0.5344510674,
0.3871465325,
0.1936192065,
-0.1236495227,
0.0587882437,
-0.0701737627,
-0.2266269624,
0.0727825463,
-0.3008034229,
-0.2470258623,
-0.6267358065,
0.0150925592,
0.065895915,
0.3500671089,
0.1190824136,
0.1042062119,
0.0870115012,
0.1025858447,
0.0400873572,
0.0641186088,
0.2440540791,
0.1596347392,
-0.0238585137,
0.0440104716,
0.458768189,
-0.1278715283,
0.3438457251,
-0.1523287594,
-0.0505897291,
0.2836278975,
-0.0745736808,
0.4772110283,
0.4196676016,
-0.1903259605,
0.120234102,
-0.0999223962,
-0.1097334623,
-0.2003022134,
0.0211650878,
0.4794916511,
-0.1645487994,
-0.169609949,
-0.138150841,
0.0144468527,
0.0097681731,
-0.1500310302,
0.0664378554,
-0.2481897175,
-0.1457013339,
0.3004772067,
0.0781546459,
1.1246333122,
-0.1336108744,
0.2934189439,
0.0346854068,
-0.1152115464,
0.3822294772,
0.0316155925,
0.4024689794,
-0.3729639947,
-0.0356485732,
0.0449334532,
-0.1520716995,
-0.1320188642,
0.1621407121,
-0.0202920139,
0.047232572,
0.3583297729,
-0.0815220326,
0.1023364961,
0.1004133672,
0.2062202245,
-0.0612241775,
-0.1537361741,
0.1076200306,
-0.052532196,
0.4005292058,
-0.1503711194,
-0.1527943313,
-0.1039495319,
0.1493448168,
-0.3186147511,
0.2342416942,
-0.5883259177,
0.2398847193,
-0.2838946879,
-0.203596279,
0.1377361864,
0.5363993049,
0.0978983343,
0.1275991648,
-0.1082235724,
0.0722133294,
0.2377337217,
-0.2076977044,
0.1523756832,
0.1767380089,
0.169297114,
-0.2880901396,
-0.2455841154,
-0.0238140635,
-0.0056499578,
-0.2746349871,
-0.1823807955,
0.2214823067,
0.1758715957,
0.0375196487,
-0.2288471162,
-0.1714286059,
-0.2919130325,
-0.1695639789,
0.1304813623,
-0.1744418591,
-0.1661675721,
0.0666682869,
-0.2520960569,
-0.5136221647,
0.2282321155,
0.2680214643,
-0.2405700684,
0.0433933139,
0.5279791951,
0.029118605,
-0.2210386395,
-0.2394233048,
0.0924204439,
-0.3640983105,
-0.5032576919,
-0.0143475868,
0.083526656,
0.1502232105,
0.1606291533,
0.1007514894,
0.2525136173,
0.0363743305,
-0.1265899241,
-0.6852416396,
-0.4013647735,
0.0656943023,
-0.151758194,
0.0423025899,
-0.1014606804,
0.1632277369,
0.1880224049,
-0.2616633475,
-0.3049159646,
0.1068876609,
-0.3397384286,
0.0690585524,
0.2392172068,
-0.3161025047,
0.0863335729,
0.1420860887,
-0.0048737079,
0.106087856,
-0.09255521,
-0.2245420367,
-0.0560304224,
0.1410080045,
0.1974922717,
-0.0461966321,
-0.0985008925,
-0.4103366137,
0.0212622434,
0.0283583552,
-0.081216529,
0.1416976303,
-0.0814266205,
0.0727272332,
-0.1813591719,
0.0592009537,
0.1271413863,
0.2357470393,
-0.2389319837,
0.0665899515,
0.218126148,
0.1345452815,
0.2189330906,
0.0052001029,
-0.3545105457,
-0.1517056823,
0.1742686629,
0.19528234,
-0.224966079,
-0.2974478304,
0.0157679915,
-0.1268917918,
0.2684792876,
0.3564196229,
0.0442749858,
0.1074712947,
0.4834799469,
0.1019811034,
-0.2300042808,
0.1293837279,
0.2237644494,
-0.4923120141,
-0.2539443672,
0.1021160707,
0.1005869284,
0.0964757353,
0.0691591129,
0.0250118151,
-0.1789846867,
-0.2635630369,
-0.0562897287,
0.3038933873,
0.0431197658,
0.0624703914,
0.1846092045,
0.1837250292,
0.2011341006,
0.3358921707,
-0.1404509097,
0.1026415378,
0.1053098962,
-0.0080891158,
0.1069148406,
-0.3594929576,
-0.2406989932,
0.0367144533,
0.0672632009,
-0.0023504831,
-0.288055867,
0.7975199223,
-0.2712240815,
-0.1419472098,
0.0676134154,
0.3134135008,
-0.2785671055,
-0.1933803558,
-0.3274289966,
-0.1071231216,
-0.326384902,
0.0946330726,
-0.0282576289,
0.2516312897,
-0.1173731834,
0.0587299988,
-0.169963941,
-0.1321098804,
-0.2452429831,
0.3337425888,
-0.133959949,
-0.2808952332,
0.1638757735,
-0.2397955358,
0.2477650046,
0.0722719431,
0.1565957516,
0.401740104,
0.3074326217,
-0.3477717042,
-0.0361199118,
-0.1223837733,
0.0071288571,
-0.2038351744,
0.4214954376,
0.4806914032,
0.2369433343,
0.2757431865,
0.0185548253,
-0.1177505925,
0.1282255501,
-0.0405921601,
-0.1940853596,
0.2781762183,
0.3976896107,
-0.018102726,
-0.4306954741,
-0.1541191638,
-0.0883548707,
-0.2297394723,
-0.0503089949,
0.3019016683,
-0.0576788038,
0.1849602908,
-0.2418808341,
0.0751062706,
-0.0019746106,
0.4621952176,
0.4583956897,
0.119091332,
-0.1391512603,
-0.3664032817,
-0.6616659164,
0.4916215539,
-0.1032011062,
-0.095077157,
-0.1530800462,
-0.0902869925,
-0.384508729,
-0.0181291215,
0.2054775655,
-0.2124763131,
0.1697884947,
0.2388367057,
-0.1559884846,
0.1548923701,
0.2879475951,
-0.0596502274,
0.17048195,
-0.4716246128,
0.0754613876,
-0.1548442841,
-0.0108605847,
-0.1652823836,
-0.0244550109,
-0.309174329,
0.0255906638,
0.487398833,
0.1285871416,
0.4149319232,
-0.087980561,
-0.0702043027,
-0.6123091578,
-0.176034525,
-0.20889543,
-0.1943846047,
0.1757497936,
0.2102413923,
-0.1574781388,
0.3392834365,
-0.244216606,
0.1438350081,
-0.0030858219,
0.0850604326,
0.0815798044,
0.040526025,
0.1516460776,
0.0587925874,
0.418905437,
0.122527726,
0.0522084311,
0.0806522667,
-0.6217883825,
-0.1854169518,
0.5325635672,
-0.5758375525,
-0.4868764281,
0.4769593179,
0.211981535,
-0.05004327,
0.0520828366,
-0.3509194851,
-0.0963340402,
0.1097565815,
0.0829490423,
-0.3734716475,
0.1202868968,
-0.4446397424,
-0.101197578,
0.1041451916,
0.0656334162,
-0.0175138041,
-0.19921422,
0.0896269679,
-0.2572650611
] |
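The traceback in the record above fails inside `Dataset.map` while moving its temporary Arrow cache file into place. A self-contained script that exercises the same code path, with a trivial placeholder standing in for the reporter's `convert_to_features`, might look like this:

```python
from nlp import load_dataset

def convert_to_features(batch):
    # Placeholder transform: return the batch unchanged so that only the
    # cache-file writing path used by `map` is exercised.
    return batch

train_dataset = load_dataset("squad", split="train")
train_dataset = train_dataset.map(convert_to_features, batched=True)
print(train_dataset)
```

This is roughly the kind of minimal reproduction the maintainer asks for in the next record.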
https://github.com/huggingface/datasets/issues/590 | The process cannot access the file because it is being used by another process (windows) | Ok, it's probably fixed on master. Otherwise, if you can give me a fully self-contained example to reproduce the error, I can try to investigate. | Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
``` | 25 | The process cannot access the file because it is being used by another process (windows)
Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
```
Ok, it's probably fixed on master. Otherwise, if you can give me a fully self-contained example to reproduce the error, I can try to investigate. | [
-0.2114658356,
0.0665601194,
-0.0459682867,
0.2789509296,
0.3627561629,
0.1312586069,
0.3703977168,
0.2482227832,
0.2258246839,
0.2196415514,
-0.1714619696,
0.2673841715,
-0.1824050993,
-0.1288230121,
-0.1086881906,
0.0654451549,
-0.0072368346,
0.0088254176,
0.0774261355,
0.2269591689,
-0.4474640787,
0.0641372427,
-0.3092193604,
0.4365320206,
-0.3265937865,
-0.2484489381,
0.0702649876,
0.4150612056,
-0.0526371002,
-0.2850815058,
-0.1506449282,
-0.0810045376,
0.1276623309,
0.553619206,
-0.0001143735,
0.2240378708,
0.0947395712,
0.0214059278,
0.0279273205,
-0.1374548525,
-0.087502256,
-0.2366850674,
-0.1152373552,
-0.0333840102,
0.164927572,
0.0310008749,
0.2015662044,
-0.3285354078,
0.2990382016,
0.4722038507,
0.2349823862,
0.2076994777,
0.1914321929,
-0.2376168221,
0.5633090138,
-0.1176090837,
-0.0370916761,
0.6386205554,
0.4471786618,
-0.2424456477,
-0.164786011,
0.177320376,
-0.2169589102,
-0.176890552,
-0.0892447382,
-0.26450333,
0.4385123551,
-0.6413012147,
0.4716545939,
-0.0395032875,
0.2375400215,
-0.0037548239,
-0.14931041,
-0.0266972855,
-0.1550510675,
0.1002502665,
0.2798582613,
0.4880426526,
-0.2075368166,
0.2474028617,
-0.1368507892,
-0.0809998363,
-0.2716525793,
-0.0632066876,
0.3949955106,
0.0777705014,
-0.0472149029,
0.2620339692,
0.3292510808,
0.0647500157,
0.2178885639,
0.0957397074,
-0.0180255324,
0.2244769931,
-0.3569529653,
0.3265303373,
-0.1940014511,
0.349080801,
-0.2772834897,
0.1277077794,
-0.115076676,
-0.2551476359,
0.2213641107,
0.4073949456,
-0.0367275625,
0.2987487018,
-0.261821568,
0.1614616513,
0.3607692122,
-0.0533831194,
-0.2936811447,
0.0442496091,
-0.5343068838,
-0.8324418068,
-0.1913454533,
0.3128834367,
0.344866842,
0.0278748572,
-0.3136814833,
-0.1413042396,
-0.144143939,
0.1395822018,
0.2897667885,
0.3550517559,
-0.0385161899,
0.1184187531,
0.191598624,
0.1892249435,
-0.069310084,
0.0967920125,
0.0672733635,
0.1931972504,
-0.846601069,
-0.0646452233,
-0.0300681517,
-0.1518510878,
0.0300290883,
0.1117195934,
-0.0966771245,
-0.4035181403,
0.2738668919,
-0.195449844,
-0.0170244798,
0.2679279149,
0.0631163642,
0.0826182365,
0.3132299781,
-0.1889098585,
0.0930424184,
0.3087937832,
-0.5649149418,
-0.4224680066,
-0.1513076425,
0.1013720185,
-0.0662614033,
0.1933199763,
0.5052140951,
-0.1071623936,
0.3107563555,
-0.3346769214,
0.0081837736,
-0.0452090129,
-0.3426702023,
-0.434448719,
-0.1166655868,
0.2212157249,
-0.4432278872,
0.1203270704,
0.123036325,
-0.3053643405,
0.1948787421,
0.3802189827,
-0.153810963,
0.3382058442,
-0.1127020121,
-0.1150260121,
0.5909103751,
-0.1668601334,
-0.6615203023,
0.247511521,
-0.5012634397,
-0.2549395859,
0.3151994944,
0.1936203688,
0.3255034089,
-0.065066196,
0.4324887991,
-0.0485435352,
0.164804399,
-0.0730373487,
-0.1286814511,
-0.1494333148,
0.1266530156,
0.0218282491,
-0.0161911175,
0.2625212371,
0.3569929898,
0.0693525374,
0.2791788578,
-0.1492738575,
0.334237963,
0.2405436337,
0.3433427811,
0.4428648055,
-0.0851450711,
-0.1356278509,
0.1946944743,
0.2756518424,
-0.0956628546,
-0.1632957458,
0.0504534431,
-0.1325707287,
-0.0460985303,
0.21534428,
-0.5356930494,
-0.2269078493,
0.1207042187,
0.1248263121,
-0.0949286669,
-0.0859352574,
0.0596957803,
-0.0047476236,
-0.0532566831,
-0.051214736,
-0.1109486893,
0.0484347492,
-0.4160233438,
-0.1190474778,
-0.2173051685,
0.169472307,
0.204947859,
0.1270709932,
-0.1877917349,
0.1887205988,
0.187229991,
0.1316923201,
0.1440540254,
0.0508290529,
-0.0603660718,
0.0016103014,
0.1975797713,
0.0844227746,
0.0869318545,
0.0391186625,
0.2750763893,
0.0635230467,
-0.0350245126,
0.3883450329,
0.00554768,
0.3557153344,
0.0042984635,
0.0541649088,
0.0657005236,
0.2314147651,
0.2577153742,
0.0834213197,
-0.0048112571,
0.065605104,
0.0971828401,
-0.0874444693,
0.4135710001,
0.0413590744,
-0.0040187854,
0.0519633181,
-0.1366136968,
0.0770382881,
0.0497036129,
0.3433395624,
0.37940678,
-0.0573428348,
0.0114711402,
0.0805657804,
0.1906320453,
-0.0809611529,
0.1559362113,
0.192627281,
0.321872741,
0.2225375772,
0.0483523756,
0.1897526383,
0.0217798501,
-0.2853981555,
0.1454843134,
0.3537412286,
-0.2817570865,
0.0333628058,
-0.1760063469,
0.0325175151,
-0.2888406813,
0.0986630321,
-0.1178459078,
-0.0183151066,
-0.0628001019,
0.3851553798,
-0.0559985712,
0.2293781936,
-0.1767351776,
-0.0484612882,
-0.0998530835,
-0.0012477767,
0.2512575984,
-0.3172087073,
-0.1889256835,
0.0308866426,
0.4870846272,
-0.3395861983,
0.2789708674,
0.0210346058,
0.1365531385,
-0.326413691,
-0.1483819634,
-0.0809009746,
-0.0233136006,
0.0468307659,
0.0199412201,
0.0718652308,
0.1895564049,
-0.381559819,
0.4461348355,
-0.2553525269,
-0.21345222,
0.1011981368,
-0.2133298218,
-0.3726611435,
-0.173494488,
-0.133959204,
-0.3441822827,
-0.4939570427,
0.4266899228,
-0.382851243,
0.1781272292,
-0.0357738212,
0.0552560128,
0.2962763906,
0.1091319621,
0.1583040953,
-0.0177807938,
0.0172441006,
-0.1017856896,
-0.2491124123,
-0.2508816123,
0.2443488091,
0.2196869254,
-0.0482334979,
0.2001387775,
-0.1813922971,
-0.1735417545,
-0.3676159084,
-0.0372533426,
-0.1183129251,
-0.0274097435,
0.3658216,
0.2106414437,
-0.1014409959,
-0.0020053983,
-0.1801066399,
0.1635890752,
-0.0637449101,
-0.0035242289,
0.2608181238,
0.5538354516,
0.2435319275,
0.3106752038,
0.2992293239,
0.2581656277,
0.1126634255,
-0.2037101537,
0.1955658495,
0.0714909434,
-0.3873261809,
0.3228875101,
-0.2160300165,
-0.5179752111,
0.2211581469,
-0.1499303877,
-0.0986212268,
-0.1731103063,
-0.2908780873,
-0.2867600024,
-0.19637537,
0.0446304157,
0.0739590079,
0.3265118599,
-0.1473403871,
-0.1058720201,
-0.0832559541,
-0.2712127864,
0.1070028692,
0.3940207362,
-0.0297027677,
0.0916700065,
-0.2047255635,
-0.2930423319,
-0.258056581,
0.4496951401,
0.1703340113,
0.4296237826,
-0.3886588216,
-0.1273222566,
0.2763598859,
0.0712187588,
0.5536614656,
-0.0467513986,
0.1151601225,
-0.0569405518,
0.0121363103,
-0.1071468741,
-0.1470271796,
-0.1440140903,
0.5119685531,
0.3892016113,
0.3635126352,
0.2263506055,
-0.0852320269,
-0.0016635247,
-0.0544380359,
-0.2536140084,
0.0420831665,
-0.3382838964,
-0.2160534561,
-0.6214622259,
0.0946312398,
0.1319585145,
0.2071798891,
0.1676150709,
0.0907708928,
0.0802083984,
0.1334479898,
0.0519935898,
-0.0032121986,
0.2480489612,
0.0629133582,
0.0978415683,
0.0849565491,
0.3432384431,
0.1036638916,
0.3674911261,
-0.2215963155,
-0.2028967738,
0.2907595038,
-0.164724052,
0.466694504,
0.3905703723,
-0.2132817507,
0.1104504913,
-0.061938908,
-0.1088381708,
-0.2788861394,
-0.1341563463,
0.484084636,
-0.0348858796,
-0.1999313384,
-0.1988428533,
-0.1572068632,
-0.0048601255,
-0.1904499829,
0.1059812903,
-0.233812958,
-0.1743932217,
0.2608867586,
0.1183212548,
1.1062232256,
-0.2257889658,
0.2793616056,
-0.1018453091,
-0.172865659,
0.0449685603,
-0.0019078702,
0.3326907754,
-0.3763325214,
-0.2995592952,
-0.0191269852,
-0.2436328828,
-0.112093538,
0.1470807046,
0.0192082375,
0.0236797743,
0.1171267927,
0.153285414,
0.1730034947,
-0.1391004026,
0.2013518959,
0.0400507562,
0.0612147413,
0.0750313103,
0.0416843519,
0.2896728516,
-0.1694940925,
-0.0740856081,
-0.0342115089,
0.2264030278,
-0.3800480962,
0.1525078416,
-0.7754431963,
0.262568146,
-0.4186421037,
-0.1962833703,
0.094563365,
0.5298977494,
0.1237983257,
0.1372975409,
-0.0150657864,
0.1529945582,
0.3583408296,
-0.1340483576,
0.1258837432,
0.133756876,
0.184899658,
-0.2221435159,
-0.3096384108,
0.0611797795,
-0.0552835278,
-0.2310341001,
-0.2675895393,
0.2978300452,
0.2179478109,
-0.0421867818,
-0.2948171496,
-0.3396803737,
-0.2047765255,
-0.139822647,
0.0923410207,
-0.2002800852,
-0.1480073184,
0.0953224748,
-0.4863421917,
-0.629573524,
0.1926596761,
0.3028707206,
-0.1706923246,
0.0375142395,
0.4928510785,
-0.0302346572,
-0.2388460934,
-0.1716968119,
0.193743825,
-0.3293121755,
-0.48413378,
0.1162543297,
-0.0370856002,
0.1057281941,
0.072047919,
0.0245765075,
0.3253956735,
0.0260177497,
-0.1965754181,
-0.6410690546,
-0.3526349068,
-0.0495194495,
-0.0757590234,
0.0325631201,
-0.2115786523,
0.0357924774,
0.0924601406,
-0.3269481659,
-0.2626306117,
0.1730852276,
-0.2984367907,
0.0239122212,
0.2066276968,
-0.2245716453,
0.0567712523,
0.0920964628,
-0.0277758129,
0.1124163866,
-0.0668173581,
-0.1739187539,
-0.0778119117,
0.1674265414,
0.0890329331,
0.0575485304,
-0.0051387399,
-0.4036239386,
0.0469920337,
-0.0048167314,
0.0552517772,
0.3545491099,
-0.0326882191,
-0.0850430876,
-0.2105119377,
0.0766869187,
-0.0223547816,
0.2580422759,
-0.2664374113,
0.0524811484,
0.1899421662,
0.0978464112,
0.0852970183,
0.0487020686,
-0.2524018884,
-0.1314859837,
0.2411215305,
0.1285729855,
-0.1531635076,
-0.2609266043,
-0.0011970997,
-0.0239944384,
0.3382186592,
0.2167879343,
0.0252137929,
0.1930728853,
0.3364776373,
0.0848471895,
-0.1381898224,
0.0676321238,
0.1782612503,
-0.576857686,
-0.2878249884,
0.1503684521,
0.0809060484,
0.1030317396,
0.1729650199,
0.0152335986,
-0.0828055143,
-0.1475142539,
-0.0355972946,
0.3810206652,
0.1159983054,
0.2030681074,
0.1854526997,
0.0992244035,
0.2369523942,
0.418928057,
-0.1544700861,
0.0909945294,
0.1707578599,
-0.0959752128,
0.2076541781,
-0.4515889883,
-0.0706640929,
-0.0182406977,
-0.0657055676,
-0.0890735313,
-0.3714003861,
0.7061865926,
-0.3409684002,
-0.1721338183,
0.0367110819,
0.2421208322,
-0.3435504735,
-0.1791521013,
-0.1876555085,
-0.1753401309,
-0.276293844,
0.1574504375,
0.0459495112,
0.33765769,
-0.0722652078,
-0.0009432472,
-0.1236012951,
0.0642234311,
-0.375031203,
0.3611931801,
-0.1106320843,
-0.3491289318,
0.0585405752,
-0.0011291318,
0.2037257254,
0.2221875191,
0.1219840199,
0.5111347437,
0.3227024376,
-0.3435428143,
-0.0707838088,
-0.1407614648,
0.0528826118,
-0.159322679,
0.3650750518,
0.3437561393,
0.2447635382,
0.2015526295,
0.0124923773,
-0.1023596823,
0.0914750844,
-0.0523320176,
-0.1686680317,
0.2005509734,
0.3143182993,
-0.0290102288,
-0.5234509706,
-0.0481198505,
-0.1434709877,
-0.1753616184,
-0.0986291021,
0.3696940243,
-0.1777477711,
0.1657353491,
-0.1102026179,
0.0658401698,
0.0460832305,
0.387924701,
0.4450754523,
-0.0128080565,
-0.2195873559,
-0.2608726919,
-0.612372756,
0.5736037493,
-0.0702378303,
0.0302800387,
-0.1296098977,
-0.2331713438,
-0.3260963559,
-0.0178500935,
0.3657418489,
-0.4090033174,
0.1689551473,
0.3255276084,
-0.1080319285,
0.0970970467,
0.2898110747,
-0.1113476902,
0.1933063567,
-0.527130723,
0.164650321,
-0.1570422053,
0.0108297467,
-0.1827662885,
-0.0195545554,
-0.2236521393,
-0.1941231936,
0.5261107683,
0.0835382491,
0.4081569612,
-0.0912901983,
-0.0587923862,
-0.5051494837,
-0.1380487084,
-0.1489104927,
-0.2095572501,
0.1882746965,
0.1756818295,
-0.2250097096,
0.1258350015,
-0.3072803617,
0.2281693816,
0.0165921673,
0.1367893368,
0.0588470176,
0.0183331147,
0.2320975512,
0.0202528499,
0.4795726836,
0.1273061633,
0.1037765965,
0.1305950284,
-0.5822555423,
-0.1587871313,
0.513584733,
-0.6223828197,
-0.524294734,
0.3736461699,
0.2037202865,
-0.1904454231,
0.1150776744,
-0.2549937367,
-0.0020232797,
0.1618898809,
0.1187796146,
-0.4265724719,
0.0556963943,
-0.3917177916,
0.0160404593,
0.1347406209,
0.0209407024,
-0.0386480242,
-0.2768904865,
0.0921135396,
-0.1791352332
] |
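Because the failure is a transient Windows file lock on the cache file, and re-running the same call is reported to succeed (see the following record), a user-side stopgap is simply to retry the `map` call. This is only a sketch of a workaround, not an official fix; `map_with_retry` is a hypothetical helper name:

```python
import time

def map_with_retry(dataset, function, retries=3, wait_seconds=2.0, **map_kwargs):
    """Retry `dataset.map` a few times when Windows raises PermissionError
    (WinError 32) because another process still holds the cache file."""
    for attempt in range(retries):
        try:
            return dataset.map(function, **map_kwargs)
        except PermissionError:
            if attempt == retries - 1:
                raise
            time.sleep(wait_seconds)

# Example usage (mirrors the call shown in the tracebacks above):
# train_dataset = map_with_retry(train_dataset, convert_to_features, batched=True)
```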
https://github.com/huggingface/datasets/issues/590 | The process cannot access the file because it is being used by another process (windows) | I get the same behavior, on Windows, when `map`ping a function to a loaded dataset.
The error doesn't occur if I re-run the cell a second time though!
I'm on version 1.0.1. | Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
``` | 32 | The process cannot access the file because it is being used by another process (windows)
Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
```
I get the same behavior, on Windows, when `map`ping a function to a loaded dataset.
The error doesn't occur if I re-run the cell a second time though!
I'm on version 1.0.1. | [
-0.14136222,
0.0115138143,
-0.0121004023,
0.0894322097,
0.3479508162,
0.0587961078,
0.4676167369,
0.1985685676,
0.2368674576,
0.2460595518,
-0.2052185088,
0.255439043,
-0.0708041787,
-0.2149479091,
0.0211808123,
0.1801767051,
0.0048984662,
0.0978128016,
0.111659281,
0.2412271351,
-0.5601938963,
0.0767829716,
-0.3426275253,
0.4758162498,
-0.2404765338,
-0.2781504691,
0.0295199677,
0.3297816515,
-0.0508583114,
-0.2468932271,
-0.0585948266,
-0.0388165452,
0.0610767454,
0.5021089911,
-0.0001234325,
0.2076143324,
0.0500869937,
0.018369019,
-0.0317801759,
-0.1070128158,
-0.0684425011,
-0.1512519419,
-0.0919811279,
-0.0677361563,
0.164781332,
0.1098998785,
0.1434138268,
-0.4373571575,
0.322997123,
0.414714992,
0.1281574368,
0.2726636827,
0.1071122661,
-0.1961137056,
0.6379078031,
-0.0293756723,
-0.1111276746,
0.6946661472,
0.5680832863,
-0.4091536403,
-0.134603709,
0.2250448316,
-0.2191308737,
-0.0264013559,
-0.1551884413,
-0.1384026259,
0.435749501,
-0.4799388647,
0.4535286725,
-0.030351121,
0.1233580709,
-0.0895666778,
-0.0924163908,
-0.1571207047,
-0.180388242,
-0.0489245504,
0.3667474389,
0.3262473643,
-0.1456917226,
0.1990916431,
-0.2490427494,
-0.0421518758,
-0.259116292,
-0.0333372094,
0.3093460798,
-0.074569881,
-0.0314112008,
0.3234910071,
0.3730354905,
0.1414939463,
0.2741086185,
0.0841259882,
-0.018320132,
0.2918750942,
-0.2928035557,
0.3861652017,
-0.0639365166,
0.3130856156,
-0.1918385923,
-0.0625620782,
-0.0166271441,
-0.3786754608,
0.187184602,
0.3874211907,
0.0990444943,
0.1986860633,
-0.2627533078,
0.1062399745,
0.3773985505,
-0.2037293613,
-0.3466935456,
-0.0822349936,
-0.397213459,
-0.6542534828,
-0.0959607288,
0.2318629175,
0.3864629269,
0.0201445371,
-0.4447945952,
-0.1302548945,
-0.2066165358,
0.1450044662,
0.2346305251,
0.3613010943,
0.1014285535,
0.025127735,
0.2232878506,
0.1528560221,
-0.1587818265,
0.0670554787,
0.1297106892,
0.0575867668,
-0.8975262046,
-0.0086455178,
0.0262226984,
-0.0616102368,
-0.0616378151,
0.0595625378,
-0.0952668786,
-0.446226418,
0.2453482449,
-0.2969215214,
0.1318943352,
0.3331853747,
-0.0351103507,
0.0989933908,
0.3047440648,
-0.2101067752,
0.0419484079,
0.3383021653,
-0.5428122282,
-0.3435716629,
-0.0544378832,
-0.0100680441,
-0.1112361848,
0.2478314787,
0.4594936669,
-0.088102445,
0.2433750778,
-0.5887205005,
-0.0351887867,
-0.110090442,
-0.4130097926,
-0.4058869779,
-0.1700188518,
0.3269718289,
-0.3458701372,
0.1046345532,
0.0815204009,
-0.2160981894,
0.1933882833,
0.3409868479,
-0.2419814914,
0.4212696254,
-0.2040449977,
-0.0338631049,
0.5636239052,
-0.1880697161,
-0.7522539496,
0.283264637,
-0.4704799056,
-0.1629838347,
0.1556882709,
0.2498374432,
0.3320498765,
-0.0803198218,
0.4212531447,
-0.0354684852,
0.1602880359,
-0.0513004735,
-0.0560668521,
-0.0933916047,
0.2097370028,
-0.1015671417,
-0.0983510092,
0.3848282695,
0.5134422183,
0.0934620053,
0.289984405,
-0.1401345432,
0.2673079371,
0.0757897347,
0.2811132967,
0.4180847108,
-0.1146785989,
-0.1326086521,
0.0769592375,
0.2450481355,
-0.0843715221,
-0.2340931892,
-0.0087466687,
-0.2115938067,
-0.0939519182,
0.2580210865,
-0.491674304,
-0.1750736237,
0.0038237385,
0.1637929231,
-0.0698044449,
-0.0209142901,
0.0334408581,
0.0008904524,
-0.1489133835,
-0.0832628012,
-0.0076969024,
0.0859412625,
-0.2953229249,
-0.1756305248,
-0.2129374743,
0.1347343326,
0.1840853542,
0.1390434802,
-0.1797683686,
0.2243452519,
0.3209835291,
0.2022687644,
0.0562612005,
0.0483096913,
-0.0491745546,
0.1920032352,
0.1274476498,
-0.0017578434,
0.1362548023,
-0.0406482518,
0.1508337259,
0.1049962342,
0.0537456721,
0.3756719828,
-0.039480105,
0.2637694776,
-0.0308439098,
0.0898997188,
-0.028479334,
0.2370774597,
0.1620221734,
0.0600726157,
-0.0317353643,
0.0925723985,
0.0218157247,
-0.1094776541,
0.4865915179,
-0.0019061193,
0.1156214178,
-0.0405835807,
-0.1285411417,
0.0856794044,
0.1505116224,
0.350905031,
0.4571463466,
-0.1010678113,
-0.0190004427,
0.0774335191,
0.0760465264,
-0.0636860728,
0.0982598364,
0.1598390043,
0.3917522728,
0.1816731095,
0.0771105587,
0.1335734129,
-0.1835931838,
-0.2560079694,
0.2024310231,
0.3051228821,
-0.3449658155,
0.0733581781,
-0.1770000607,
0.0318622962,
-0.316849947,
0.0179167576,
-0.0927900001,
-0.0392646492,
-0.1229692698,
0.5164972544,
-0.0034539029,
0.2352935672,
-0.2392578721,
0.066959314,
-0.1224462315,
-0.0871037543,
0.2209887952,
-0.256757468,
-0.2006317675,
-0.0195786953,
0.4293871522,
-0.3859558105,
0.3175409436,
-0.0251155384,
0.2133797109,
-0.2417890429,
-0.1630088091,
-0.0314557813,
0.0066922102,
0.0758859888,
-0.0006340332,
0.1624709666,
0.0390706211,
-0.228588298,
0.380076617,
-0.3706244528,
-0.2118951082,
0.1213599592,
-0.1508994251,
-0.3524762988,
-0.1128025353,
-0.2050598115,
-0.2362648994,
-0.3290402293,
0.5150361061,
-0.5015125871,
0.1188742518,
0.0526284389,
0.1022956446,
0.3818595111,
0.0674311519,
0.146216929,
-0.0838082582,
0.0460401773,
-0.0927364379,
-0.2266791165,
-0.1957885474,
0.1894261092,
0.322817713,
-0.1510991156,
0.4219773412,
-0.2286885381,
-0.2187833637,
-0.293579489,
-0.0576852746,
-0.0539280176,
-0.0149468873,
0.3952733874,
0.219396323,
-0.0485513918,
-0.0531063154,
-0.2103043795,
0.081593141,
-0.0166882426,
-0.0519203693,
0.3699100316,
0.5677778721,
0.2121953368,
0.2237596512,
0.3477528095,
0.2361708581,
0.1962561458,
-0.2543724775,
0.3022276461,
-0.0147837698,
-0.3478396237,
0.330124259,
-0.2493269593,
-0.5216109157,
0.2343191952,
-0.1332304925,
-0.1404320896,
-0.0753042847,
-0.2685906887,
-0.3589951992,
-0.2149878442,
0.1313046366,
0.0430365801,
0.3650547862,
-0.0159004703,
-0.0638614148,
-0.0945854858,
-0.2662542462,
0.0111897774,
0.2958979607,
0.2029912919,
0.1099671647,
-0.1891302168,
-0.3375431597,
-0.3593936563,
0.4088296294,
0.1948035359,
0.4997757673,
-0.386269033,
-0.0939890295,
0.3094779551,
0.0615850426,
0.5285872221,
-0.082569845,
0.134960264,
-0.0322110206,
0.0120383427,
-0.1825373173,
-0.0618380308,
-0.1464899778,
0.5905238986,
0.3163485527,
0.4090489745,
0.3880319595,
0.0856933519,
-0.0564418659,
0.0261865929,
-0.1849886328,
-0.0189294033,
-0.3187393546,
-0.3213718534,
-0.6887791753,
0.1587913334,
0.1606619954,
0.0763269141,
0.1857758462,
0.0524580553,
-0.022572916,
0.1010531709,
0.1076678634,
0.1295808554,
0.2206731439,
0.0387902185,
0.0728813708,
0.1941771656,
0.2760865986,
0.1475390643,
0.4105928838,
-0.2495785654,
-0.1668434739,
0.2950472534,
-0.1544923037,
0.4194768667,
0.5211333036,
-0.1932346672,
0.1478033066,
-0.1029861867,
-0.2066301703,
-0.34868595,
-0.2006396949,
0.5689412951,
0.0225590579,
-0.1061345786,
-0.2542788386,
-0.1583707184,
0.0197268873,
-0.3631270528,
0.1986816525,
-0.2120747715,
-0.1715661287,
0.3051836491,
0.0270488802,
1.1691451073,
-0.2686496973,
0.1820838004,
-0.0748525783,
-0.111144416,
0.0132018179,
-0.0046124235,
0.3997063339,
-0.2680261731,
-0.3571720719,
-0.0744403601,
-0.3069594204,
0.0026622377,
0.2075480223,
0.1080098897,
0.0051190555,
0.1454627216,
0.2447901964,
0.1421998143,
-0.1945205629,
0.2674925029,
0.0287273116,
-0.0151184211,
0.0028055944,
0.0803770721,
0.3495512307,
-0.1559371054,
0.1015476584,
-0.0940512419,
0.1380603313,
-0.2798286676,
0.184441328,
-0.7715756297,
0.2749502063,
-0.509480834,
-0.0246035606,
0.0325671807,
0.6402310729,
0.0685770661,
0.0868127421,
0.0277770888,
0.1095400304,
0.491410315,
-0.0713761598,
0.0799369216,
0.0126028508,
0.1288146079,
-0.1386253536,
-0.3554723859,
0.0933272392,
-0.0085454099,
-0.3653728366,
-0.3121625781,
0.3467278779,
0.2849790752,
-0.101049453,
-0.1500525773,
-0.3407863975,
-0.1899007857,
-0.1435969323,
0.0098659657,
-0.2104018927,
-0.0666451305,
0.3015581965,
-0.5401710272,
-0.6096471548,
0.198138684,
0.3259809911,
-0.1335199475,
0.0663746148,
0.3299747705,
-0.1386146098,
-0.2458220422,
-0.1201137677,
0.2088448405,
-0.2566813231,
-0.4841575623,
0.1436629295,
-0.0594276562,
0.1939187348,
-0.0004123226,
0.149332121,
0.2973938286,
0.0471617207,
-0.244589299,
-0.6123515964,
-0.2343078554,
0.082004942,
0.0850908831,
0.1012050584,
0.012290746,
0.125825271,
0.1343889087,
-0.290114373,
-0.1751638055,
0.2391397357,
-0.1433005482,
0.0820975378,
0.2798567414,
-0.1230388433,
0.1190679073,
-0.0504878536,
-0.1129847392,
0.2349791974,
-0.0249788277,
-0.0788337663,
-0.0615958348,
0.209541887,
-0.0111299409,
0.0855228603,
-0.1009880975,
-0.4917933047,
0.1396083832,
-0.1762099862,
0.0901894718,
0.3931544423,
-0.0345813185,
-0.2588269413,
-0.0482473075,
0.0897352695,
0.1152819395,
0.2991900742,
-0.3786822557,
-0.0274621509,
0.1826840639,
0.1161413789,
0.173587516,
0.1432120651,
-0.1663253605,
-0.0768380016,
0.2709829807,
-0.0188510343,
-0.1418465972,
-0.2272292972,
0.0257440433,
-0.1293252409,
0.3927754164,
0.2528841794,
-0.0798220932,
0.0150176138,
0.2868749201,
-0.0068377694,
-0.1479764581,
-0.0632159635,
0.2872813344,
-0.4038874209,
-0.2592488229,
0.2804381847,
0.1411350071,
0.0427062958,
0.0749325305,
-0.0996243879,
0.0239485651,
-0.1973244846,
-0.1192630082,
0.3578239977,
0.0735375285,
0.2194059193,
0.2079419047,
0.1932835877,
0.2636317909,
0.528095305,
-0.1284554899,
-0.0686656982,
0.1418905109,
0.0015380504,
0.1378089935,
-0.4184563756,
0.0150376651,
0.0566637553,
-0.1932704002,
0.0281064995,
-0.2981097698,
0.6454685926,
-0.3578941524,
-0.1090260297,
-0.0337383337,
0.2910705507,
-0.3422369361,
-0.2771371603,
-0.2313897312,
-0.1520060003,
-0.2557016015,
0.0601682514,
0.0127156684,
0.4008626938,
0.1555401832,
-0.0556649193,
-0.0292531997,
0.086975947,
-0.4900073707,
0.294021368,
-0.037481878,
-0.4188626409,
0.0647213608,
-0.1684446633,
0.3638532758,
0.2634750307,
0.148450464,
0.5313631296,
0.3240064681,
-0.1957424879,
-0.0670815036,
-0.1455583721,
0.0605516732,
-0.2047079504,
0.3779831827,
0.3348794878,
0.2564014494,
0.2689195573,
-0.0570321679,
-0.0056082271,
0.0701850876,
-0.0868095309,
-0.1274067461,
0.1538698077,
0.2339244634,
-0.0588723458,
-0.4490206242,
-0.0687281787,
-0.0442025922,
-0.1037910506,
-0.0279980954,
0.2987700105,
-0.3428550363,
0.2525936365,
-0.0414709747,
-0.0017833225,
0.0940829366,
0.2185762823,
0.3783856928,
-0.0119016655,
-0.2470856607,
-0.3395849466,
-0.7406722903,
0.498664856,
-0.2982269228,
0.0753229409,
-0.1138961017,
-0.2813334167,
-0.2506820261,
0.0077748299,
0.3160146773,
-0.4026605487,
0.0189628527,
0.3225497007,
-0.0602931231,
0.0331288986,
0.2629620731,
-0.0624235496,
0.1254678369,
-0.4483185709,
0.2661892176,
-0.1243148297,
-0.056860216,
-0.2836570144,
0.0359412655,
-0.3244715035,
-0.1425284743,
0.6262719631,
0.2383573353,
0.449377507,
-0.086479485,
-0.0794855803,
-0.5034240484,
-0.1842188686,
-0.0806109011,
-0.188890487,
0.1566800773,
0.0228783041,
-0.2726193368,
0.0991904438,
-0.2502709627,
0.0577516742,
-0.0042231679,
0.3306559026,
0.0701656118,
-0.0507740416,
0.1258134693,
0.1061972231,
0.2523444295,
0.1627786458,
0.1021653861,
0.2257297337,
-0.4350599051,
-0.1813191772,
0.480258286,
-0.6469634175,
-0.5080168843,
0.3463339508,
0.1946894079,
-0.2940408885,
0.1316040158,
-0.3184370697,
-0.0499444827,
0.2210283875,
0.1013966277,
-0.4337934852,
0.1087732613,
-0.4045116305,
-0.0239029676,
0.0889080614,
0.1531780958,
-0.0511759557,
-0.3144826293,
0.0913260728,
-0.2533951998
] |
https://github.com/huggingface/datasets/issues/590 | The process cannot access the file because it is being used by another process (windows) | @saareliad I got the same issue, which troubled me for quite a while. Unfortunately, there are no good answers to this issue online; I tried it on Linux and that's absolutely fine. After hacking the source code, I solved this problem as follows.
In the source code file: arrow_dataset.py -> _map_single(...)
change
```python
if update_data and tmp_file is not None:
    shutil.move(tmp_file.name, cache_file_name)
```
to
```python
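# close the temp file handle first so Windows releases its lock before the move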
tmp_file.close()
if update_data and tmp_file is not None:
    shutil.move(tmp_file.name, cache_file_name)
```
Then it works without needing multiple runs to avoid the permission error.
I know this solution is unusual since it changes the source code. Hopefully, the lib's contributors can have better solutions in the future.
| Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
``` | 111 | The process cannot access the file because it is being used by another process (windows)
Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
```
@saareliad I got the same issue, which troubled me for quite a while. Unfortunately, there are no good answers to this issue online; I tried it on Linux and that's absolutely fine. After hacking the source code, I solved this problem as follows.
In the source code file: arrow_dataset.py -> _map_single(...)
change
```python
if update_data and tmp_file is not None:
    shutil.move(tmp_file.name, cache_file_name)
```
to
```python
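# close the temp file handle first so Windows releases its lock before the move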
tmp_file.close()
if update_data and tmp_file is not None:
    shutil.move(tmp_file.name, cache_file_name)
```
Then it works without needing multiple runs to avoid the permission error.
I know this solution is unusual since it changes the source code. Hopefully, the lib's contributors can have better solutions in the future.
| [
-0.104675293,
0.1074616611,
-0.0233314279,
0.2269623131,
0.3214991689,
0.0007901341,
0.4211473167,
0.1908867806,
0.1951683462,
0.1922735274,
-0.1767426133,
0.360742569,
-0.1743268818,
-0.2670359612,
-0.1233480349,
0.0355834812,
0.0496971793,
0.0130528063,
0.2309348285,
0.2061700672,
-0.3807086945,
-0.0331441239,
-0.329926312,
0.3217108548,
-0.28379035,
-0.3542702496,
0.0100987926,
0.4027676284,
0.029514268,
-0.288077116,
-0.1288164407,
0.0231066905,
0.119694896,
0.5545583963,
-0.0001154609,
0.174819991,
-0.046863474,
0.0502056777,
0.0135351382,
-0.0613387004,
0.0619853735,
-0.2356382906,
-0.0451323874,
-0.1107082814,
0.1963999271,
0.0137781352,
0.235169962,
-0.4139158726,
0.4707051814,
0.4032633901,
0.1879202127,
0.2586318552,
0.1164235249,
-0.0457815714,
0.4986585975,
-0.0801041424,
-0.0995624363,
0.5888599753,
0.4468512833,
-0.2438418567,
-0.1711248308,
0.3385622799,
-0.2344777584,
-0.1508842856,
-0.1414981335,
-0.2691134214,
0.2938320935,
-0.6156042218,
0.4682238698,
-0.0681589469,
0.0907823294,
-0.0633870512,
-0.2490688264,
-0.1350112706,
-0.0936148241,
-0.1098756269,
0.2257646322,
0.4395216107,
-0.149549216,
0.2055869251,
-0.0682477057,
-0.1247704923,
-0.3324674368,
-0.1159741729,
0.4712693095,
0.1288679093,
-0.0207261145,
0.2954644263,
0.3008289933,
0.1232296228,
0.291613847,
0.1742065102,
0.0111909145,
0.2981750071,
-0.3884973526,
0.3316391706,
-0.1883034706,
0.2630652487,
-0.3031634986,
0.2108686566,
-0.0133270603,
-0.1471739411,
0.2455122322,
0.3030040562,
-0.0604699403,
0.2595027387,
-0.294814527,
0.252959609,
0.4194633365,
-0.1222942024,
-0.2796793878,
0.0336034559,
-0.5552169085,
-0.7762517929,
-0.0632789806,
0.3375048637,
0.4028719664,
0.1093556583,
-0.3044161201,
-0.1326162219,
-0.2338761836,
0.2082680911,
0.2471145988,
0.4144238234,
0.042513784,
0.1491457224,
0.2400327027,
0.1745166928,
-0.1405508816,
0.1698298752,
0.0849566758,
0.149600625,
-0.719840169,
-0.0348416939,
0.0289798602,
-0.0626346469,
0.0019792728,
0.083467111,
-0.117539525,
-0.4118119478,
0.3406250775,
-0.2460240126,
0.1409450322,
0.3160173297,
0.0178084746,
0.1441167742,
0.2787434757,
-0.122760579,
0.0719845891,
0.2836942673,
-0.4965554178,
-0.3666151166,
-0.0984081477,
0.0370427072,
-0.0430699065,
0.1910550594,
0.4627479315,
-0.159444347,
0.3161436021,
-0.3744055033,
0.129358232,
-0.0766196623,
-0.345259577,
-0.4303707182,
-0.1973849535,
0.349232316,
-0.2921373546,
0.1789639294,
0.1128088236,
-0.1752116531,
0.1565729827,
0.3864723742,
-0.2189256847,
0.2811247408,
-0.1052907184,
-0.094180584,
0.6837504506,
-0.2806607187,
-0.6752012968,
0.2573556006,
-0.5013669729,
-0.1445811093,
0.4097955227,
0.235809505,
0.2997540236,
-0.0794482455,
0.3253082931,
-0.0315069929,
0.1878615171,
0.0564684495,
-0.1306352615,
-0.1833697706,
0.1399635673,
-0.0709242225,
-0.0423559211,
0.1124439687,
0.4389398098,
0.0218533203,
0.3322863579,
-0.1725774258,
0.3336983323,
0.2270243168,
0.3746823668,
0.3700478971,
-0.1073647887,
-0.0706316605,
0.0883533061,
0.2813764811,
-0.0693754852,
-0.2177545577,
-0.0564319193,
-0.2180896103,
0.0088665187,
0.1772120595,
-0.5243188143,
-0.2778484225,
0.1006737798,
0.1897176206,
-0.08617495,
-0.1850093007,
0.0722657442,
0.0684192255,
-0.0747068375,
-0.0877973735,
-0.1331679374,
-0.0027413666,
-0.3007887006,
-0.1416791081,
-0.2466589063,
0.1272358596,
0.2303459644,
0.1158881634,
-0.1657587886,
0.2577354908,
0.1691139638,
0.2231011987,
0.0879700407,
0.0498901233,
-0.0814920962,
0.1020357236,
0.1587973535,
0.0954466313,
0.093741037,
0.0492442995,
0.1746815294,
0.0925375521,
0.0108560156,
0.3938213289,
0.0800944045,
0.321502775,
0.0077006221,
0.057398051,
-0.073932007,
0.2918779254,
0.1616860181,
0.1740431488,
0.1012412459,
0.0455263779,
0.026424028,
0.0338478647,
0.4503728151,
0.0911366045,
-0.0042145215,
-0.0510590896,
-0.1219829842,
0.0876496434,
0.0985327959,
0.4043947458,
0.3523020744,
-0.047235094,
0.0283691492,
-0.0014656428,
0.1685305536,
-0.0364185162,
0.1677800417,
0.1691947281,
0.3131158948,
0.1606740057,
0.0856152177,
0.2174443454,
-0.0743992552,
-0.3199754357,
0.270878762,
0.2420297265,
-0.274025023,
0.1205574274,
-0.1823227704,
0.1474022567,
-0.386397332,
0.0602685139,
-0.0597069487,
0.0446178094,
-0.0311293416,
0.4068612456,
-0.1384709626,
0.3061870039,
-0.2012709975,
-0.0706978887,
-0.0700493604,
-0.238923043,
0.2810955644,
-0.1928631365,
-0.1357764602,
-0.0142606311,
0.5200596452,
-0.3461207151,
0.3348175287,
0.1310817897,
0.116842106,
-0.4002670944,
-0.1492414027,
-0.1489004344,
-0.0322173014,
-0.0071633998,
-0.0577063225,
0.082321547,
0.0533612594,
-0.3870070875,
0.2943770289,
-0.3648618162,
-0.1971298754,
0.0926635563,
-0.1414786279,
-0.2842852175,
-0.2090808004,
-0.1819685102,
-0.2725280523,
-0.4515671134,
0.5456200242,
-0.3829762936,
0.2032079101,
0.0438952632,
0.1365899146,
0.3035441935,
0.1075678319,
0.180955112,
-0.0206500087,
0.0749728829,
-0.0540524647,
-0.214455694,
-0.2169632018,
0.2142414302,
0.207710743,
-0.0423307344,
0.3718560934,
-0.1384138018,
-0.2154287696,
-0.3418719769,
0.0235146284,
-0.1657322347,
0.0905479565,
0.4791870713,
0.2240643948,
-0.1020766795,
-0.0121041015,
-0.0576758236,
0.1700654328,
-0.0863169283,
-0.0665367395,
0.2340047657,
0.4373006523,
0.2596573234,
0.4526245892,
0.2906375229,
0.2908762991,
0.0928926393,
-0.1935476959,
0.2867956161,
0.1278293431,
-0.3168067336,
0.2851983905,
-0.2700602114,
-0.4956004024,
0.2250655591,
-0.1028251946,
-0.1102602482,
-0.0529993586,
-0.2232200503,
-0.2524753213,
-0.3226302266,
0.10747125,
0.0381970592,
0.3169089556,
-0.1035540998,
-0.0039307997,
-0.1443035901,
-0.2486411333,
0.0522288606,
0.3439495862,
0.0143341254,
0.0409763306,
-0.1483405381,
-0.2662767172,
-0.3803995252,
0.4236619473,
0.1634953022,
0.4395975173,
-0.2032950222,
-0.1815981865,
0.3152520359,
0.0375743471,
0.4597313702,
-0.0057463292,
0.0575061031,
-0.0323107466,
0.0679837689,
-0.141029954,
-0.1038111001,
-0.1724367291,
0.4739083648,
0.3283729553,
0.3694815338,
0.1661264747,
-0.0821783096,
-0.0833796263,
-0.0226206947,
-0.2463689893,
0.1327558905,
-0.4000184536,
-0.3016057312,
-0.741936326,
0.1716013551,
0.2297463566,
0.1126282811,
0.0381699577,
0.0588926673,
-0.0580216423,
0.1704585105,
0.0642804876,
0.0105290748,
0.3151354194,
0.1231191456,
0.0397389606,
0.0812843442,
0.3094007373,
0.1304791123,
0.3722920716,
-0.2756944001,
-0.2082938254,
0.2713946104,
-0.135817498,
0.4039909542,
0.5712550282,
-0.2827677131,
0.2027136385,
-0.0318036564,
-0.1041083336,
-0.2370409369,
-0.0971107185,
0.4168575406,
-0.0461196117,
-0.0898675919,
-0.2518374622,
-0.1096385047,
0.0374987423,
-0.267609179,
0.2155769616,
-0.1272413731,
-0.1754097342,
0.28136608,
0.0854728222,
1.1278805733,
-0.1698312759,
0.3113790452,
0.0243061483,
-0.2841434181,
0.0148330443,
-0.0074553937,
0.3938036561,
-0.3412926793,
-0.2458898127,
0.0506226122,
-0.1569849253,
-0.1203958839,
0.1006700248,
0.0696708411,
0.0256380513,
0.1794238687,
0.0938360244,
0.0987193063,
-0.089983888,
0.1407267749,
-0.0488822274,
0.0103361197,
0.0164594837,
0.1089333892,
0.2242485881,
-0.1826722324,
-0.0617937632,
-0.0259382799,
0.0945738107,
-0.3339994252,
0.0826491565,
-0.8847638965,
0.2795904875,
-0.523719728,
-0.1967056692,
0.0685230196,
0.4826602638,
0.1073738188,
0.1613192856,
-0.0332431942,
0.161660701,
0.5459962487,
-0.0964550748,
0.0235431977,
-0.0513971969,
0.2007152438,
-0.1294845045,
-0.2107152045,
0.0201182012,
-0.0996212363,
-0.3269334435,
-0.2084341496,
0.4338734448,
0.2239688784,
0.002914004,
-0.1283278614,
-0.1845462769,
-0.2664199471,
-0.1747767478,
0.0659529716,
-0.190712437,
-0.1345563382,
0.2007713467,
-0.4961920381,
-0.6412081718,
0.212649554,
0.2699280381,
-0.1814335436,
0.0538586825,
0.5651238561,
-0.1387380511,
-0.2504098415,
-0.1887463033,
0.1547123641,
-0.3191346526,
-0.5244015455,
0.1637368649,
-0.0067651086,
0.1218094528,
0.0170325488,
-0.0523424298,
0.2090025097,
0.108875066,
-0.1975927055,
-0.5801059604,
-0.3217695951,
0.0204441715,
-0.0270667411,
0.0711363554,
-0.1928686202,
0.0724067539,
-0.0020999536,
-0.3044888675,
-0.2470370829,
0.2354737371,
-0.1681192815,
0.0797646418,
0.1445335895,
-0.2157518268,
0.0955564529,
0.120105125,
-0.0588398501,
0.1646690071,
-0.0294886492,
-0.1312184483,
-0.1284127682,
0.1852894425,
0.0036394838,
0.0240730196,
-0.0250856131,
-0.5244276524,
0.0952148736,
-0.0696006045,
0.0878884792,
0.3729644418,
-0.0400489941,
-0.1816286296,
-0.1431842446,
0.2240915447,
0.029074505,
0.1879342794,
-0.3237575591,
0.0476396345,
0.1514429152,
0.1387568414,
0.0818119198,
0.08032424,
-0.1212567464,
-0.1599748135,
0.3411810398,
0.1043948382,
-0.2243650258,
-0.1746525466,
-0.1480349451,
-0.0895792991,
0.3540415168,
0.36586532,
0.0026511848,
0.0847168788,
0.3677082062,
0.0505967885,
-0.1701990366,
0.1099452525,
0.0946588963,
-0.644577086,
-0.210069716,
0.3070207238,
0.0962800756,
-0.0129732415,
0.1106999815,
0.0291548818,
-0.1314840019,
-0.2131494582,
-0.0594349205,
0.2995707095,
0.0880745426,
0.0795414597,
0.2046572119,
0.1369892955,
0.3034843802,
0.3255366087,
-0.2799257338,
0.1091091335,
0.1488375366,
-0.064490892,
0.1530718505,
-0.3837521672,
-0.1395328343,
-0.1055234969,
-0.1366042197,
-0.0333441943,
-0.3139843345,
0.6491676569,
-0.31667009,
-0.1552523077,
0.0183506235,
0.3227443695,
-0.2984808385,
-0.2541522384,
-0.1724835932,
-0.1527473181,
-0.2804697752,
0.1022048891,
0.0635207444,
0.3036481142,
-0.0251768492,
0.0312173218,
-0.072061494,
0.0474082157,
-0.3547355235,
0.351282239,
-0.0333036408,
-0.4196988046,
0.0542929173,
0.0418185703,
0.1526836753,
0.1329926252,
0.0545925051,
0.4572567642,
0.2488875389,
-0.3006017208,
-0.0312284827,
-0.0436880514,
0.0892657712,
-0.186730355,
0.3113058507,
0.3029292524,
0.1951556653,
0.14812316,
-0.0564434305,
-0.1130296439,
0.045441635,
-0.09774106,
-0.1588469446,
0.1349365115,
0.2942935526,
-0.0361744016,
-0.4966943264,
-0.1367790252,
-0.1217939258,
-0.0749004856,
-0.0802465156,
0.3662534654,
-0.2292342484,
0.1579886079,
-0.1954773664,
0.0516346097,
0.0634434074,
0.3620235622,
0.4153777659,
-0.0330789164,
-0.2347079813,
-0.1802057028,
-0.7036272287,
0.5146066546,
-0.1440583766,
0.0011265427,
-0.196144715,
-0.1153306812,
-0.3221105933,
-0.0725278109,
0.4101198018,
-0.4489622116,
0.270662725,
0.3859275281,
-0.1417348981,
0.1673409194,
0.2593227923,
-0.1482378691,
0.2964640558,
-0.6339464188,
0.1755183935,
-0.1820920855,
-0.0187498182,
-0.096929267,
-0.0914123729,
-0.1733268499,
-0.0728183091,
0.639533937,
0.0932450593,
0.1818897426,
-0.0565465465,
-0.1298191249,
-0.6063084006,
-0.0375935808,
-0.2250325531,
-0.2564146817,
0.2086795866,
0.1860781759,
-0.2442082167,
0.0818968937,
-0.2877626717,
0.1981611997,
0.0348177105,
0.0372910276,
0.0996130407,
0.0312336609,
0.1810392588,
0.148149237,
0.2882187366,
0.1698299348,
0.0136023257,
0.1833907664,
-0.5974608064,
-0.2791656256,
0.5037990808,
-0.6794377565,
-0.5255781412,
0.2935384512,
0.3454279602,
-0.1101916358,
0.1374965012,
-0.3126282692,
0.0519281477,
0.1169420183,
0.1351992786,
-0.4604378045,
0.0032777302,
-0.4373626113,
0.0121841431,
0.1431403756,
0.1368296146,
-0.0396285281,
-0.2378726155,
0.0738468319,
-0.2436503321
] |
https://github.com/huggingface/datasets/issues/590 | The process cannot access the file because it is being used by another process (windows) | @wangcongcong123 thanks for sharing.
(BTW I also solved it locally on windows by putting the problematic line under try except and not using cache... On windows I just needed 1% of the dataset anyway) | Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
``` | 34 | The process cannot access the file because it is being used by another process (windows)
Hi, I consistently get the following error when developing in my PC (windows 10):
```
train_dataset = train_dataset.map(convert_to_features, batched=True)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\site-packages\nlp\arrow_dataset.py", line 970, in map
shutil.move(tmp_file.name, cache_file_name)
File "C:\Users\saareliad\AppData\Local\Continuum\miniconda3\envs\py38\lib\shutil.py", line 803, in move
os.unlink(src)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\saareliad\\.cache\\huggingface\\datasets\\squad\\plain_text\\1.0.0\\408a8fa46a1e2805445b793f1022e743428ca739a34809fce872f0c7f17b44ab\\tmpsau1bep1'
```
@wangcongcong123 thanks for sharing.
(BTW I also solved it locally on windows by putting the problematic line under try except and not using cache... On windows I just needed 1% of the dataset anyway) | [
-0.1660128236,
-0.0378361791,
-0.0557806455,
0.2940468788,
0.3540611863,
0.1522649229,
0.3368765116,
0.1749328077,
0.2806837261,
0.2250537425,
-0.1342353374,
0.2158224583,
-0.1404852271,
-0.1683882922,
-0.1173465475,
0.0108611165,
0.0012657791,
0.0442126207,
0.2459799647,
0.2743891776,
-0.4984658062,
0.0607167408,
-0.3295140862,
0.4237716496,
-0.2919598222,
-0.3280044198,
0.1058605239,
0.3697045147,
-0.0732034817,
-0.1974389851,
-0.100266993,
-0.0513671264,
0.1314433813,
0.4546834528,
-0.0001180091,
0.1998519003,
0.0130153364,
0.0264494643,
0.0207779221,
-0.0899401084,
-0.0563948043,
-0.2078060657,
-0.1073656678,
-0.0660420284,
0.1141615659,
0.0234638005,
0.2015796751,
-0.4571411908,
0.3327831626,
0.4084489346,
0.1968295127,
0.2311187834,
0.1027955115,
-0.1659162045,
0.535613656,
-0.1112562343,
-0.0627503172,
0.633164525,
0.5471025109,
-0.222069636,
-0.1172491461,
0.2611916959,
-0.244472146,
-0.1755070537,
-0.139234513,
-0.2325414121,
0.3693846166,
-0.6541215181,
0.4994103611,
-0.0672554225,
0.2061076015,
0.0079515055,
-0.1065308377,
-0.1031477004,
-0.1989776045,
-0.0291661695,
0.2501770556,
0.4200431705,
-0.1939843148,
0.305706203,
-0.1787292957,
-0.1260135323,
-0.3501565158,
-0.0605799407,
0.438665092,
0.0405458473,
-0.0260064099,
0.3118020594,
0.3198760152,
0.1347720921,
0.2551609576,
0.1011165679,
0.0105903484,
0.2738930583,
-0.3511508405,
0.2632480264,
-0.2067014873,
0.4223090112,
-0.2271409035,
0.1090164483,
-0.109515965,
-0.2546097934,
0.2531680763,
0.3653874099,
-0.0747123808,
0.3147969842,
-0.252848357,
0.1199978143,
0.3119696379,
-0.1192553341,
-0.2826242149,
-0.0470494106,
-0.5266234875,
-0.8045746684,
-0.1368514597,
0.3325534761,
0.35489887,
0.0067581758,
-0.4248268008,
-0.1458635181,
-0.1495653987,
0.1357561052,
0.2901573777,
0.3832622766,
0.0116866417,
0.1563882977,
0.204604134,
0.2144594491,
-0.1127434969,
0.0718336403,
0.092135407,
0.2212533206,
-0.8064487576,
-0.0484172776,
-0.0237101167,
-0.0383471921,
0.0575434491,
0.0691565797,
-0.1150586456,
-0.393950969,
0.360955745,
-0.2968105376,
0.0476293936,
0.2490939051,
0.0270231217,
0.1089757681,
0.3133923709,
-0.1210680678,
0.0360726714,
0.2741179764,
-0.5945296288,
-0.4111283123,
-0.0860511437,
0.0659645796,
-0.0545929596,
0.1561235189,
0.4824935794,
-0.0974191427,
0.2904033959,
-0.3547201753,
0.0470063351,
0.0024047196,
-0.3527669609,
-0.4089996219,
-0.169923991,
0.227552712,
-0.4454243779,
0.1663399786,
0.112391606,
-0.2967840135,
0.2837726474,
0.4242574871,
-0.2254430056,
0.3697234988,
-0.137235254,
-0.1389299929,
0.6785032749,
-0.1987153292,
-0.7276567221,
0.3511733115,
-0.4982522726,
-0.2206845582,
0.3715965152,
0.274224788,
0.3183172345,
-0.0356601849,
0.4256312847,
-0.011666758,
0.125843972,
-0.1144290119,
-0.104187049,
-0.235474959,
0.2226246595,
-0.0457351021,
-0.0473345071,
0.3152664602,
0.3644980788,
0.0500475988,
0.2891132534,
-0.1426453292,
0.3079089522,
0.2466265857,
0.3382253051,
0.4254818857,
-0.1479534656,
-0.1629212052,
0.1703910828,
0.2562147379,
-0.1407580376,
-0.2786247432,
-0.0121228509,
-0.2053183913,
-0.1437114477,
0.2086333781,
-0.5470632315,
-0.2651072443,
0.0763247609,
0.0634870976,
-0.0284605995,
-0.1560456753,
0.0889444798,
0.0087251514,
-0.0728097707,
-0.0971505642,
-0.12181402,
0.0821681842,
-0.3950128853,
-0.0846488625,
-0.2371075302,
0.1842453927,
0.2675737739,
0.1738246381,
-0.1304075718,
0.1893813014,
0.1834191382,
0.1690434217,
0.2318519652,
0.0930388719,
-0.0262413211,
0.0120447427,
0.2170074433,
-0.0181246568,
0.0920293853,
0.0306708366,
0.3035515249,
0.0972557664,
-0.0145240501,
0.3563965261,
0.0089261159,
0.2917684317,
-0.0294112451,
0.0325311497,
0.1224768534,
0.2409924269,
0.2025084347,
0.1058619395,
-0.026107274,
0.0663526878,
0.0395245552,
-0.0594214238,
0.415440172,
0.0949324071,
-0.0369529165,
-0.000652915,
-0.1412417442,
0.0572683662,
0.0906977728,
0.2913497686,
0.3895950317,
-0.0294963792,
-0.0036333706,
0.0386688933,
0.1876126826,
-0.0683159456,
0.1674477458,
0.2045693994,
0.3450442851,
0.1418385506,
0.0323923901,
0.1896013021,
0.0150765181,
-0.3164550066,
0.1753633469,
0.3392403722,
-0.2660367489,
0.1201156005,
-0.1579216719,
-0.0410789102,
-0.3458781838,
0.1223575026,
-0.0766528994,
0.0178314373,
-0.0643590018,
0.4527755678,
0.0092967302,
0.1905374229,
-0.2618512511,
-0.050455749,
-0.1107771993,
-0.1444198638,
0.2394907475,
-0.2877011895,
-0.1475096643,
0.0261553526,
0.4310840368,
-0.3701071441,
0.3875532448,
0.0265529007,
0.1025779694,
-0.3318635225,
-0.0847976357,
-0.1104739755,
0.0069138166,
0.0146128368,
-0.0169817954,
0.1785102487,
0.1168060899,
-0.4084202349,
0.3876774311,
-0.2849361598,
-0.1894251704,
0.0382143706,
-0.2024390101,
-0.3295211196,
-0.0973661095,
-0.1877684593,
-0.3236173987,
-0.3973132372,
0.4519201815,
-0.4148598313,
0.1817364097,
-0.0535674058,
0.0407036357,
0.2566218674,
0.134456411,
0.1836618185,
-0.0362399742,
-0.0234085694,
-0.1205167174,
-0.1703294814,
-0.171661377,
0.2624775171,
0.232247889,
-0.1153167337,
0.3312908113,
-0.1592458785,
-0.2347827405,
-0.3380506635,
0.0444341153,
-0.0777585655,
-0.0934741125,
0.4242714345,
0.1935464442,
-0.0781685784,
0.007209748,
-0.1666395813,
0.1691451818,
-0.0822763145,
0.0318534151,
0.2952129841,
0.5357869864,
0.2394005954,
0.4260257185,
0.3088310063,
0.2428978384,
0.1612041295,
-0.2039854676,
0.1781502962,
0.0811829418,
-0.3624287546,
0.3248822689,
-0.1848443896,
-0.5356465578,
0.2135559618,
-0.1303967535,
-0.1177756935,
-0.1384094656,
-0.2869180441,
-0.3454824388,
-0.1585710645,
0.0306950528,
0.0713225156,
0.3649421632,
-0.1111488789,
-0.0920346752,
-0.0814014226,
-0.2938338816,
0.0752840489,
0.3776855767,
0.0201361198,
0.1086256802,
-0.1931685954,
-0.3319660425,
-0.3213919699,
0.456210494,
0.1692187041,
0.4438437521,
-0.4118529558,
-0.0896594599,
0.3133985996,
0.124721989,
0.5701314211,
-0.0519175008,
0.0445694663,
-0.1224260107,
0.0376481563,
-0.0938314646,
-0.0465224721,
-0.1349651515,
0.515166223,
0.4262578189,
0.3858271539,
0.2570057213,
-0.0518761463,
-0.0383304358,
0.0106329918,
-0.2539951801,
0.0637409389,
-0.2831494808,
-0.2379432321,
-0.655843854,
0.1182365716,
0.1682168245,
0.1619085371,
0.1417959183,
0.0898208916,
0.0581242815,
0.1918322444,
0.144683823,
0.0218333825,
0.2823257148,
0.0339357294,
0.0422184989,
0.1069469005,
0.4538535774,
0.1563456357,
0.3275200129,
-0.2118469924,
-0.1372990757,
0.2419727743,
-0.1597883105,
0.5036763549,
0.4627994597,
-0.2284405529,
0.1845653057,
0.0050520934,
-0.147248596,
-0.3052194118,
-0.1386514455,
0.4399252534,
-0.025714403,
-0.1587969661,
-0.2194194198,
-0.188121438,
0.0239708424,
-0.2828884125,
0.0737631693,
-0.2547091842,
-0.2068546116,
0.2930558026,
0.0863491446,
1.1771053076,
-0.1679235697,
0.2396938801,
-0.1062491536,
-0.1760233641,
0.0520631671,
-0.0099006593,
0.3472300768,
-0.3507134914,
-0.2937559485,
-0.0666394383,
-0.2492952198,
-0.0179036967,
0.1763152927,
0.0670564324,
0.0652061924,
0.1141453236,
0.0944146961,
0.1545523107,
-0.1133691147,
0.2104414999,
0.0443561338,
0.0469822288,
0.0256701708,
-0.0007111244,
0.3377895057,
-0.2275868952,
-0.0417266749,
-0.0867516771,
0.1854973435,
-0.337562561,
0.1208255887,
-0.8556825519,
0.2725380063,
-0.5097429752,
-0.1789047718,
0.1088638753,
0.5916884542,
0.2322083563,
0.0948024318,
-0.0732184723,
0.1410886943,
0.4008792043,
-0.1539737433,
0.1750456244,
0.0856609792,
0.1558367461,
-0.2459355593,
-0.3225770295,
0.0867201611,
-0.0374827348,
-0.2703436017,
-0.2978819907,
0.2684125006,
0.3056305647,
-0.0906071961,
-0.3078661561,
-0.3610242605,
-0.1902945489,
-0.1502479017,
0.058169309,
-0.1451315284,
-0.1091107205,
0.1210772693,
-0.465410322,
-0.6315051913,
0.1588023901,
0.2614535391,
-0.1378864944,
-0.0487120599,
0.4710261822,
-0.1170893386,
-0.3067557514,
-0.1757040918,
0.1382467598,
-0.3328717947,
-0.4727422595,
0.0567258224,
0.0327364355,
0.1449121237,
-0.0046046861,
-0.0250373892,
0.3608404994,
0.0328285731,
-0.1382515877,
-0.6702957153,
-0.3509265184,
-0.0255260654,
-0.0082489904,
0.0349923223,
-0.2053299248,
0.0162953585,
0.1032143161,
-0.3115547597,
-0.2290213108,
0.185183242,
-0.225259915,
0.0900643319,
0.2395978123,
-0.2171785831,
0.101047501,
0.0843686908,
-0.0757304356,
0.0543478951,
-0.0631412193,
-0.1334276497,
-0.08918681,
0.1684434712,
0.0173844062,
-0.0289026145,
-0.0389247015,
-0.4502803683,
0.1195159703,
-0.0200904161,
0.0714455545,
0.3995409608,
0.0297898613,
-0.085791111,
-0.1079764292,
0.1343515217,
-0.0954779238,
0.251891762,
-0.3170963526,
0.1222262979,
0.2177689075,
0.1003889889,
0.1017370969,
0.0451824367,
-0.2046894282,
-0.2187565714,
0.4051787257,
0.0770385563,
-0.1724357605,
-0.2021102905,
-0.0457877442,
0.0469362885,
0.318251133,
0.214539364,
0.0239804611,
0.1379546821,
0.3166843951,
0.0569055863,
-0.1337970793,
0.0943684429,
0.2925776243,
-0.584567368,
-0.3134316504,
0.1808789074,
0.1375525594,
0.1393168122,
0.1402133703,
-0.015651729,
-0.0274777301,
-0.2047492117,
-0.068604663,
0.2648797631,
0.1924529225,
0.1821687371,
0.1630935967,
0.2138555646,
0.2589558065,
0.3916673064,
-0.1995450854,
0.0571530536,
0.2089236826,
-0.0143969953,
0.1258682609,
-0.4021164775,
-0.0961163193,
-0.0148463249,
-0.1273845732,
-0.0578611754,
-0.3358652294,
0.6804256439,
-0.2341681272,
-0.11134056,
0.1060932428,
0.2795391381,
-0.3370934725,
-0.2484963089,
-0.1900670677,
-0.1407717615,
-0.2386082262,
0.0732299984,
0.0666422397,
0.3015090525,
-0.0395284481,
0.0005211178,
-0.107835263,
0.0987059399,
-0.2845022082,
0.3806559741,
-0.0815793276,
-0.3826129138,
0.0141810644,
-0.0543740168,
0.1845443994,
0.2824124098,
0.0918018371,
0.4218007922,
0.2285345942,
-0.3425937593,
-0.0770234317,
-0.1309373975,
0.0451754183,
-0.2144872546,
0.3496633768,
0.2967483401,
0.2587936223,
0.2340887785,
-0.021704074,
-0.0714710951,
0.0370590687,
-0.074713394,
-0.1455678642,
0.1722254455,
0.3018598258,
-0.0354784727,
-0.5675347447,
-0.0313378721,
-0.0884142891,
-0.1225685179,
-0.104158245,
0.3175880313,
-0.2436787188,
0.2089742273,
-0.0893611535,
0.0359866321,
0.0207202584,
0.2716802657,
0.4336246848,
-0.0144854598,
-0.2830019891,
-0.241605252,
-0.6190909743,
0.5306825042,
-0.0851750672,
-0.0155679062,
-0.1069780439,
-0.2263485193,
-0.3635813892,
-0.0004374888,
0.3084353805,
-0.3861482441,
0.1735991836,
0.328905046,
-0.0795240253,
0.1127684712,
0.2845210731,
-0.0676445216,
0.1665806621,
-0.536942482,
0.2253510356,
-0.1526335776,
-0.0399997644,
-0.2405834496,
-0.0501347184,
-0.2626915574,
-0.1540220529,
0.5076759458,
0.1767826825,
0.408090651,
-0.1407745183,
-0.0308869556,
-0.4318880439,
-0.1114619076,
-0.095398508,
-0.2588637173,
0.2124846131,
0.1325123459,
-0.2664831877,
0.1283311099,
-0.2662895024,
0.2366713285,
0.013893038,
0.0539830551,
0.0401119739,
-0.0055826381,
0.1839081496,
0.059253782,
0.4290370047,
0.1012138277,
0.1110894978,
0.1096931994,
-0.5186725855,
-0.1349039823,
0.4938792884,
-0.5808688402,
-0.5136084557,
0.3696514368,
0.2703975439,
-0.1672360301,
0.1247319579,
-0.1787785441,
-0.0103749186,
0.2156499624,
0.1081308872,
-0.3498935699,
0.0447330438,
-0.3796355724,
0.0116811097,
0.1946568936,
0.0808168203,
-0.0623396002,
-0.2462143302,
0.0799525231,
-0.2100165337
] |
https://github.com/huggingface/datasets/issues/580 | nlp re-creates already-there caches when using a script, but not within a shell | Couldn't reproduce on my side :/
let me know if you manage to reproduce on another env (colab for example) | `nlp` keeps creating new caches for the same file when launching `filter` from a script, and behaves correctly from within the shell.
Example: try running
```
import nlp
hans_easy_data = nlp.load_dataset('hans', split="validation").filter(lambda x: x['label'] == 0)
hans_hard_data = nlp.load_dataset('hans', split="validation").filter(lambda x: x['label'] == 1)
```
twice. If launched from a `file.py` script, the cache will be re-created the second time. If launched as 3 shell/`ipython` commands, `nlp` will correctly re-use the cache.
As observed with @lhoestq. | 20 | nlp re-creates already-there caches when using a script, but not within a shell
`nlp` keeps creating new caches for the same file when launching `filter` from a script, and behaves correctly from within the shell.
Example: try running
```
import nlp
hans_easy_data = nlp.load_dataset('hans', split="validation").filter(lambda x: x['label'] == 0)
hans_hard_data = nlp.load_dataset('hans', split="validation").filter(lambda x: x['label'] == 1)
```
twice. If launched from a `file.py` script, the cache will be re-created the second time. If launched as 3 shell/`ipython` commands, `nlp` will correctly re-use the cache.
As observed with @lhoestq.
Couldn't reproduce on my side :/
let me know if you manage to reproduce on another env (colab for example) | [
0.0140549019,
0.1271243691,
0.0085560791,
0.0359592512,
-0.0078009535,
-0.2637969255,
0.1551123112,
0.1553375721,
0.3577844203,
-0.1724362075,
-0.1574477404,
0.2879820466,
-0.0159158316,
-0.140978992,
0.3365516067,
0.1459570825,
-0.0274405777,
0.1052430421,
0.1590005159,
-0.0974335074,
-0.0484842584,
0.1789570004,
-0.1397800744,
-0.1683490276,
-0.0173318088,
0.0266730469,
-0.02247582,
0.1550354064,
0.1056407541,
-0.4688459635,
0.285412997,
-0.0673119724,
-0.0950462222,
0.1259216666,
-0.0001123684,
-0.1467733979,
0.2711631656,
-0.0826238692,
-0.260833621,
-0.1274133474,
0.1139277816,
-0.3116104305,
0.3685198724,
-0.1763195097,
-0.0037387908,
0.152594462,
0.1377797276,
-0.1676028818,
0.5938946605,
0.3666622043,
0.2137850225,
0.2000361234,
-0.2147933245,
0.1976128817,
0.2465294451,
-0.008641012,
-0.1001662463,
-0.1787398458,
-0.012323413,
-0.3991583884,
-0.1550950706,
0.4522331357,
-0.2373663783,
0.0255575217,
0.1741414517,
0.1520870626,
0.1697422713,
-0.3322789371,
0.0382099003,
-0.1236507297,
0.0047658347,
-0.3702813685,
-0.162206769,
-0.3918679357,
-0.1547728479,
-0.5328942537,
0.2281318605,
0.2824667096,
-0.0714172944,
0.0352048203,
-0.1941296309,
0.138889268,
0.1720047444,
0.0363139026,
0.2497338504,
0.4306166768,
0.0625299513,
0.0532182306,
0.1106563434,
-0.0548589006,
0.2229955941,
-0.3569899201,
-0.0537065081,
0.3228700161,
-0.2641915381,
0.0885869712,
0.3014730513,
0.4047351778,
-0.0454725027,
0.2235714197,
0.1734755337,
-0.0480190292,
-0.0876519084,
0.2253357619,
0.0472554602,
0.3800195456,
0.3655146956,
0.047801096,
0.090415746,
-0.0108134151,
-0.358879149,
-0.0273606349,
0.3107823133,
0.1221773326,
0.4281065762,
-0.0614236072,
-0.0855408534,
-0.0838733912,
-0.221129775,
0.16041632,
-0.1021044329,
-0.108369723,
0.262014538,
0.3968602121,
0.0097244084,
0.1738792807,
0.0508841574,
-0.0545450933,
-0.4351883531,
-0.1983557642,
-0.1743291318,
0.1057347059,
-0.2974104285,
0.1877690256,
0.4734044373,
0.2011060268,
0.4490162134,
0.1165862381,
-0.2418128848,
-0.2520127296,
0.3037591577,
-0.1120485589,
0.4608232081,
0.2307963967,
0.196698308,
0.0357480347,
0.2566693723,
-0.2837710381,
-0.1765891165,
0.13235946,
-0.14808999,
0.0681456998,
0.4464126527,
0.1733051091,
-0.0703176185,
0.0103105139,
0.3509062827,
0.075367108,
0.3642427921,
-0.5678486824,
-0.0792807266,
-0.4126443863,
-0.3637129664,
-0.251507014,
-0.0087797716,
0.4585022032,
-0.1346433461,
-0.1648875028,
-0.1391751468,
0.1942669749,
0.2277395874,
0.396407187,
-0.1332821399,
0.0929472297,
-0.1476305127,
0.0661582053,
0.6278778315,
-0.4900431633,
-0.3707662225,
0.1228373498,
0.0793690681,
0.3197265863,
0.1245158315,
0.2428550571,
-0.0670706928,
-0.2258002311,
0.2498880625,
0.3515285552,
0.0406156927,
0.0822370574,
-0.2879507542,
0.3968782425,
0.3258949816,
-0.201641798,
-0.1354913414,
0.2156067044,
-0.1007499695,
0.0996772274,
0.0922813416,
0.0136361113,
0.1314319372,
0.0401057638,
-0.057040751,
-0.0098582394,
0.1718410403,
-0.0933760852,
-0.0365484692,
0.0334529877,
-0.2740912735,
-0.2214099169,
0.1858892143,
-0.1368452013,
0.0164679773,
-0.1372162253,
-0.1522571146,
-0.4594000578,
0.126447767,
0.3427808285,
0.3462296724,
-0.1067049801,
-0.2199795246,
0.5316772461,
0.1187796146,
-0.0748463422,
-0.2673672438,
-0.0871284753,
-0.1514259428,
-0.2447950542,
-0.2198596746,
0.1518072188,
0.209010005,
0.0987170041,
-0.1557875276,
0.1926580817,
0.2483173311,
-0.1676100641,
-0.2118836939,
0.0143154114,
0.0387563705,
0.312590301,
0.0876140669,
0.103073284,
0.1594627053,
-0.1032633334,
-0.1650413722,
0.3819601536,
0.1783607304,
0.0721179098,
0.1815749109,
0.0295199677,
0.3982233703,
-0.2489923388,
-0.0253767371,
-0.2632981539,
0.404783994,
-0.1850561053,
0.0899650827,
0.0454691797,
-0.2181368768,
0.3232726455,
0.4830144942,
0.275750041,
0.0691799745,
0.0878766179,
-0.0159709156,
-0.3959515095,
-0.0724546984,
0.4940244555,
0.4645234346,
0.2245409489,
-0.0549593754,
0.1367131472,
-0.1678970456,
-0.2303490341,
0.1029077917,
-0.198198691,
-0.0195110049,
0.1492988318,
0.3859744072,
0.0612339489,
-0.333663255,
0.0843748003,
-0.0284439195,
-0.1334322542,
-0.1216018051,
0.3592764139,
-0.3702670038,
-0.4869051576,
-0.130043149,
0.2459070683,
-0.1161152422,
-0.098821491,
0.0744908154,
0.3453322351,
0.0831705555,
0.1217839792,
-0.0779095143,
0.2753457427,
-0.1485339701,
-0.1314698458,
-0.0340485275,
-0.2705313563,
-0.3422708213,
-0.0438279882,
0.1777163744,
0.0886244625,
0.3379087448,
0.0205063447,
-0.0477558523,
-0.1575674862,
-0.3425694704,
0.0794402808,
0.069145903,
0.2530016899,
-0.0915526003,
-0.3343282044,
-0.3508034945,
-0.0153950937,
0.1838686317,
-0.3609213531,
-0.1059565395,
0.0877096206,
0.0262167007,
-0.1499073654,
-0.3578155637,
-0.3398054838,
-0.2054051012,
-0.1251428276,
-0.0889983252,
0.2696053386,
0.2463212907,
0.5280355811,
-0.4274567366,
-0.1854810715,
-0.3738568425,
0.2974689603,
-0.2761280239,
0.1026843488,
0.0667226911,
-0.2120819092,
-0.269400388,
-0.1537082046,
-0.2264173925,
-0.0149642034,
0.1810591072,
-0.4297279716,
-0.4037624896,
-0.1235877797,
-0.1860661358,
0.0473243408,
0.1130345538,
0.4327493906,
-0.0400286503,
-0.1123490557,
-0.441801697,
-0.3570385575,
0.1656186879,
-0.0183239765,
0.2703089416,
-0.1238624305,
0.3400832713,
-0.0012423322,
0.0214988776,
0.290329814,
0.0951482654,
0.3572483063,
0.1234290823,
0.3901818395,
-0.2690263391,
-0.2034894377,
-0.0901881456,
0.0914636999,
-0.0907983184,
0.3345511854,
0.1480843574,
-0.3454191983,
-0.1368839145,
-0.1049649715,
-0.2018757313,
-0.1158563197,
0.0805863589,
0.0533376634,
0.2972948849,
0.5297687054,
0.1124679595,
-0.3232437968,
-0.144284904,
-0.3682996631,
-0.0390044264,
0.3265336752,
0.1358307153,
-0.2818974257,
-0.0898186564,
-0.2725958526,
-0.0929438472,
0.0633994341,
0.3947660029,
-0.2736209929,
-0.1881749332,
0.1198271364,
-0.0901691169,
0.3416713476,
-0.1776979417,
0.1468311548,
0.2142213583,
-0.0172100663,
-0.30045259,
0.0946971327,
-0.166727379,
0.3412582874,
0.6077417731,
0.1224250644,
-0.0647706985,
-0.0097880512,
-0.12688604,
0.2940235436,
-0.2101611346,
-0.2329930067,
-0.170764178,
-0.1079778448,
-0.1880625188,
-0.0429776162,
-0.1812501401,
-0.1158417016,
0.1952056289,
0.343896687,
0.1889824271,
0.069054611,
0.0545896254,
0.1227548867,
0.3098185062,
-0.0418688133,
0.1788338274,
-0.3252726197,
-0.0026632724,
-0.5377194881,
0.3805283308,
-0.1568693519,
0.0011064969,
-0.1874193698,
0.1442612261,
-0.0284360424,
0.2292056978,
0.0389026552,
-0.0094719343,
-0.1985987723,
-0.1174348742,
0.1762853861,
-0.1198257655,
0.0579109341,
-0.192418769,
0.0621137619,
-0.425914377,
0.220820874,
0.0800802857,
-0.0958824456,
0.1784724891,
-0.4109882712,
-0.2472117245,
0.5547169447,
-0.1747342348,
0.6807312369,
0.2325864881,
0.3097768426,
0.2408890575,
0.0989024639,
0.6356297731,
-0.0390549749,
0.4298945367,
-0.3387109935,
0.2573567331,
0.1301667094,
-0.1957309544,
-0.2928290963,
-0.1590917557,
-0.2982915342,
0.4382793307,
0.0669863671,
-0.2236719877,
0.1064551398,
0.2793868184,
-0.0087800622,
-0.2717784345,
-0.4540950656,
0.197549358,
-0.1337620765,
0.1832055002,
-0.0059471689,
0.0134422891,
0.1246392429,
-0.2137923241,
0.0419073552,
0.3979532123,
0.0236714408,
0.159816891,
0.141333282,
-0.2615764737,
-0.3126146793,
0.0931331217,
0.0415091664,
0.0123314662,
-0.0579654798,
0.2015196979,
0.2126499414,
0.2799979448,
0.0238381326,
-0.072858654,
0.433111161,
-0.1247007102,
-0.0739284158,
-0.4255560637,
-0.1259182245,
-0.2914917767,
-0.0023658574,
0.4447729886,
0.138723582,
0.1159524992,
-0.1770389378,
0.146403715,
-0.2158009559,
-0.1990840733,
0.169498682,
-0.0036470089,
-0.1287612021,
0.2478408068,
-0.0036253636,
-0.1904655248,
0.0074084513,
0.0963301733,
0.0952237993,
0.1120963469,
0.3333142102,
0.0039070547,
-0.142027095,
-0.218435958,
-0.4068239331,
-0.3128978312,
-0.2059399486,
0.2505104542,
0.0564306118,
-0.0420043468,
0.2730363905,
-0.2061293423,
-0.1348136663,
0.0696014911,
-0.1859170645,
-0.4265006781,
0.185575068,
-0.0669687614,
-0.2016600072,
-0.2573543191,
0.4571641386,
0.2771079242,
0.028578721,
0.2128783911,
-0.3180311322,
-0.1352419555,
0.1126767397,
0.3225848973,
0.0394786745,
-0.1526791155,
-0.1495835483,
0.09550111,
-0.0041967072,
0.1972430199,
-0.3190488517,
-0.2811467648,
-0.0855114982,
0.1044002548,
-0.0046491139,
-0.1303960979,
-0.2635140419,
-0.041873794,
0.0824362636,
-0.3643062115,
-0.0301250368,
-0.0617691129,
-0.1133385897,
-0.064100787,
-0.1500823349,
0.1415188909,
0.3738205135,
0.0577139743,
-0.3782216609,
-0.1485736668,
0.0266932994,
0.1841717958,
0.0111784935,
0.0454671085,
-0.3978870511,
-0.1515817642,
-0.1233685613,
0.1552700251,
0.1819798797,
-0.3702734709,
-0.0786005408,
-0.0992121249,
0.2283604592,
0.2684123516,
0.020662237,
-0.0115409978,
0.2399544716,
0.1466329247,
-0.296782434,
0.0233713798,
0.3732441068,
0.159475565,
0.0646624863,
0.1382669806,
0.2581617832,
-0.0944324583,
0.3471425176,
0.1362630576,
0.4831186831,
-0.596288681,
-0.0016914569,
-0.1406368613,
0.2338178158,
0.0332976803,
0.2633267641,
-0.1763096005,
0.0219205208,
0.4350862801,
0.1924681365,
0.3204327822,
0.3400511742,
-0.0985441953,
-0.297198385,
-0.1915214062,
-0.1900561005,
-0.0022810921,
0.0494241901,
0.1495680958,
-0.0016233176,
0.0589984357,
-0.158104822,
-0.1077885479,
-0.1957950592,
0.4848645926,
-0.0400334597,
-0.1616547555,
-0.0043457597,
0.057285428,
0.2180640996,
0.1309998631,
0.1196315438,
0.4419144392,
0.0766543373,
0.0482814237,
-0.0525678732,
-0.285862565,
-0.2747984529,
0.0684227943,
0.1314975619,
-0.2465024889,
0.4550006688,
0.1307494938,
0.1033759713,
-0.1723935902,
0.156948939,
0.5075447559,
0.4611145258,
-0.0035804268,
0.0155276582,
-0.2227507532,
-0.0329941399,
-0.3223071694,
0.3511273563,
0.2794366777,
-0.0107296128,
-0.0043519214,
0.0595916808,
-0.0882379711,
0.1745176315,
0.2891909778,
-0.2743090987,
-0.1138385683,
0.3731924295,
0.255615145,
0.0462460369,
-0.4124015272,
0.2220691442,
-0.4528362751,
0.1879251003,
0.3555693626,
0.1741960049,
0.2492039055,
-0.3650428057,
0.0916590244,
-0.0503004938,
0.2403858751,
0.1078094542,
0.050562948,
-0.0849490762,
0.0670372099,
-0.7058619261,
0.0267193466,
-0.4562802613,
-0.1184800118,
-0.1871310472,
0.3274883032,
-0.3748756945,
0.0732663721,
-0.130503431,
-0.21999982,
0.1314417273,
-0.3093395829,
-0.2394203395,
-0.0231264997,
0.1688391417,
0.0829378814,
-0.0250740759,
-0.232698679,
-0.0215653069,
-0.4717829823,
0.2197502255,
0.0933377668,
-0.2499943674,
0.1657355726,
0.6163631082,
0.1083634049,
0.4907679856,
0.4504414797,
-0.0541111082,
-0.0049737096,
-0.3906281888,
-0.1123803705,
-0.0764646158,
-0.310505271,
0.0192334224,
0.3926124275,
-0.4182899594,
-0.0872883424,
-0.2374892682,
0.4055713117,
-0.1793858111,
-0.0698672682,
0.1784395128,
0.0141683146,
-0.1195703745,
0.3620306253,
-0.2111617327,
0.4743292034,
-0.0432454981,
0.3824566603,
-0.3174718022,
-0.3478339911,
0.3471218944,
-0.6071429849,
-0.3471952975,
0.0734205768,
0.2991021574,
0.0560427904,
-0.4155088663,
-0.2385804653,
-0.0634933561,
0.0880295038,
-0.0643562824,
-0.2337218225,
0.1836695969,
-0.3016315699,
-0.0839674249,
0.0482024699,
-0.2365954816,
0.3630093932,
-0.3723289967,
0.2081934959,
-0.0050061569
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Some wikipedia languages have already been processed by us and are hosted on our google storage. This is the case for "fr" and "en" for example.
For other smaller languages (in terms of bytes), they are directly downloaded and parsed from the wikipedia dump site.
Parsing can take some time for languages with hundreds of MB of xml.
Let me know if you encounter an error or if you feel that it is taking too long for you.
We could process those that really take too much time | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
    data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
    print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 88 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
    data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
    print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Some wikipedia languages have already been processed by us and are hosted on our google storage. This is the case for "fr" and "en" for example.
For other smaller languages (in terms of bytes), they are directly downloaded and parsed from the wikipedia dump site.
Parsing can take some time for languages with hundreds of MB of xml.
Let me know if you encounter an error or if you feel that it is taking too long for you.
We could process those that really take too much time | [
0.2373759747,
-0.2088859379,
-0.1352212727,
0.4252156615,
0.1233767122,
0.2758942246,
0.1313818693,
0.1356952488,
0.7319018841,
-0.2289891392,
0.0522105247,
0.0235071518,
0.1044933945,
-0.2209718972,
0.0679350048,
-0.155310601,
0.0418751724,
-0.1114458442,
0.1551071107,
-0.361084193,
-0.3178488016,
0.238444075,
-0.3044134974,
0.0057741683,
-0.2436925173,
0.23083359,
0.0614725053,
-0.1755259931,
0.0371082723,
-0.3282754421,
0.1453270018,
0.2548289001,
0.2869494855,
0.1309534013,
-0.0001185968,
-0.1286965758,
0.7042649388,
-0.1099033207,
-0.4838132262,
-0.1212518215,
-0.1014470905,
-0.4541420341,
0.1855035424,
-0.1290797889,
0.0664207339,
-0.2606803775,
0.2823223472,
-0.5186070204,
0.1401317567,
-0.0058512837,
0.1931751072,
-0.2319423854,
0.040468514,
0.1306500435,
0.3112445474,
0.0921859294,
0.1721142232,
0.1484162211,
0.0676717535,
-0.2165158689,
-0.0645690039,
0.1268206537,
-0.1644361913,
-0.0737822056,
0.1959549338,
-0.1150275171,
-0.1769632101,
-0.6844718456,
0.1788195074,
0.2517310083,
0.7159074545,
0.0819104612,
-0.3249860406,
-0.2314270288,
0.0833556578,
0.0490162373,
0.1674250215,
0.3484606147,
-0.1044913381,
-0.0879917741,
-0.0450238623,
-0.3324612379,
-0.0504330993,
0.5696835518,
-0.1381941885,
0.3546817601,
0.1198180169,
0.1225395948,
0.0333933234,
-0.1997120231,
-0.2386390269,
-0.227380991,
0.4235842526,
0.4048974514,
-0.2517802715,
0.4409762025,
-0.0247043297,
0.2985454202,
0.0263238177,
-0.1559598446,
-0.2274287492,
0.2422036976,
-0.1045708209,
0.1780498773,
0.1330904067,
-0.098568812,
0.2461398542,
-0.1657911241,
0.3742339611,
-0.0247804932,
-0.1900201142,
0.0311532021,
-0.3043865263,
-0.1849524081,
-0.1912525296,
-0.0971258581,
-0.0666857958,
-0.3103951812,
0.158936277,
0.1572279036,
-0.2838954031,
-0.2617974281,
-0.1157062426,
0.2324320376,
-0.1103897169,
0.2289989591,
0.1797439307,
0.3324599266,
-0.4585289657,
-0.0347171798,
0.0760529339,
-0.0932944119,
-0.1863998473,
0.1857512742,
0.3903912306,
0.1105757505,
0.3006480336,
-0.065598838,
0.0075085601,
-0.1033305898,
-0.0583555028,
-0.2353231758,
-0.0578202829,
0.0070611797,
0.1893651634,
0.5287055969,
0.020435771,
-0.2688874006,
-0.155916661,
0.1354003996,
-0.1534601599,
0.3921815157,
0.0703722537,
0.1048097834,
-0.30715698,
-0.1301629692,
-0.2754411399,
0.288240999,
0.1407091618,
-0.1476815194,
-0.0730015188,
-0.2850125134,
-0.155377835,
-0.0928239226,
0.3056328893,
0.5948290825,
-0.2702883482,
-0.1211092025,
-0.1699473262,
0.1433557421,
0.3089426458,
0.2673923969,
0.0071028918,
0.2469746321,
-0.1199263036,
0.1096141785,
0.2730518281,
0.0004283413,
0.0255030133,
-0.0254085846,
0.2584196925,
0.2935867608,
-0.043168921,
0.0379173234,
0.2038618624,
0.1188982874,
0.1142758876,
0.559484303,
0.1990443617,
-0.0339214616,
-0.3283989429,
0.0008387864,
0.5573418736,
0.3541517854,
0.1461986601,
-0.2276218981,
0.0513214208,
0.3036474884,
0.1359985471,
-0.0113094114,
0.0927649438,
0.3572766781,
-0.3393719792,
0.2750452459,
0.2403737605,
0.0269903839,
-0.5525111556,
0.236906305,
0.0318739861,
0.1359110326,
-0.0013063252,
0.1318640858,
-0.2870848179,
-0.1605426222,
-0.2156868875,
0.2611428499,
0.0749469697,
0.170305416,
-0.1136760265,
0.4636846185,
0.0636296719,
-0.1318300664,
0.0043711811,
-0.1513061076,
-0.5514067411,
0.412466675,
0.0465183482,
-0.066319488,
0.0748854131,
0.1963202953,
0.2826355994,
-0.1605143845,
-0.0965555757,
-0.2543490529,
0.3377592564,
0.2528760731,
0.2167096436,
-0.07920596,
0.2825823426,
-0.4112242758,
0.2441623956,
0.2295996845,
0.1425693184,
-0.2165513039,
-0.0493409075,
0.2202328742,
-0.1231866777,
0.4874269366,
0.0636299029,
0.0318967178,
0.3125786483,
0.0732511058,
-0.1109993905,
-0.0760104358,
0.5720252395,
0.1005096063,
0.0688268021,
0.2869987488,
-0.2082105577,
-0.1159857363,
0.5724143982,
0.0600976199,
0.0017769951,
0.2686590254,
-0.1495233476,
-0.0875655785,
-0.0348373689,
-0.2696870565,
-0.2388120294,
0.1612473726,
0.0583881848,
-0.433024615,
0.0328254066,
-0.1703620851,
-0.0219371468,
0.1148252934,
0.2563504577,
0.128282398,
0.1700766981,
-0.0760188103,
-0.2888401747,
-0.0338466018,
0.0780998245,
0.1424240619,
-0.1322092116,
0.0680788383,
-0.3862012625,
-0.3865298033,
-0.2589558661,
-0.2271350622,
-0.4857495129,
-0.4379449785,
-0.1076671854,
-0.023560442,
0.1001372933,
0.0911558419,
-0.0419986919,
-0.0005247146,
-0.228249073,
-0.1849109977,
-0.3860183954,
-0.4345355034,
-0.3712605536,
-0.0285548177,
0.5293227434,
0.1552811563,
0.1042221859,
-0.040895097,
-0.1794217825,
-0.1444324106,
-0.3140220344,
-0.2439804077,
-0.3181400299,
0.1540390104,
-0.3206979632,
0.4155473411,
-0.1617508233,
-0.0525789373,
0.350143075,
0.0088330656,
0.0303260982,
0.2139814198,
0.2482555062,
0.1368110925,
0.0138353705,
-0.4436650872,
-0.247626707,
-0.2580003738,
-0.0753561556,
0.1386438161,
-0.0286487155,
0.0223310664,
-0.1088261008,
0.0438172594,
0.1129067689,
0.0133074708,
-0.0412305407,
0.3525741696,
0.5122916102,
-0.0197110046,
-0.3502420187,
0.1122013032,
0.1237488836,
0.0851644576,
0.1428358555,
-0.5253460407,
0.2483902574,
-0.2274439335,
0.0896891505,
0.1307523996,
-0.1152568236,
-0.0421737283,
-0.1626755297,
0.1069582775,
0.0575446784,
-0.1961790621,
-0.2151119709,
-0.0228670761,
0.1928331852,
0.0768846571,
0.3748748004,
-0.3542457521,
0.6905221939,
0.1530367881,
0.1087187678,
0.2486216575,
0.2776161134,
-0.0156387817,
-0.1505483985,
-0.2539336681,
0.0796208456,
-0.0529615134,
-0.0789798275,
0.3414696753,
0.0570948087,
-0.4630897939,
-0.1659130156,
0.1838310659,
-0.3000290394,
-0.244790107,
0.1687376052,
-0.2768782675,
0.1899993718,
-0.0978068262,
-0.0024239123,
0.0266597643,
-0.4019005299,
-0.0604732521,
0.2074945569,
0.4596831203,
0.1586942971,
-0.3370334506,
-0.4819996357,
-0.2602394223,
0.2074867189,
-0.0078572817,
0.5056963563,
-0.2882928848,
0.0243108571,
0.1711665392,
0.1254116297,
0.1907418966,
-0.2676234245,
-0.1190827265,
0.4583577514,
0.0570354611,
-0.6286125779,
-0.0129508749,
-0.1313426197,
0.1177797392,
0.3597905934,
0.1823291332,
-0.4848778248,
-0.262434572,
-0.1122145206,
0.623413384,
-0.180888176,
-0.0617819726,
-0.4719060957,
-0.1900830269,
-0.3897497356,
-0.1754595786,
-0.1965723932,
0.1376360357,
-0.2524030805,
-0.0068888888,
-0.0342989415,
0.2318182439,
0.126101315,
0.1136710495,
0.1185021996,
0.2835501432,
-0.0647994876,
0.1833140254,
0.1066989899,
-0.1316338331,
0.3084094524,
0.0355663449,
-0.1813282967,
-0.0142123094,
-0.1458049417,
0.2586129308,
0.0000790618,
0.1404331774,
0.0707284585,
0.077915296,
-0.161607787,
-0.075961709,
0.0272245593,
0.2273465544,
-0.0392967835,
-0.3217106462,
-0.3995398581,
0.6012830138,
0.0243771896,
-0.077306971,
0.2754059136,
-0.0577726327,
-0.3797739148,
0.476683706,
0.0389676057,
0.8272894025,
-0.4226619303,
0.0835662186,
-0.144453451,
0.1735060215,
0.4089494348,
0.0086707398,
0.0363028459,
-0.2705718279,
0.3124825358,
-0.0021860227,
0.0716993511,
0.0022460958,
0.1210936606,
-0.1676384956,
0.2851982117,
-0.1164298281,
-0.0431562327,
0.0004540272,
0.5521715879,
0.1862846315,
-0.3387275934,
-0.3292573094,
0.0565183423,
-0.190905422,
0.3705631196,
-0.0544363223,
0.1684118509,
-0.0676844567,
0.037142165,
-0.2191144824,
0.138349995,
-0.073887676,
0.2030121982,
-0.2177363783,
-0.2613620758,
0.347024858,
0.4555531442,
-0.0610573329,
0.2965241671,
-0.1833681315,
0.1614168137,
-0.1743263453,
-0.2580778897,
-0.0258770715,
0.0392287485,
0.0128865596,
0.0260215029,
-0.3502266407,
0.0602126382,
-0.0239022598,
0.0278151631,
-0.4295762181,
0.1425747722,
0.0463308617,
0.1478118151,
-0.2561361492,
-0.0156789608,
-0.1960350573,
0.0551939532,
0.1045235246,
0.0453432947,
0.2451972514,
0.1529278755,
0.4280246496,
-0.2785581052,
-0.093260549,
0.2778621018,
0.0488080755,
0.0328174978,
0.7834225893,
0.37939623,
-0.3779435754,
-0.2083092034,
-0.3031992018,
-0.4560177326,
-0.3899917305,
-0.2934160829,
0.1242029667,
0.3839826584,
-0.2075719535,
-0.1481449455,
-0.0268959757,
-0.4575161636,
-0.0013405755,
-0.6665330529,
0.0549345464,
0.3396061063,
0.049638778,
-0.002272197,
-0.0495140776,
-0.4748335779,
0.1891739815,
0.0927153155,
-0.1784785837,
-0.0465925038,
-0.2098364234,
0.1940190792,
0.2902837396,
-0.0305014662,
-0.1945331991,
-0.124561958,
0.0715635419,
-0.2864049971,
0.1808193773,
-0.1599362642,
0.0942241848,
0.150822714,
0.1461358368,
-0.1049812287,
-0.1656459123,
-0.4103102088,
0.4030149579,
-0.3441687524,
-0.1329702139,
-0.1333862841,
-0.0661589205,
0.2610637844,
-0.0318381265,
0.230648756,
-0.0626975447,
0.1877811253,
-0.0556442216,
-0.0242140405,
0.273647368,
0.0275774375,
0.6309031248,
0.1270672828,
-0.4748727083,
-0.067262508,
-0.0513815247,
0.1632097811,
0.2519102991,
-0.164630726,
0.383777529,
0.3636279106,
-0.0786025673,
0.5536407828,
0.0051681176,
-0.0831235945,
0.2831913233,
0.0924540833,
-0.1920318902,
0.102649942,
0.630894959,
0.1232616305,
-0.1636998057,
0.1788726747,
0.1540611982,
-0.2622278333,
0.006137792,
-0.0735290349,
0.1782837212,
-0.0653235614,
0.1384172291,
-0.0279025361,
0.2722363472,
0.0861176327,
0.041889362,
0.1025840044,
-0.1573581845,
0.0006600512,
0.1196803004,
0.1837934852,
-0.0361657701,
0.1555348039,
-0.2045337111,
-0.3261487782,
0.0495487414,
0.4282368124,
0.0327401236,
0.2258840799,
-0.0281018838,
0.1899500191,
0.164109692,
-0.2246440351,
-0.1283998936,
0.2254067361,
0.038607765,
-0.2214491069,
-0.1599896848,
-0.0641021878,
-0.1283914596,
-0.0061339736,
-0.020288568,
-0.2115823925,
0.018539222,
0.4602552056,
-0.1359459907,
-0.5251308084,
0.5226481557,
-0.2327723801,
0.2248285711,
-0.1313632876,
0.1586135328,
-0.0202143453,
-0.0204818696,
-0.0203273334,
0.1130883619,
0.0961167589,
0.3027555943,
-0.4491689801,
0.0732994378,
-0.0639272481,
0.1251826435,
-0.0428361148,
0.501122117,
0.0970259458,
0.2795987725,
0.248376593,
0.1094288081,
-0.0767134503,
-0.2166142464,
0.2603474855,
-0.0639568269,
-0.1150570065,
0.0645177513,
-0.1318162978,
0.021289207,
-0.3001454771,
-0.0776858926,
-0.3329939246,
0.1958841085,
0.2172614485,
0.1616083682,
0.0328016467,
-0.0893491283,
0.0220450908,
-0.1579571515,
0.682708025,
0.5152643323,
0.3482386768,
-0.2263832539,
-0.1950801015,
-0.5486900806,
-0.0545452461,
-0.3532509506,
-0.0403816961,
0.2484323382,
0.2327830791,
0.106583178,
0.0634973049,
0.0282981284,
-0.1706915945,
-0.1625121832,
0.2500011027,
-0.3407207727,
-0.1809643358,
-0.1340543032,
0.0768587515,
-0.3858618438,
-0.1097127497,
0.1934159994,
-0.258818239,
-0.0511221886,
-0.0512897521,
-0.0344719142,
0.0920767561,
0.2395561188,
0.2278838754,
0.032222636,
0.6150702834,
0.0347037688,
0.0201278999,
-0.2817746997,
-0.2000229061,
0.1077024415,
0.3866239786,
-0.2164959908,
0.2973263264,
-0.1123958975,
0.1421589404,
-0.2420966923,
0.3953527808,
-0.1706998199,
0.0467420816,
0.030388996,
-0.2690484822,
-0.0076744631,
-0.1595904827,
0.0385249443,
0.2603837252,
-0.0615826771,
0.0413656905,
-0.3012877703,
-0.1222680062,
0.1400860548,
-0.353915453,
-0.4278775156,
0.0277094916,
0.1384716034,
0.192981407,
0.2279360443,
-0.3475381732,
0.03053312,
0.4040949345,
-0.0450158119,
0.0737328306,
0.1564312279,
0.0955428481,
-0.2031617463,
-0.1827108562,
-0.1231420934,
0.0558166914,
-0.0470122471,
-0.1533837467,
-0.1262532622
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Ok, thanks for clarifying, that makes sense. I will time those examples later today and post back here.
Also, it seems that not all dumps should use the same date. For instance, I was checking the Spanish dump doing the following:
```
data = nlp.load_dataset('wikipedia', '20200501.es', beam_runner='DirectRunner', split='train')
```
I got the error below because this URL does not exist: https://dumps.wikimedia.org/eswiki/20200501/dumpstatus.json. So I checked the actual available dates here https://dumps.wikimedia.org/eswiki/ and there is no 20200501. If one tries a date that is available at that link, the nlp library does not allow such a request because it is not in the list of expected datasets.
```
Downloading and preparing dataset wikipedia/20200501.es (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.es/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/load.py", line 548, in load_dataset
builder_instance.download_and_prepare(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 462, in download_and_prepare
self._download_and_prepare(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 965, in _download_and_prepare
super(BeamBasedBuilder, self)._download_and_prepare(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 422, in _split_generators
downloaded_files = dl_manager.download_and_extract({"info": info_url})
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/download_manager.py", line 155, in download
downloaded_path_or_paths = map_nested(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/py_utils.py", line 163, in map_nested
return {
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/py_utils.py", line 164, in <dictcomp>
k: map_nested(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/py_utils.py", line 191, in map_nested
return function(data_struct)
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/download_manager.py", line 156, in <lambda>
lambda url: cached_path(url, download_config=self._download_config,), url_or_urls,
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 191, in cached_path
output_path = get_from_cache(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 356, in get_from_cache
raise ConnectionError("Couldn't reach {}".format(url))
ConnectionError: Couldn't reach https://dumps.wikimedia.org/eswiki/20200501/dumpstatus.json
``` | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 252 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Ok, thanks for clarifying, that makes sense. I will time those examples later today and post back here.
Also, it seems that not all dumps should use the same date. For instance, I was checking the Spanish dump doing the following:
```
data = nlp.load_dataset('wikipedia', '20200501.es', beam_runner='DirectRunner', split='train')
```
I got the error below because this URL does not exist: https://dumps.wikimedia.org/eswiki/20200501/dumpstatus.json. So I checked the actual available dates here https://dumps.wikimedia.org/eswiki/ and there is no 20200501. If one tries a date that is available at that link, the nlp library does not allow such a request because it is not in the list of expected datasets.
```
Downloading and preparing dataset wikipedia/20200501.es (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.es/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/load.py", line 548, in load_dataset
builder_instance.download_and_prepare(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 462, in download_and_prepare
self._download_and_prepare(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 965, in _download_and_prepare
super(BeamBasedBuilder, self)._download_and_prepare(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 422, in _split_generators
downloaded_files = dl_manager.download_and_extract({"info": info_url})
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/download_manager.py", line 155, in download
downloaded_path_or_paths = map_nested(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/py_utils.py", line 163, in map_nested
return {
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/py_utils.py", line 164, in <dictcomp>
k: map_nested(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/py_utils.py", line 191, in map_nested
return function(data_struct)
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/download_manager.py", line 156, in <lambda>
lambda url: cached_path(url, download_config=self._download_config,), url_or_urls,
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 191, in cached_path
output_path = get_from_cache(
File "/home/gaguilar/.conda/envs/pytorch/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 356, in get_from_cache
raise ConnectionError("Couldn't reach {}".format(url))
ConnectionError: Couldn't reach https://dumps.wikimedia.org/eswiki/20200501/dumpstatus.json
``` | [
0.3074045777,
-0.1131630167,
-0.1143274307,
0.2666958869,
0.0679332167,
0.2159504443,
0.1886424869,
0.1882865131,
0.602712512,
-0.2003640383,
0.0329799801,
0.0883670822,
0.1432430446,
-0.2862558067,
0.037924841,
-0.1857150793,
0.0711626261,
-0.052669663,
0.0999961644,
-0.3232584596,
-0.3721323013,
0.2316432595,
-0.3498652875,
0.0395477638,
-0.294998318,
0.3205526769,
-0.1095637232,
-0.1376002431,
0.0048025064,
-0.3297348619,
0.1528248042,
0.2533640563,
0.3197316825,
0.2253635973,
-0.0001195317,
-0.1390187442,
0.7226003408,
-0.1402970552,
-0.5811547041,
-0.1576964855,
-0.3304522634,
-0.4744783044,
0.1427855939,
-0.2372420281,
0.1085568219,
-0.1610046476,
0.2901948988,
-0.3729116917,
0.2277099788,
0.1040741205,
0.172093451,
-0.0161262006,
0.053266041,
0.0327016115,
0.3107928634,
0.1675488353,
0.1112301052,
0.0960420966,
0.0196499974,
-0.2150571644,
0.1037459821,
0.0984233618,
-0.1336672753,
-0.1492943764,
0.1253487468,
-0.047249306,
-0.1610985845,
-0.6163538098,
0.1744620055,
0.2040955424,
0.6709716916,
0.0023053875,
-0.4189355969,
-0.1440743804,
-0.0224520601,
0.0209413916,
0.1702198386,
0.3295254707,
-0.0454611965,
0.0398624949,
-0.002712246,
-0.2038346827,
-0.0404359549,
0.7206987143,
-0.0264513865,
0.4018862844,
0.1202321351,
0.1144706085,
-0.0478506573,
-0.1456206441,
-0.1257711798,
-0.3335694373,
0.4209320247,
0.4332467914,
-0.1695612073,
0.2906306088,
-0.0857508332,
0.2263406962,
-0.0529664457,
-0.1909825951,
-0.1185326874,
0.2109176517,
-0.0961130112,
0.1913408041,
0.3146862388,
-0.1448125988,
0.2309996188,
-0.0167628713,
0.3436276019,
0.0785778314,
-0.1617885232,
-0.0236773491,
-0.2893207073,
-0.2270642966,
-0.2285065353,
-0.082909897,
0.0350419134,
-0.2333208919,
0.2474828064,
0.109667778,
-0.2384319156,
-0.2751594782,
-0.1687631607,
0.3228849173,
-0.1710447371,
0.2898727953,
0.2068962306,
0.3334831595,
-0.439333111,
-0.0987039581,
0.0050609894,
-0.0328751579,
-0.2345453799,
0.2383076102,
0.2733900249,
0.1644215137,
0.3222700357,
0.063974753,
-0.2206519693,
-0.1693531275,
-0.0159265399,
-0.1536653787,
-0.079220973,
0.0881712958,
0.2245720923,
0.5646495819,
0.0242779441,
-0.2164817154,
-0.183021605,
0.1098444611,
-0.1811425388,
0.2454308569,
-0.0224610716,
0.0989767238,
-0.3706939816,
-0.0535049736,
-0.2598142624,
0.1856043935,
0.1031377912,
-0.1530772448,
-0.0132440813,
-0.2585119009,
-0.1633185446,
-0.015355384,
0.2881459892,
0.5672476292,
-0.1877698898,
-0.1831030995,
-0.2109178603,
0.156040743,
0.3122497499,
0.2041436732,
-0.1045733914,
0.326251328,
-0.138469249,
0.0980196595,
0.3381269574,
-0.0435736813,
0.0776510015,
0.0450998433,
0.2526732683,
0.2135819793,
-0.0926934332,
-0.0261052717,
0.1963849962,
0.0178854875,
0.1415541768,
0.4359197021,
0.3257624209,
-0.0453437269,
-0.3747332692,
-0.0472108498,
0.3917995393,
0.2488120347,
0.1009586006,
-0.1692658216,
0.1102250144,
0.5450978875,
0.1534945965,
0.0131800864,
0.1428937316,
0.4438538849,
-0.3199231625,
0.2711729407,
0.2049236149,
0.0090838149,
-0.528193295,
0.2005948573,
0.0112967119,
0.1782013774,
0.0175965689,
0.0525914729,
-0.3461594582,
-0.1132776886,
-0.2011069357,
0.2239729762,
0.0632471144,
0.2544427216,
-0.0102853626,
0.5279009938,
0.0372150242,
-0.3473694324,
-0.0004898161,
-0.10191416,
-0.5910497904,
0.4307764769,
-0.0491723046,
-0.0058679022,
0.0674811602,
0.2370138913,
0.3196125031,
-0.1044643074,
-0.040217977,
-0.2232317328,
0.3271291554,
0.3333980739,
0.1873662919,
-0.0968204141,
0.308154881,
-0.3724740446,
0.1896897256,
0.3602354825,
0.1580127925,
-0.2589325905,
0.0363305435,
0.3114947081,
-0.0588048212,
0.3653421998,
0.0718836263,
0.0265556499,
0.3908677399,
-0.0897051468,
-0.1538591087,
-0.2190162241,
0.4759977758,
0.1982604563,
0.0277490392,
0.385851413,
-0.1408334821,
-0.1522123963,
0.5553581119,
-0.0403649546,
-0.0501563251,
0.2276263535,
-0.1491428614,
-0.1826261133,
0.0057154074,
-0.2520776987,
-0.1125380173,
0.1228654906,
0.0823424011,
-0.3532346189,
-0.0523492359,
-0.2490395755,
0.0300014541,
0.0911751166,
0.0588248819,
0.1508001387,
0.117238231,
-0.0549026951,
-0.3138917685,
0.0181932449,
0.0604087561,
0.1874243766,
-0.1832009256,
0.0711917356,
-0.4476851225,
-0.3566981256,
-0.2708429694,
-0.3444027901,
-0.3639487922,
-0.5259603262,
-0.0327019244,
0.0293998122,
-0.062146686,
0.2045756876,
-0.1936120987,
0.0379360765,
-0.2525601387,
-0.0846444815,
-0.4149160385,
-0.4787172377,
-0.3776291311,
-0.072094202,
0.5312838554,
0.2609657943,
0.0445943028,
-0.1121906042,
-0.2606807947,
-0.2162500471,
-0.3026232123,
-0.127606377,
-0.1894435436,
0.1767022014,
-0.2689675093,
0.4213788509,
-0.0727720261,
-0.2691786885,
0.3827325404,
-0.0300242864,
0.0335946977,
0.1892680973,
0.161152184,
0.1392670274,
-0.0332659855,
-0.4139074683,
-0.3171476424,
-0.2624529004,
-0.180449903,
0.0568667129,
-0.0098886155,
0.1248530746,
-0.0625322387,
0.0444016196,
-0.0467397794,
-0.0289731361,
-0.1634235829,
0.2337405384,
0.5412667394,
-0.1276023686,
-0.4130741954,
0.1248717159,
-0.102524519,
0.0197152514,
0.021359317,
-0.4710839093,
0.1845600754,
-0.1556140184,
-0.0356669165,
0.2318305969,
-0.0825890899,
-0.0668094456,
-0.1799344867,
0.0941501036,
-0.0223425031,
-0.1061194465,
-0.1568671465,
-0.0057120482,
0.2745942175,
0.1639137268,
0.3346800208,
-0.3372350931,
0.7470002174,
0.2176746577,
0.0936313123,
0.3313852251,
0.2281757742,
0.1476374567,
-0.1205984205,
-0.3009839058,
0.0003936961,
0.0358179286,
-0.1016243249,
0.4037689269,
0.119124487,
-0.4371236861,
-0.1979615092,
0.1191070452,
-0.3471536636,
-0.3639637828,
0.1728972346,
-0.0917730704,
0.1023439616,
-0.0137912109,
0.0246037468,
-0.0789723694,
-0.3898309767,
-0.0925444812,
0.2616246939,
0.3869503438,
0.1271240711,
-0.5072199106,
-0.3176080287,
-0.369679451,
0.1878493726,
-0.0156761818,
0.5833176374,
-0.2090269327,
0.0465388,
0.1606327295,
0.303692311,
0.2516658902,
-0.3141663373,
-0.0480246991,
0.3816940784,
-0.0033970475,
-0.6274210811,
0.0392708331,
-0.1354999989,
0.1793412268,
0.4498899877,
0.2873637676,
-0.4655568004,
-0.2426207662,
-0.0170389451,
0.5938121676,
-0.1884110868,
-0.1671383232,
-0.3821139038,
-0.2638598084,
-0.3691655695,
-0.3094538748,
-0.063961938,
0.1736784577,
-0.179874301,
-0.0858215392,
-0.0537276864,
0.219853133,
0.0736371577,
0.0636088178,
0.1167482287,
0.2804916203,
-0.0188218541,
0.2255708724,
-0.0031967433,
0.0051292386,
0.3203226328,
-0.095573768,
-0.3046877682,
-0.018382119,
-0.1032421738,
0.2520045042,
0.12878564,
0.222788617,
0.0742223188,
0.126416117,
-0.1507890821,
-0.068966791,
0.0607988983,
0.213290751,
-0.0428385548,
-0.3614949882,
-0.4229006171,
0.5458522439,
0.0162439272,
0.0175946951,
0.3845963478,
-0.0831713006,
-0.401217401,
0.4881592095,
-0.0436767004,
1.0038048029,
-0.3012089729,
0.1526128203,
-0.0668858215,
0.3107735515,
0.4644117951,
0.1293581426,
-0.0028780978,
-0.2208915949,
0.2979341447,
-0.0102669597,
0.1015672982,
0.0109867854,
0.1999207437,
-0.1830207407,
0.3646386266,
-0.0610993132,
0.1480945647,
0.0709360465,
0.5119071007,
0.2190015167,
-0.2706772685,
-0.4574081898,
0.0575844646,
-0.1003013253,
0.4405318201,
-0.0420639366,
0.0510945432,
0.001219742,
-0.0505814664,
-0.2098078728,
0.1146255881,
-0.0174276344,
0.2365612239,
-0.1562308371,
-0.1477051973,
0.0835237652,
0.5369325876,
-0.0238238536,
0.2082616836,
-0.1498304605,
0.1094389483,
-0.2827110589,
-0.2716313601,
-0.1630763412,
0.1041026041,
0.0436499044,
0.0404535234,
-0.4021266699,
0.1497724503,
-0.0398561284,
-0.0782060176,
-0.2885353267,
0.1645686626,
-0.1855713725,
0.1062676609,
-0.4349034131,
-0.0446735099,
-0.1654358357,
-0.0457041785,
0.1009832025,
0.0029673502,
0.2235195339,
0.100619927,
0.3073588014,
-0.2967174053,
-0.1264646202,
0.307308197,
-0.1297167838,
0.0496775359,
0.7562810779,
0.2978057265,
-0.3327909112,
-0.1673115939,
-0.1610653549,
-0.456104815,
-0.4126521647,
-0.2062852979,
0.1005488411,
0.3791308403,
-0.2177531272,
0.0214582868,
0.0231132451,
-0.3728578687,
-0.006750688,
-0.671399653,
0.113052994,
0.3629443944,
0.1078157723,
0.0478670672,
-0.0269310027,
-0.3612903357,
0.1439277381,
0.0542996898,
-0.1796965152,
-0.0027685836,
-0.2716398835,
0.2428218424,
0.1177799106,
-0.0710613281,
-0.1670646369,
-0.178117007,
0.0492536612,
-0.2977860868,
0.1928613633,
-0.1315276772,
-0.0634017736,
0.1811519265,
0.2159472108,
-0.1296366453,
-0.2046409696,
-0.376655221,
0.4240160882,
-0.3951970935,
-0.2933063805,
0.1064595506,
-0.0215578936,
0.3240564466,
0.2239301205,
0.1515512317,
-0.0383065864,
0.1920630336,
0.0011909083,
0.1995297372,
0.3148196638,
0.079180494,
0.5858975053,
0.1241863891,
-0.3300995529,
-0.0889592022,
-0.1079259515,
0.0873938203,
0.2444919497,
-0.0481133163,
0.3211019635,
0.4410488009,
-0.037698552,
0.5469149351,
-0.0492978059,
-0.0506540723,
0.3139182627,
0.0774446279,
-0.1749256551,
0.0777032748,
0.6115117669,
0.0843544006,
-0.1762634069,
0.149962008,
0.1087768823,
-0.3841847777,
0.1678220034,
0.0111341625,
0.1982860118,
-0.0412450098,
0.2548068464,
0.0230466388,
0.248361364,
0.1717457473,
-0.0276453644,
0.0706361681,
-0.0424116924,
-0.1481896341,
0.0432387888,
0.1219863221,
-0.0025727665,
0.2388393879,
-0.2227487266,
-0.345796138,
0.0852579325,
0.4066887498,
-0.0035727136,
0.2401837558,
-0.0093102194,
0.1591867507,
0.1050686538,
0.0565775447,
-0.2103705555,
0.3899442852,
0.0976409167,
-0.2719574869,
-0.2775014639,
-0.0745849013,
-0.1095885262,
-0.0321727991,
0.0246662237,
-0.297252059,
0.0244531408,
0.4213553667,
-0.1656298935,
-0.5302978754,
0.6537275314,
-0.2060686797,
0.1800954044,
-0.1433716118,
0.1685185581,
-0.0142207295,
-0.0609259754,
0.1010970026,
0.0816316307,
0.1371220648,
0.3256596923,
-0.3377506733,
0.0915210396,
-0.0351752862,
0.0738730654,
-0.0911914706,
0.5608192086,
0.1278091818,
0.1370952427,
0.245810315,
0.0882412642,
-0.0985301286,
-0.2725240588,
0.2466864437,
0.0393857919,
-0.109316051,
0.183976084,
-0.1468200833,
0.0436569899,
-0.1639621854,
-0.0617335364,
-0.3652876616,
0.1352898628,
0.1807591766,
0.0783858746,
0.0874841809,
-0.1390053928,
0.0064076036,
-0.2053586841,
0.5153890848,
0.5274786949,
0.1966548115,
-0.2682804465,
-0.207608968,
-0.6327527761,
-0.0379976481,
-0.4089770913,
-0.0769092888,
0.2512984872,
0.1556713134,
0.0449995771,
0.1244790107,
0.0400829501,
-0.1354501098,
-0.1715446413,
0.2276878953,
-0.4122971296,
-0.1674696058,
-0.0974032432,
0.2109037489,
-0.3606423736,
-0.1823796928,
0.2387323976,
-0.2613448501,
-0.087722674,
-0.1229698211,
-0.0824446306,
0.0811956301,
0.33957389,
0.1680949181,
0.1144554913,
0.5315932035,
-0.0349050313,
0.0782143474,
-0.3346695602,
-0.2836704254,
0.1601033211,
0.3635134399,
-0.2720754743,
0.1439709961,
-0.1673912406,
0.0388081037,
-0.2867979407,
0.4059893191,
-0.2051694989,
-0.0799488872,
0.0568463206,
-0.2158321738,
-0.0427997857,
-0.137801379,
0.0927252024,
0.2488567531,
-0.1494412571,
0.1475374699,
-0.1498862207,
-0.1672075689,
0.1603721976,
-0.4102848768,
-0.3715366125,
-0.0028673224,
0.0432935394,
0.0619983561,
0.2191031128,
-0.3153201342,
0.051587075,
0.4503410161,
-0.009927419,
0.1241251826,
0.1163678318,
0.0562407337,
-0.1464504302,
-0.1525178105,
0.0188840423,
0.1500610709,
-0.0950556695,
-0.0833322331,
-0.1693374366
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Thanks ! This will be very helpful.
About the date issue, I think it's possible to use another date with
```python
load_dataset("wikipedia", language="es", date="...", beam_runner="...")
```
However we've not processed wikipedia dumps for other dates than 20200501 (yet ?)
One more thing that is specific to 20200501.es: it was available once but the `mwparserfromhell` was not able to parse it for some reason, so we didn't manage to get a processed version of 20200501.es (see #321 ) | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 77 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Thanks ! This will be very helpful.
About the date issue, I think it's possible to use another date with
```python
load_dataset("wikipedia", language="es", date="...", beam_runner="...")
```
However we've not processed wikipedia dumps for other dates than 20200501 (yet ?)
One more thing that is specific to 20200501.es: it was available once but the `mwparserfromhell` was not able to parse it for some reason, so we didn't manage to get a processed version of 20200501.es (see #321 ) | [
0.1844168305,
-0.1612695456,
-0.1408634186,
0.3425939381,
0.0797081068,
0.2668135166,
0.1986879259,
0.1461934447,
0.7380710244,
-0.2608356178,
0.1171043888,
0.1079090461,
0.0482875109,
-0.1868226826,
0.0673660487,
-0.2168417424,
0.1130852848,
-0.1392920911,
0.0121249538,
-0.2986242175,
-0.3755387962,
0.2062978446,
-0.3456493616,
-0.0187359825,
-0.2716133595,
0.2526730597,
-0.0286886878,
-0.1416579783,
0.075829193,
-0.3684401512,
0.1234101132,
0.2239327878,
0.3555548489,
0.1723899841,
-0.0001185649,
-0.1505399644,
0.6605799198,
-0.1414552182,
-0.564855814,
-0.1011556834,
-0.1321351379,
-0.4575916827,
0.2321636379,
-0.1510668248,
0.0153259039,
-0.2116500437,
0.2019939423,
-0.4954568446,
0.1956824511,
0.1200144589,
0.1927446723,
-0.0933784768,
0.0861299336,
0.0984009877,
0.2837165594,
0.0862575769,
0.1070942432,
0.0916266218,
0.0906863511,
-0.2530975938,
0.0317127444,
0.1068281531,
-0.1099939272,
-0.1209024489,
0.1942080259,
-0.0663056597,
-0.0168339945,
-0.640378654,
0.1275741756,
0.1657273769,
0.7213551998,
0.0507112406,
-0.3950339556,
-0.2233385593,
-0.0087342896,
0.0409738049,
0.1678938568,
0.247513324,
-0.0853279307,
-0.0411405712,
0.0339019038,
-0.2376618832,
-0.0329023972,
0.6365787387,
-0.0692420304,
0.4427609146,
0.1174649298,
0.1288728416,
0.0202501453,
-0.2535963655,
-0.181210205,
-0.258761704,
0.418168366,
0.4655200839,
-0.1745604277,
0.3636752963,
-0.0123904087,
0.3851034343,
-0.0386430137,
-0.1539081782,
-0.2357908487,
0.2513464093,
-0.1333470494,
0.1615582556,
0.2684193552,
-0.1073283106,
0.2236351967,
-0.1620292664,
0.3467641473,
0.0117432512,
-0.1397833228,
0.0552119687,
-0.2117568105,
-0.2181309313,
-0.1720182598,
-0.0451151058,
-0.0281928629,
-0.2950693965,
0.257284224,
0.1664820164,
-0.2195026577,
-0.3274431825,
-0.17490381,
0.2692574859,
-0.1951354146,
0.2675599456,
0.1724327207,
0.2840559185,
-0.4226953685,
-0.1039366573,
0.0161603615,
-0.0995110869,
-0.2571169734,
0.2204844058,
0.3558717668,
0.148631081,
0.2602375448,
0.0523076952,
-0.0767714828,
-0.149164021,
-0.0647038594,
-0.1634992659,
-0.0249820612,
0.0091658896,
0.1766339093,
0.5606012344,
0.0988833755,
-0.2323978841,
-0.1752841175,
0.1039436236,
-0.0761017352,
0.3082157969,
0.0369331539,
0.1233542562,
-0.3571023643,
-0.1046689078,
-0.2986971438,
0.2197834849,
0.1202819794,
-0.2009952962,
-0.0416012816,
-0.3253329098,
-0.1707212776,
-0.0548554882,
0.311358422,
0.5726600885,
-0.2665931582,
-0.1806460917,
-0.1147185862,
0.108762905,
0.2076974213,
0.2036616951,
-0.052558884,
0.2579967082,
-0.1294811666,
-0.0633766502,
0.2858008146,
-0.0579140484,
0.1067652702,
-0.0260217451,
0.2796827853,
0.3171771467,
-0.1271193027,
0.0283080898,
0.1985155195,
0.0353314281,
0.083256036,
0.4870527685,
0.2065674365,
-0.0626081154,
-0.3746047616,
-0.0362063348,
0.5943416953,
0.3290399313,
0.1499204934,
-0.1518607289,
0.0305369422,
0.3674328327,
0.1763263941,
0.0274733789,
0.1374640167,
0.3568583131,
-0.355627656,
0.3041114211,
0.2620889843,
-0.0356628187,
-0.5659896135,
0.2383674681,
0.1369166672,
0.1958185434,
0.0540326238,
0.0960886329,
-0.2962385416,
-0.1357353926,
-0.2376847863,
0.2135418206,
0.0795718879,
0.1882346123,
-0.0467987806,
0.5260183215,
0.0267843753,
-0.0761937052,
0.0167777315,
-0.1406953782,
-0.5490185022,
0.3648962975,
-0.0192005225,
-0.0789619833,
0.0225066617,
0.218326807,
0.3292413354,
-0.1560717523,
-0.0868924409,
-0.2810144126,
0.3929999471,
0.2565650046,
0.1176102012,
-0.0645648688,
0.2742713392,
-0.3607207537,
0.231712386,
0.3237191737,
0.1243613586,
-0.250816524,
-0.0573162585,
0.3349548876,
-0.1451158226,
0.4635528922,
0.0357605964,
0.0688564181,
0.3251146674,
0.0063650683,
-0.1108128056,
-0.1346652955,
0.5287484527,
0.1029548943,
0.0637289509,
0.2845214009,
-0.1806773543,
-0.0947576687,
0.5067566037,
0.0490306765,
-0.0480577759,
0.2475452721,
-0.2007910013,
-0.167269066,
-0.0358606167,
-0.2484867871,
-0.1827426106,
0.1321704686,
0.0703619421,
-0.3738311827,
0.0458122641,
-0.1439268291,
-0.0153812915,
0.1464484632,
0.1807278544,
0.0824583322,
0.140347451,
-0.0009784158,
-0.2181271315,
0.0029939488,
0.0622043833,
0.1825275123,
-0.19794631,
0.068149358,
-0.3990386724,
-0.3739185631,
-0.2093367279,
-0.2234027386,
-0.4862129986,
-0.4510010481,
-0.1158515215,
0.0748049468,
0.0489551798,
0.1520547569,
-0.0781796426,
0.0358305871,
-0.2447800934,
-0.0914541259,
-0.4282479584,
-0.5143586993,
-0.4257919788,
-0.0254918467,
0.5400378108,
0.1605176479,
0.095829621,
-0.0363363251,
-0.2658711076,
-0.1976134479,
-0.2801331282,
-0.1823982596,
-0.2643187046,
0.2399840802,
-0.3431634903,
0.3949018717,
-0.0874267817,
-0.0930261388,
0.374070704,
-0.0331536494,
0.0601066053,
0.156441167,
0.185430944,
0.2013897896,
-0.0076233819,
-0.4389494061,
-0.3696663976,
-0.277002722,
-0.0789736211,
0.1606953591,
-0.0580803603,
0.0915826634,
-0.1584428847,
-0.0235234927,
0.0208789222,
0.030520916,
-0.1258709431,
0.2937546968,
0.4754167199,
-0.0671438351,
-0.4101142585,
0.0925853327,
0.0022498667,
0.0188387278,
0.0373717695,
-0.4818992317,
0.1105147749,
-0.1167069301,
0.0338353068,
0.1245821416,
-0.1246043742,
-0.0315152332,
-0.1618933976,
0.1186651736,
0.0365784019,
-0.1993729174,
-0.1834093332,
-0.0256383494,
0.1785930246,
0.0766230673,
0.3736388087,
-0.3757044077,
0.6833339334,
0.1919844151,
0.0517831221,
0.2668401301,
0.2044527233,
0.0208226256,
-0.1411594152,
-0.2687706947,
0.0289676301,
-0.0435254574,
-0.0692293942,
0.3616575003,
0.0561508313,
-0.4470469058,
-0.2149754912,
0.1601341069,
-0.3118797541,
-0.3062431514,
0.1863855869,
-0.2279082835,
0.1966606379,
-0.043939881,
0.0396872908,
-0.0785786733,
-0.4171131849,
-0.0730285794,
0.2312416732,
0.420150578,
0.1305896789,
-0.3580098748,
-0.3635905981,
-0.2295687795,
0.1475470066,
0.0181836002,
0.5453323126,
-0.3247437775,
-0.0259758048,
0.1364895552,
0.1869921088,
0.249187693,
-0.3277448714,
-0.0784447044,
0.5329348445,
0.0247938335,
-0.6945915818,
-0.0137460902,
-0.1425858289,
0.1327878535,
0.3927877843,
0.1861747652,
-0.4104099274,
-0.3094127178,
-0.0620852821,
0.6232077479,
-0.2286631316,
-0.121804595,
-0.3604113162,
-0.084664546,
-0.3967117071,
-0.2190546989,
-0.146990478,
0.1456073076,
-0.2334448099,
0.0663507953,
-0.0082140248,
0.2877654135,
0.1197932661,
0.1248497069,
0.1368209422,
0.3288365602,
0.0264685526,
0.186450243,
0.0177563466,
-0.073914215,
0.24310036,
-0.04463958,
-0.2214606106,
-0.021996513,
-0.173453927,
0.254101485,
0.0243127309,
0.1686673909,
0.0354030728,
0.0936081111,
-0.1219545007,
-0.1373638958,
0.0995344669,
0.2646069229,
-0.0427214839,
-0.3857755661,
-0.4617738724,
0.5700256824,
0.0440340489,
-0.0771907717,
0.3294092119,
-0.0840583742,
-0.4005020261,
0.5531383157,
-0.0370685831,
0.8670902252,
-0.3276721835,
0.1661260724,
-0.1091105789,
0.2213764489,
0.5047304034,
0.0582495928,
-0.0689958036,
-0.2617020309,
0.3780923486,
-0.0033926219,
0.09510196,
-0.0139445756,
0.1382042766,
-0.198690027,
0.3340374827,
-0.1274992824,
0.0509023443,
0.0275453664,
0.5285961628,
0.2370519489,
-0.3180037141,
-0.3905286491,
0.0784166679,
-0.1627969295,
0.3956041634,
-0.0741905868,
0.1031674221,
-0.0298173651,
0.0155841708,
-0.2829708457,
0.1142403185,
-0.0419710055,
0.2151594311,
-0.1797878593,
-0.2497597486,
0.2762182951,
0.4374954104,
0.0755610839,
0.1677907705,
-0.166440025,
0.1357033849,
-0.2448335886,
-0.2289775908,
-0.0646034032,
0.0835582167,
0.110159412,
0.0287703946,
-0.3292124271,
0.0522799194,
-0.0567027628,
-0.0233200565,
-0.389590621,
0.1673529297,
-0.0041855536,
0.0608692877,
-0.3358549774,
0.0115339048,
-0.1193816587,
-0.0414843112,
0.1193094105,
0.0352869444,
0.225026086,
0.2422862947,
0.3342757225,
-0.2829221487,
-0.1138667539,
0.3032530546,
-0.0697611272,
-0.0558018908,
0.7682933211,
0.4745149612,
-0.3394963741,
-0.2241130769,
-0.2571645379,
-0.4388369918,
-0.4500106275,
-0.2344158739,
0.1606295705,
0.4009585381,
-0.1865049452,
0.0555910468,
-0.0034107363,
-0.4493925571,
-0.0002200603,
-0.6345475912,
0.034030363,
0.3447432816,
0.1035901308,
0.0054720296,
0.0012737811,
-0.3435914516,
0.1298698634,
0.0612001717,
-0.1972838342,
-0.062649034,
-0.1562691629,
0.2272825539,
0.3119395375,
-0.0378489122,
-0.1576281637,
-0.1320438087,
0.0860208198,
-0.2972207069,
0.1732494086,
-0.1805263758,
-0.0063105039,
0.1574809104,
0.173773393,
-0.1680447459,
-0.2384559959,
-0.3589778543,
0.4282386899,
-0.3507375121,
-0.1986349821,
-0.0201148614,
-0.0225043297,
0.3182731271,
0.0657491535,
0.2486480176,
-0.0504166223,
0.1881610453,
0.0012930632,
0.1054656506,
0.320145607,
0.0821958482,
0.5838449597,
0.1199806631,
-0.4141645432,
-0.0681127161,
-0.0772438794,
0.0852590352,
0.2501686811,
-0.1658859253,
0.3256095052,
0.396037221,
-0.0062781982,
0.530189991,
-0.0131568909,
-0.0744047761,
0.2766925693,
0.1095916554,
-0.1725286841,
0.0820232779,
0.7015470862,
0.1404873133,
-0.1471317858,
0.1354476213,
0.1613202542,
-0.2493498027,
0.1213670969,
-0.0278706513,
0.2471283823,
-0.0990203172,
0.204326719,
-0.0573811829,
0.2665819526,
0.1177335903,
-0.0330000743,
0.1238399744,
-0.1190440357,
-0.0065642986,
0.1209404916,
0.106617786,
-0.0044656442,
0.1893345118,
-0.1750041544,
-0.3500472903,
0.1187175512,
0.3595322371,
0.0404185802,
0.2055851072,
-0.024321001,
0.255450815,
0.1185596213,
-0.1806192994,
-0.1621672213,
0.2564995885,
0.0781321824,
-0.2098519504,
-0.1287283599,
-0.0651434213,
-0.1099865586,
-0.0071925819,
-0.0591129884,
-0.2668319941,
0.0281606205,
0.4564830661,
-0.0903666764,
-0.5504860878,
0.4089470804,
-0.1690148413,
0.2195013463,
-0.0892806128,
0.1430747658,
0.0180570036,
-0.0755751282,
-0.0185935907,
0.0626497716,
0.1614682674,
0.2659322321,
-0.4696864486,
0.1626975983,
-0.0513631329,
0.0850126147,
-0.0890452564,
0.4808033705,
0.1036760211,
0.2970784903,
0.2520150244,
0.1200083643,
-0.0791588426,
-0.2093759477,
0.2073231637,
-0.0279636402,
-0.0967572927,
0.0871883854,
-0.0559911802,
0.0332676172,
-0.2572045922,
-0.0831968859,
-0.3097165227,
0.1509769261,
0.2160123438,
0.2096534669,
0.0658724606,
-0.0658791885,
0.0165351406,
-0.1457692385,
0.5989710093,
0.5052399039,
0.3864733577,
-0.2828413248,
-0.2142060399,
-0.6079565883,
-0.0683493018,
-0.330047369,
-0.062098898,
0.2401631773,
0.151576072,
0.1486020386,
0.0755125582,
0.0893741101,
-0.1815300584,
-0.127714783,
0.2712644041,
-0.3947214484,
-0.1702538878,
-0.1462320834,
0.1659578532,
-0.3174643815,
-0.0818442628,
0.1850226969,
-0.2791993916,
-0.0646480396,
-0.139565289,
-0.1430627853,
0.0477134138,
0.2876758575,
0.2277488261,
0.0640403256,
0.6330650449,
0.0302949697,
0.0548834503,
-0.3189074993,
-0.2455743253,
0.1151758581,
0.3448651731,
-0.3442027271,
0.2105507106,
-0.1774174571,
0.0714092255,
-0.2868708968,
0.369499445,
-0.1817530841,
-0.0142913163,
-0.0039676614,
-0.2246591449,
0.0378590301,
-0.1431711018,
0.0891863853,
0.2404399365,
-0.1001726687,
0.109026596,
-0.2864099443,
-0.1072930247,
0.1480803192,
-0.3834711015,
-0.4022498131,
-0.0258692801,
0.0717195198,
0.0885523632,
0.180499509,
-0.3865855932,
0.0128928274,
0.4387576878,
-0.0302377902,
0.105872795,
0.1398378015,
0.1077248827,
-0.1861807704,
-0.168511942,
-0.1288341433,
0.091246143,
-0.066511862,
-0.1031779051,
-0.1249066591
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Cool! Thanks for the trick regarding different dates!
I checked the download/processing time for retrieving the Arabic Wikipedia dump, and it took about 3.2 hours. I think that this may be a bit impractical when it comes to working with multiple languages (although I understand that storing those datasets in your Google storage may not be very appealing either).
For the record, here's what I did:
```python
import nlp
import time
def timeit(filename):
elapsed = time.time()
data = nlp.load_dataset('wikipedia', filename, beam_runner='DirectRunner', split='train')
elapsed = time.time() - elapsed
print(f"Loading the '{filename}' data took {elapsed:,.1f} seconds...")
return data
data = timeit('20200501.ar')
```
Here's the output:
```
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 13.0k/13.0k [00:00<00:00, 8.34MB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 28.7k/28.7k [00:00<00:00, 954kB/s]
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguil20/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 47.4k/47.4k [00:00<00:00, 1.40MB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 79.8M/79.8M [00:15<00:00, 5.13MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 171M/171M [00:33<00:00, 5.13MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 103M/103M [00:20<00:00, 5.14MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 227M/227M [00:44<00:00, 5.06MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 140M/140M [00:28<00:00, 4.96MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 160M/160M [00:30<00:00, 5.20MB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 97.5M/97.5M [00:19<00:00, 5.06MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 222M/222M [00:42<00:00, 5.21MB/s]
100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [03:16<00:00, 196.39s/sources]
Dataset wikipedia downloaded and prepared to /home/gaguil20/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50. Subsequent calls will reuse this data.
Loading the '20200501.ar' data took 11,582.7 seconds...
```` | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 202 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Cool! Thanks for the trick regarding different dates!
I checked the download/processing time for retrieving the Arabic Wikipedia dump, and it took about 3.2 hours. I think that this may be a bit impractical when it comes to working with multiple languages (although I understand that storing those datasets in your Google storage may not be very appealing either).
For the record, here's what I did:
```python
import nlp
import time
def timeit(filename):
elapsed = time.time()
data = nlp.load_dataset('wikipedia', filename, beam_runner='DirectRunner', split='train')
elapsed = time.time() - elapsed
print(f"Loading the '{filename}' data took {elapsed:,.1f} seconds...")
return data
data = timeit('20200501.ar')
```
Here's the output:
```
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 13.0k/13.0k [00:00<00:00, 8.34MB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 28.7k/28.7k [00:00<00:00, 954kB/s]
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguil20/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 47.4k/47.4k [00:00<00:00, 1.40MB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 79.8M/79.8M [00:15<00:00, 5.13MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 171M/171M [00:33<00:00, 5.13MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 103M/103M [00:20<00:00, 5.14MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 227M/227M [00:44<00:00, 5.06MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 140M/140M [00:28<00:00, 4.96MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 160M/160M [00:30<00:00, 5.20MB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 97.5M/97.5M [00:19<00:00, 5.06MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 222M/222M [00:42<00:00, 5.21MB/s]
100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [03:16<00:00, 196.39s/sources]
Dataset wikipedia downloaded and prepared to /home/gaguil20/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50. Subsequent calls will reuse this data.
Loading the '20200501.ar' data took 11,582.7 seconds...
```` | [
0.245187819,
-0.1499562263,
-0.1770798862,
0.4052180648,
0.0632279515,
0.2607441545,
0.2458159775,
0.1112543643,
0.6888967156,
-0.2664470673,
-0.021399159,
0.0063391402,
0.0856511965,
-0.2514181137,
0.0070184357,
-0.1802870184,
0.0203863569,
-0.1383436918,
0.1124567688,
-0.2367403656,
-0.2798018456,
0.2774862349,
-0.239590466,
-0.0487780571,
-0.1759131253,
0.2128081173,
-0.0351584479,
-0.2206660807,
0.1427017152,
-0.3392617702,
0.1575164646,
0.28002882,
0.221589312,
0.1722691804,
-0.0001170091,
-0.0706657767,
0.6258941293,
-0.0953068808,
-0.4801304638,
-0.1560154855,
-0.108958751,
-0.4129001796,
0.1846740246,
-0.1244983003,
0.0033279806,
-0.2184668779,
0.2557475567,
-0.5376160145,
0.1487026066,
0.1018919945,
0.1965545267,
-0.1393258572,
-0.0304983929,
0.1324131638,
0.239397198,
0.0642502606,
0.1510795802,
0.1061611474,
0.1205722988,
-0.2250444293,
-0.011399772,
0.14851363,
-0.1639055908,
-0.0380539559,
0.1549710631,
-0.0637799203,
-0.1280990243,
-0.6488240957,
0.1541218162,
0.1783596426,
0.6348540187,
0.0936592072,
-0.3493361175,
-0.2358497381,
0.0220756233,
0.0087389797,
0.1717162728,
0.3690199256,
-0.0816914439,
-0.079589963,
-0.0399337411,
-0.1963656843,
-0.0322663635,
0.5931960344,
-0.0880084336,
0.3404202163,
0.1230518594,
0.1242889464,
0.0851863176,
-0.1302561313,
-0.201021716,
-0.309388876,
0.4458720088,
0.3907280564,
-0.2508178055,
0.4231900275,
0.0125244111,
0.2542316318,
-0.0182334557,
-0.1473667771,
-0.1184969619,
0.2253138274,
-0.1155806035,
0.137496382,
0.1883314252,
-0.1246886253,
0.1700278819,
-0.1154407114,
0.3546128273,
0.0449148454,
-0.1789990366,
0.0283956118,
-0.1855125278,
-0.1656094193,
-0.2223561257,
-0.090387255,
-0.0302752331,
-0.2254109532,
0.1415823549,
0.108444266,
-0.31199947,
-0.3404398859,
-0.1642336249,
0.2511443794,
-0.1807248741,
0.2364732027,
0.1775925606,
0.3219088912,
-0.4573753476,
-0.0067458749,
0.0449120253,
-0.069097966,
-0.2206030041,
0.264252305,
0.4220620096,
0.0948154777,
0.2164131999,
0.0365128443,
-0.0398968905,
-0.090914093,
0.0710610524,
-0.2667185962,
-0.0585899502,
-0.0462975539,
0.2065357715,
0.5741945505,
0.0113001633,
-0.2553768158,
-0.2352945507,
0.0706351697,
-0.0643666312,
0.3766923249,
0.0205882918,
0.1219097823,
-0.2673359513,
-0.1256441772,
-0.2556726933,
0.3742671907,
0.1674691439,
-0.1810793579,
-0.0655050874,
-0.2110246569,
-0.2137751579,
-0.054381799,
0.2509807944,
0.5896444321,
-0.289233923,
-0.1646264642,
-0.1988547444,
0.2035034001,
0.1886523962,
0.2982171178,
-0.0958648175,
0.2337156683,
-0.1778506786,
0.0835024118,
0.2486935556,
-0.0395819321,
-0.0014966354,
-0.025727557,
0.2922065258,
0.3353718221,
-0.0078317448,
0.0580955781,
0.200107336,
0.0750989616,
0.1405767351,
0.6219533086,
0.2040378898,
-0.0073913559,
-0.3208086193,
-0.0502892733,
0.584579587,
0.3552083373,
0.1517483145,
-0.2365608364,
0.0374268927,
0.2481884211,
0.1562709212,
-0.0131510273,
0.0936758816,
0.3756870925,
-0.4390863776,
0.2866102755,
0.1611398458,
0.0143172219,
-0.6067687273,
0.2630676627,
0.0607771277,
0.1707909107,
-0.0318742245,
0.1079960763,
-0.259395957,
-0.2113677561,
-0.2820688188,
0.1819370389,
0.0974049866,
0.2778010964,
-0.0736682042,
0.5513204336,
0.1415987313,
-0.0580296628,
0.1464495361,
-0.1990236789,
-0.6403771639,
0.4351526797,
-0.0246121436,
-0.0105865318,
0.0756668746,
0.1637991369,
0.295705229,
-0.1776615977,
-0.0669075251,
-0.3276806772,
0.2493426502,
0.3236112297,
0.2255865633,
-0.0355045497,
0.3311397135,
-0.3649562597,
0.2097977847,
0.2917330563,
0.1693487167,
-0.2859677672,
-0.053575471,
0.3123854399,
-0.0520768166,
0.5205019712,
0.0842645541,
0.0086805746,
0.3458457887,
-0.0037224218,
-0.1675103158,
-0.0433899313,
0.5325741172,
0.0772651061,
0.0950239822,
0.2842745483,
-0.2170910537,
-0.0199632719,
0.5617516637,
0.1191321462,
-0.0381752849,
0.1991221458,
-0.1175561547,
-0.1439588815,
-0.0568964705,
-0.2177954614,
-0.1218237281,
0.1635315865,
0.1110622734,
-0.4647267759,
-0.001527071,
-0.1823052466,
-0.0400641561,
0.096621424,
0.1555220187,
0.1344240308,
0.1967049539,
-0.0348843113,
-0.2852691412,
-0.1075894311,
0.1431743503,
0.1040026844,
-0.0965625867,
0.1613906622,
-0.403183043,
-0.3990428448,
-0.2320871353,
-0.317655772,
-0.4566810131,
-0.4382843971,
-0.1172615737,
0.0020063613,
0.0365822688,
0.177995652,
-0.0860113129,
-0.004027456,
-0.214853406,
-0.1572208852,
-0.3402950466,
-0.4157640636,
-0.3499435484,
-0.011351265,
0.6082347631,
0.1408375651,
0.1521292925,
-0.0444763675,
-0.3557097316,
-0.1870457083,
-0.2607332468,
-0.2199682444,
-0.261919558,
0.1499442458,
-0.3553150296,
0.33559075,
-0.1427176893,
-0.1477807909,
0.3654437065,
0.0682373121,
0.0449980125,
0.1869267076,
0.2114940733,
0.2520883083,
0.0519435853,
-0.4182102978,
-0.2767594755,
-0.2977213562,
-0.0779389739,
0.0720987469,
-0.0323852859,
0.0740769207,
-0.1342921853,
0.0277840309,
0.0462005474,
0.0540389568,
-0.1266376227,
0.3575995862,
0.4985679388,
-0.0664861053,
-0.3655042052,
0.0541130155,
0.031213358,
0.0838308036,
0.0972239524,
-0.4861778021,
0.1801844686,
-0.2643375993,
0.0509776175,
0.1153757721,
-0.1262594759,
-0.0208537392,
-0.2068906575,
0.0848830044,
0.0921386927,
-0.217036739,
-0.1743112653,
-0.0443455353,
0.1687155813,
0.0408558212,
0.3214992881,
-0.3043215275,
0.7088567019,
0.1946602613,
0.0952937976,
0.2897075117,
0.2598701417,
0.0279642083,
-0.1108570695,
-0.3611118197,
0.0237767994,
-0.0480477512,
-0.0266054235,
0.3631150723,
-0.0062622949,
-0.5170606971,
-0.1891023368,
0.1743411124,
-0.3713747859,
-0.2986271381,
0.2226096243,
-0.3242894709,
0.1422244012,
-0.0391421989,
-0.0105264857,
0.0027752891,
-0.371119976,
-0.1053502709,
0.1779322624,
0.3246058226,
0.1485725641,
-0.3348636031,
-0.3951422572,
-0.3076939881,
0.2421055138,
0.0071048327,
0.4984686971,
-0.2654770911,
0.0471521318,
0.1594443321,
0.1530168504,
0.2365150452,
-0.2597923875,
-0.111195907,
0.5279594064,
-0.027104333,
-0.6189779639,
-0.052665513,
-0.1377343237,
0.1269145757,
0.3149391711,
0.2196881175,
-0.4838065505,
-0.2887220681,
-0.122600466,
0.58764112,
-0.17014727,
-0.0847838745,
-0.4336579442,
-0.2338641733,
-0.3693842292,
-0.1450404972,
-0.1572354138,
0.1778553426,
-0.2349483967,
-0.0459778495,
-0.0219070539,
0.2416186631,
0.1035628319,
0.0267473422,
0.1316095293,
0.2954698205,
0.0465459302,
0.1607414335,
0.0174138583,
-0.1359118521,
0.3154532015,
-0.0129852127,
-0.2080692649,
0.0275540687,
-0.1306414306,
0.1875983775,
0.0917402953,
0.2451943457,
0.1307401508,
0.1095637828,
-0.1335560083,
-0.0489851311,
0.035248708,
0.2923879921,
-0.048934754,
-0.3869997561,
-0.4622569084,
0.5346458554,
0.038188085,
-0.0413820595,
0.3379881978,
-0.1091618836,
-0.3768290281,
0.5159649849,
-0.0695695877,
0.8500660658,
-0.3642222583,
0.2199822664,
-0.1245951653,
0.172277078,
0.4675639272,
0.076316759,
0.0761717707,
-0.2263676822,
0.2811320722,
0.0344469771,
0.0823061764,
-0.0472712405,
0.1280320734,
-0.1903807521,
0.3003565073,
-0.1230537593,
-0.0278717782,
-0.0277451798,
0.5177571177,
0.1137031168,
-0.3444420695,
-0.3903769255,
0.0782159716,
-0.1911293119,
0.3340758979,
-0.0570764951,
0.1440502554,
-0.0817052573,
0.0448232889,
-0.2575120032,
0.1331408024,
-0.0768298432,
0.2991920412,
-0.2658874393,
-0.254170835,
0.2688462436,
0.403113842,
0.0046638791,
0.171574682,
-0.1435803771,
0.1161305308,
-0.265565455,
-0.2525235116,
-0.0746078938,
-0.1055534333,
0.0396233425,
0.0475946218,
-0.303537488,
0.0346583351,
0.0085529238,
-0.0259420276,
-0.430922538,
0.258980751,
-0.0142868403,
0.1846856177,
-0.3098224401,
0.0207594875,
-0.1681014299,
0.0122458935,
0.1065095142,
0.070126459,
0.1924311966,
0.2142221779,
0.2856582701,
-0.2697759271,
-0.0905394554,
0.3106725216,
0.0634704977,
-0.0143202841,
0.7507342696,
0.4354945123,
-0.3659290075,
-0.2016962767,
-0.275102973,
-0.3988817632,
-0.4118564725,
-0.1979344636,
0.0786664486,
0.4254752994,
-0.2341168821,
-0.040556591,
-0.0630697608,
-0.4103247523,
0.045609571,
-0.6656745672,
0.1718958467,
0.356123209,
0.050700739,
0.003544338,
-0.0290384218,
-0.4292137325,
0.1859497875,
0.1663059294,
-0.1968049407,
-0.0389883965,
-0.2543285489,
0.2168640345,
0.2401292175,
-0.1219032109,
-0.1090310663,
-0.0833919495,
0.0853061005,
-0.2562124133,
0.1659387201,
-0.189926669,
0.0360819623,
0.1547966897,
0.1552374363,
-0.0410484225,
-0.1149518341,
-0.445042789,
0.4579296112,
-0.3163776696,
-0.1755691618,
0.0163523033,
-0.0156327747,
0.2527424395,
0.0262186155,
0.1651685834,
-0.0539302938,
0.1831568629,
-0.1122234166,
-0.0250252597,
0.3604781032,
0.0616053864,
0.6041914821,
0.1429426968,
-0.4622178078,
-0.0460663028,
-0.0973881334,
0.1787063777,
0.1860469431,
-0.1229138076,
0.3573132157,
0.3914489448,
0.0029294863,
0.559445858,
-0.0614697747,
-0.1044640616,
0.2407746613,
0.0994288921,
-0.1753642261,
0.0176677816,
0.6052833796,
0.0272110924,
-0.0985854566,
0.1960460544,
0.1467481852,
-0.2734451294,
0.0870824307,
-0.0509237051,
0.182974562,
-0.0898048282,
0.1844635606,
-0.0684339106,
0.2767163515,
0.0790665001,
0.0812537745,
0.099147968,
-0.125711754,
-0.0263299122,
0.0368318148,
0.1100987196,
-0.0732967556,
0.1421321332,
-0.1959451586,
-0.3418850303,
0.1488083601,
0.3849464655,
-0.0385040827,
0.1966329962,
-0.0197483972,
0.2618832886,
0.0570272356,
-0.1767064929,
-0.1262673587,
0.281519264,
0.0860372931,
-0.1537545323,
-0.110394083,
-0.0374433696,
-0.1708140969,
0.0302488953,
-0.0098094018,
-0.1767526418,
0.1158329248,
0.429135114,
-0.1432881802,
-0.5412375927,
0.5037274361,
-0.1810053438,
0.1998378783,
-0.1252733618,
0.1471014768,
0.0336149149,
-0.0736718923,
-0.0792778805,
0.1109912172,
0.0890679657,
0.241163224,
-0.4174543023,
0.1338174343,
-0.0459808037,
0.1110810414,
-0.0803474709,
0.4521906376,
0.1720497608,
0.2157224715,
0.2976250052,
0.1016492769,
-0.1120321378,
-0.1626939625,
0.2548756301,
-0.0762875676,
-0.1740437597,
0.0921579599,
-0.1568005383,
0.0316731781,
-0.2665816844,
-0.0060913227,
-0.3949308991,
0.1219009757,
0.154735148,
0.1300668716,
0.0728922486,
-0.098319307,
0.0153385922,
-0.09372136,
0.6074228287,
0.4792844355,
0.3430703282,
-0.293992281,
-0.1760925949,
-0.6380916834,
-0.0678535029,
-0.3921575546,
0.0492010936,
0.202406764,
0.3104741573,
0.1381626725,
0.0823638067,
0.1153789163,
-0.1095291823,
-0.1194224805,
0.265204519,
-0.4071402848,
-0.1527135223,
-0.1296405643,
0.0743367672,
-0.3396193981,
-0.1283412278,
0.2265630066,
-0.2407399118,
-0.0277349129,
-0.102784574,
-0.1051417962,
0.072988987,
0.2741316855,
0.2015314996,
0.0922096521,
0.5899720192,
0.0433663055,
0.0089176893,
-0.2871468961,
-0.1425204128,
0.1041715741,
0.3150422871,
-0.2458574772,
0.2514476776,
-0.1794993132,
0.0906046852,
-0.2311388999,
0.4558015168,
-0.1921856999,
0.0487006865,
0.0483479127,
-0.2329857945,
0.0329912305,
-0.0703179389,
0.0282707103,
0.2565558255,
-0.0535291508,
0.0707785487,
-0.3227967918,
-0.1854753494,
0.0735797584,
-0.336275965,
-0.4813308716,
0.0286503509,
0.1353666782,
0.1401512921,
0.2730080485,
-0.2980142236,
0.0082409978,
0.436150223,
-0.0367801078,
0.0740417019,
0.2029287666,
0.1122000515,
-0.1739457548,
-0.1829125285,
-0.1639411449,
0.1568364501,
-0.0951533094,
-0.1418356448,
-0.1823758185
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | > About the date issue, I think it's possible to use another date with
> ```python
> load_dataset("wikipedia", language="es", date="...", beam_runner="...")
> ```
I tried your suggestion about the date and the function does not accept the language and date keywords. I tried both on `nlp` v0.4 and the new `datasets` library (v1.0.2):
```
load_dataset("wikipedia", language="es", date="20200601", beam_runner='DirectRunner', split='train')
```
For now, my quick workaround to keep things moving was to simply change the date inside the library at this line: [https://github.com/huggingface/datasets/blob/master/datasets/wikipedia/wikipedia.py#L403](https://github.com/huggingface/datasets/blob/master/datasets/wikipedia/wikipedia.py#L403)
Note that the date and languages are valid: [https://dumps.wikimedia.org/eswiki/20200601/dumpstatus.json](https://dumps.wikimedia.org/eswiki/20200601/dumpstatus.json)
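Before kicking off the Beam job, that status file can also be checked programmatically. A minimal sketch, assuming `requests` is installed and that `articlesmultistreamdump` is the right job to treat as the readiness signal (that job name is an assumption on my side):
```python
import requests

def dump_is_ready(lang: str, date: str) -> bool:
    """Check whether the <lang>wiki dump for <date> reports status 'done'."""
    url = f"https://dumps.wikimedia.org/{lang}wiki/{date}/dumpstatus.json"
    jobs = requests.get(url, timeout=30).json()["jobs"]
    # "articlesmultistreamdump" is assumed to be the job the wikipedia
    # script depends on; adjust if a different job is relevant.
    return jobs.get("articlesmultistreamdump", {}).get("status") == "done"

print(dump_is_ready("es", "20200601"))  # True if the dump linked above is complete
```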
Any suggestion is welcome :) @lhoestq
## **[UPDATE]**
The workaround I mentioned fetched the data, but then I faced another issue (the log even says to report this as a bug):
```
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
```
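One local stop-gap I can think of (a hedged sketch only, assuming it is acceptable to drop the sections that trigger the parser error rather than fix them upstream) is to wrap the stripping step so that malformed HTML entities are skipped instead of crashing the whole pipeline:
```python
# Sketch of a local patch around the cleaning step in wikipedia.py:
# sections whose HTML entities mwparserfromhell cannot normalize are
# skipped instead of aborting the Beam job. This is not the upstream fix.
def safe_strip_code(section):
    try:
        return section.strip_code().strip()
    except KeyError:
        # malformed HTML entity in the wikicode -> drop this section's text
        return ""
```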
Here's the full stack (which says that there is a key error caused by this key: `KeyError: '000nbsp'`):
```Downloading and preparing dataset wikipedia/20200601.es (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gustavoag/.cache/huggingface/datasets/wikipedia/20200601.es/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 74.7k/74.7k [00:00<00:00, 1.53MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 232M/232M [00:48<00:00, 4.75MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 442M/442M [01:39<00:00, 4.44MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 173M/173M [00:33<00:00, 5.12MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 344M/344M [01:14<00:00, 4.59MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 541M/541M [01:59<00:00, 4.52MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 476M/476M [01:31<00:00, 5.18MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 545M/545M [02:02<00:00, 4.46MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 299M/299M [01:01<00:00, 4.89MB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 9.60M/9.60M [00:01<00:00, 4.84MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 423M/423M [01:36<00:00, 4.38MB/s]
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--lang', 'es', '--date', '20200601', '--tokenizer', 'bert-base-multilingual-cased', '--cache', 'train', 'valid', '--max_dataset_length', '200000', '10000']
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
Traceback (most recent call last):
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1095, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 500, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 556, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/wikicode.py", line 643, in strip_code
stripped = node.__strip__(**kwargs)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 63, in __strip__
return self.normalize()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 178, in normalize
return chrfunc(htmlentities.name2codepoint[self.value])
KeyError: '000nbsp'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/raid/data/gustavoag/projects/char2subword/research/preprocessing/split_wiki.py", line 96, in <module>
main()
File "/raid/data/gustavoag/projects/char2subword/research/preprocessing/split_wiki.py", line 65, in main
data = nlp.load_dataset('wikipedia', f'{args.date}.{args.lang}', beam_runner='DirectRunner', split='train')
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/load.py", line 548, in load_dataset
builder_instance.download_and_prepare(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 462, in download_and_prepare
self._download_and_prepare(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 969, in _download_and_prepare
pipeline_results = pipeline.run()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/pipeline.py", line 534, in run
return self.runner.run_pipeline(self, self._options)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/direct/direct_runner.py", line 119, in run_pipeline
return runner.run_pipeline(pipeline, options)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 172, in run_pipeline
self._latest_run_result = self.run_via_runner_api(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 183, in run_via_runner_api
return self.run_stages(stage_context, stages)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 338, in run_stages
stage_results = self._run_stage(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 512, in _run_stage
last_result, deferred_inputs, fired_timers = self._run_bundle(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 556, in _run_bundle
result, splits = bundle_manager.process_bundle(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 940, in process_bundle
for result, split_result in executor.map(execute, zip(part_inputs, # pylint: disable=zip-builtin-not-iterating
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/concurrent/futures/_base.py", line 611, in result_iterator
yield fs.pop().result()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
raise self._exception
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/utils/thread_pool_executor.py", line 44, in run
self._future.set_result(self._fn(*self._fn_args, **self._fn_kwargs))
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 932, in execute
return bundle_manager.process_bundle(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 837, in process_bundle
result_future = self._worker_handler.control_conn.push(process_bundle_req)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/worker_handlers.py", line 352, in push
response = self.worker.do_instruction(request)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 479, in do_instruction
return getattr(self, request_type)(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 515, in process_bundle
bundle_processor.process_bundle(instruction_id))
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 977, in process_bundle
input_op_by_transform_id[element.transform_id].process_encoded(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 218, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 330, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 332, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1030, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1122, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1030, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1122, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1045, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
raise exc.with_traceback(traceback)
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1095, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 500, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 556, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/wikicode.py", line 643, in strip_code
stripped = node.__strip__(**kwargs)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 63, in __strip__
return self.normalize()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 178, in normalize
return chrfunc(htmlentities.name2codepoint[self.value])
KeyError: "000nbsp [while running 'train/Clean content']"``` | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 841 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
> About the date issue, I think it's possible to use another date with
> ```python
> load_dataset("wikipedia", language="es", date="...", beam_runner="...")
> ```
I tried your suggestion about the date, but the function does not accept the `language` and `date` keywords. I tried it on both `nlp` v0.4 and the new `datasets` library (v1.0.2):
```
load_dataset("wikipedia", language="es", date="20200601", beam_runner='DirectRunner', split='train')
```
For now, my quick workaround to keep things moving was to simply change the date inside the library at this line: [https://github.com/huggingface/datasets/blob/master/datasets/wikipedia/wikipedia.py#L403](https://github.com/huggingface/datasets/blob/master/datasets/wikipedia/wikipedia.py#L403)
Note that the date and languages are valid: [https://dumps.wikimedia.org/eswiki/20200601/dumpstatus.json](https://dumps.wikimedia.org/eswiki/20200601/dumpstatus.json)
Any suggestion is welcome :) @lhoestq
## **[UPDATE]**
The workaround I mentioned fetched the data, but then I faced another issue (the log even says to report this as a bug):
```
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
```
Here's the full stack (which says that there is a key error caused by this key: `KeyError: '000nbsp'`):
```Downloading and preparing dataset wikipedia/20200601.es (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gustavoag/.cache/huggingface/datasets/wikipedia/20200601.es/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 74.7k/74.7k [00:00<00:00, 1.53MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 232M/232M [00:48<00:00, 4.75MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 442M/442M [01:39<00:00, 4.44MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 173M/173M [00:33<00:00, 5.12MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 344M/344M [01:14<00:00, 4.59MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 541M/541M [01:59<00:00, 4.52MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 476M/476M [01:31<00:00, 5.18MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 545M/545M [02:02<00:00, 4.46MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 299M/299M [01:01<00:00, 4.89MB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 9.60M/9.60M [00:01<00:00, 4.84MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 423M/423M [01:36<00:00, 4.38MB/s]
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--lang', 'es', '--date', '20200601', '--tokenizer', 'bert-base-multilingual-cased', '--cache', 'train', 'valid', '--max_dataset_length', '200000', '10000']
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
ERROR:root:mwparserfromhell ParseError: This is a bug and should be reported. Info: C tokenizer exited with non-empty token stack.
Traceback (most recent call last):
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1095, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 500, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 556, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/wikicode.py", line 643, in strip_code
stripped = node.__strip__(**kwargs)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 63, in __strip__
return self.normalize()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 178, in normalize
return chrfunc(htmlentities.name2codepoint[self.value])
KeyError: '000nbsp'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/raid/data/gustavoag/projects/char2subword/research/preprocessing/split_wiki.py", line 96, in <module>
main()
File "/raid/data/gustavoag/projects/char2subword/research/preprocessing/split_wiki.py", line 65, in main
data = nlp.load_dataset('wikipedia', f'{args.date}.{args.lang}', beam_runner='DirectRunner', split='train')
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/load.py", line 548, in load_dataset
builder_instance.download_and_prepare(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 462, in download_and_prepare
self._download_and_prepare(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/builder.py", line 969, in _download_and_prepare
pipeline_results = pipeline.run()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/pipeline.py", line 534, in run
return self.runner.run_pipeline(self, self._options)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/direct/direct_runner.py", line 119, in run_pipeline
return runner.run_pipeline(pipeline, options)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 172, in run_pipeline
self._latest_run_result = self.run_via_runner_api(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 183, in run_via_runner_api
return self.run_stages(stage_context, stages)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 338, in run_stages
stage_results = self._run_stage(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 512, in _run_stage
last_result, deferred_inputs, fired_timers = self._run_bundle(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 556, in _run_bundle
result, splits = bundle_manager.process_bundle(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 940, in process_bundle
for result, split_result in executor.map(execute, zip(part_inputs, # pylint: disable=zip-builtin-not-iterating
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/concurrent/futures/_base.py", line 611, in result_iterator
yield fs.pop().result()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
raise self._exception
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/utils/thread_pool_executor.py", line 44, in run
self._future.set_result(self._fn(*self._fn_args, **self._fn_kwargs))
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 932, in execute
return bundle_manager.process_bundle(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 837, in process_bundle
result_future = self._worker_handler.control_conn.push(process_bundle_req)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/portability/fn_api_runner/worker_handlers.py", line 352, in push
response = self.worker.do_instruction(request)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 479, in do_instruction
return getattr(self, request_type)(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 515, in process_bundle
bundle_processor.process_bundle(instruction_id))
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 977, in process_bundle
input_op_by_transform_id[element.transform_id].process_encoded(
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 218, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 330, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 332, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1030, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1122, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1030, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1122, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1045, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
raise exc.with_traceback(traceback)
File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 553, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1095, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 500, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/nlp/datasets/wikipedia/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50/wikipedia.py", line 556, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/wikicode.py", line 643, in strip_code
stripped = node.__strip__(**kwargs)
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 63, in __strip__
return self.normalize()
File "/home/gustavoag/anaconda3/envs/pytorch/lib/python3.8/site-packages/mwparserfromhell/nodes/html_entity.py", line 178, in normalize
return chrfunc(htmlentities.name2codepoint[self.value])
KeyError: "000nbsp [while running 'train/Clean content']"``` | [
0.2467408478,
-0.0418633968,
-0.1210310683,
0.312300086,
0.1007467508,
0.2001998574,
0.1836342812,
0.1085750982,
0.6142449379,
-0.2803642154,
0.1189875081,
0.2007732838,
0.0345286615,
-0.1363367587,
0.0880575553,
-0.2634974718,
0.0546107665,
-0.1040421277,
0.0303129815,
-0.2449037731,
-0.3982992768,
0.2610422075,
-0.3530086875,
0.0145686958,
-0.1812228858,
0.2081045061,
-0.0124993548,
-0.1226633936,
0.0534754694,
-0.401656121,
0.1271002591,
0.1967625916,
0.3313166797,
0.1973953545,
-0.0001186252,
-0.197795853,
0.633600533,
-0.1621053517,
-0.5641885996,
-0.1122202575,
-0.1143347695,
-0.436531812,
0.282156527,
-0.1752434373,
-0.0461254306,
-0.3043346405,
0.1618640125,
-0.5166304708,
0.1445370615,
0.2225317359,
0.1954719722,
-0.0924770236,
0.0923198909,
0.1296253353,
0.2619312704,
0.0538861603,
0.0945695862,
0.1151472628,
0.0538561158,
-0.3287508488,
0.0647287816,
0.0624159463,
-0.1051378027,
-0.1172368675,
0.3672707677,
-0.1092404723,
-0.121782586,
-0.5950626135,
0.0641388893,
0.0947352499,
0.7301490307,
-0.0377022438,
-0.3581018448,
-0.1889027655,
-0.0069873184,
0.0573047623,
0.1728271693,
0.155932188,
-0.0244680233,
-0.1372966766,
0.0292162076,
-0.2469108552,
-0.0619471222,
0.7434316874,
-0.0167286769,
0.3858575225,
0.0737344995,
0.1732941121,
0.0626395047,
-0.3136679828,
-0.1730394214,
-0.2192941904,
0.3790629804,
0.4839771092,
-0.1226426214,
0.3475770354,
-0.012334723,
0.3825615346,
-0.0521787629,
-0.1580696106,
-0.235299468,
0.1662345976,
-0.1662418842,
0.1426323354,
0.2290007472,
-0.013410192,
0.3087078929,
-0.1355268657,
0.2983294725,
-0.064054206,
-0.0458317883,
0.0196872279,
-0.1417012215,
-0.1992101967,
-0.1657812297,
-0.0544731244,
-0.0478283837,
-0.3128400445,
0.2918703854,
0.1117085218,
-0.2736558914,
-0.3444138765,
-0.1295119524,
0.3133080602,
-0.1838984787,
0.2302264571,
0.1439911723,
0.2676374912,
-0.3891964257,
-0.0696639642,
-0.0016797371,
0.0119817592,
-0.2927107811,
0.1612304747,
0.3973583579,
0.0833453536,
0.2268029004,
0.074819155,
-0.01465239,
-0.1550944,
-0.078368783,
-0.2006683201,
0.0307390932,
-0.0432604365,
0.1019619107,
0.5389503837,
0.0627785325,
-0.229437694,
-0.1606118828,
0.0834417343,
-0.109654434,
0.3419537842,
-0.0056773918,
0.1212534606,
-0.3625358641,
-0.1406647265,
-0.297791183,
0.2064986527,
0.0987754613,
-0.2669970095,
-0.0225402415,
-0.36905092,
-0.1852132678,
-0.1132442951,
0.2871931493,
0.4969681501,
-0.2388131768,
-0.2270082831,
-0.0786344856,
0.1175903231,
0.1469183564,
0.2245144397,
-0.0805961788,
0.1937001646,
-0.1273549199,
-0.0071628243,
0.4132715464,
-0.105134137,
0.0572991818,
0.022382766,
0.2743157744,
0.2108422667,
-0.1830104589,
0.0946492255,
0.1899321973,
0.0379426666,
0.0646439791,
0.5457742214,
0.2533351481,
-0.0215170495,
-0.3435811698,
-0.0450477377,
0.6537559628,
0.3368974328,
0.1342663467,
-0.1086801738,
0.0573587604,
0.3899140954,
0.1507700682,
0.0243370235,
0.1013399512,
0.354347676,
-0.3717959225,
0.264918983,
0.3096796274,
-0.0367003903,
-0.5847883821,
0.2162315696,
0.1441769153,
0.2161540091,
0.0928699449,
0.1093351692,
-0.3285712004,
-0.1730722189,
-0.2318116426,
0.2040986121,
0.0549689345,
0.2256598026,
-0.0271210335,
0.5444974303,
0.031196624,
0.0278387349,
0.0366642252,
-0.1735837311,
-0.5712450147,
0.3580466509,
-0.0180361494,
-0.0789081752,
0.0505057499,
0.2768632472,
0.3914178312,
-0.124750793,
-0.1328831911,
-0.2583321333,
0.3054015934,
0.2855799496,
0.0374741666,
-0.0770552903,
0.2321471423,
-0.3242639601,
0.2337018847,
0.3573763072,
0.1669023186,
-0.234048456,
-0.0023755897,
0.334844172,
-0.1550545245,
0.422419548,
0.0630623996,
0.0345317945,
0.2765227556,
-0.0269018635,
-0.1470500231,
-0.1411592364,
0.5319134593,
0.0347100087,
0.009683229,
0.276848793,
-0.1995254159,
-0.145979926,
0.574422121,
0.1044288054,
-0.0474535264,
0.2356460094,
-0.1933628023,
-0.2013407946,
-0.0289735124,
-0.1284897029,
-0.1779079884,
0.1603294164,
0.0171627458,
-0.3485313356,
-0.0272581205,
-0.1230651736,
-0.0213753283,
0.1418351978,
0.1582038552,
0.0385223143,
0.2043384761,
0.0613389909,
-0.2022947371,
-0.0328095853,
0.063271746,
0.1949343532,
-0.2581514716,
0.1352342367,
-0.4075545371,
-0.5762093067,
-0.1227529943,
-0.1689464301,
-0.495135963,
-0.4757381082,
-0.1143931895,
0.0920535326,
0.134827897,
0.1173323765,
0.0116771832,
0.0618189573,
-0.2605219483,
-0.1536410302,
-0.3928336799,
-0.4991735518,
-0.4106242657,
-0.019218469,
0.5115397573,
0.2060917914,
0.1550514847,
0.0181783736,
-0.2731818557,
-0.2047000229,
-0.3014616668,
-0.1553867608,
-0.2485722005,
0.2819807529,
-0.3579007387,
0.3568584621,
-0.120313637,
-0.0770150274,
0.3586685061,
-0.006379284,
0.0426303595,
0.1374921501,
0.1480122209,
0.2510436177,
-0.1259439439,
-0.4387935102,
-0.3378152549,
-0.2986634374,
-0.0107977632,
0.1471055448,
-0.0097869709,
0.1274691224,
-0.1898836195,
-0.0101258121,
-0.0537631512,
0.0646509528,
-0.1916069239,
0.3349400759,
0.4417560995,
-0.1034761444,
-0.4246733189,
0.0266976729,
0.0199541748,
0.0554861389,
0.0543129966,
-0.45349437,
0.0489511937,
-0.1315478384,
0.0165139586,
0.1188753396,
-0.1290848851,
-0.032755062,
-0.1807680875,
0.1202560887,
0.0126705617,
-0.2608506382,
-0.0970650017,
0.0698098913,
0.1594929546,
0.1414657533,
0.3957380354,
-0.3709972799,
0.7079386115,
0.146592766,
-0.0535262376,
0.3381590247,
0.1850139499,
0.0966087356,
-0.1214879751,
-0.3148030043,
0.0809998065,
0.0567982756,
0.0732565969,
0.3039262891,
0.0184212923,
-0.4587876797,
-0.2295136303,
0.092779085,
-0.3207523227,
-0.243547678,
0.244271487,
-0.2609000504,
0.1809429675,
-0.0212021992,
0.0700070411,
-0.0617874637,
-0.438639462,
-0.0769580975,
0.1941023171,
0.3789052069,
0.1480203867,
-0.3692035079,
-0.3921059966,
-0.2418221682,
0.1042947024,
0.0268056579,
0.5830655694,
-0.3682014644,
0.0149905607,
0.0973754302,
0.146422416,
0.2079624534,
-0.3628023863,
-0.096725598,
0.5620804429,
0.0523686409,
-0.7500113249,
0.066022411,
-0.1347549111,
0.2047205269,
0.3208771646,
0.2045468688,
-0.3868638873,
-0.3456491828,
-0.1134217381,
0.6048510075,
-0.1742548943,
-0.1486740559,
-0.3017247915,
-0.0627287924,
-0.4483121037,
-0.281288445,
-0.1597104222,
0.1530339569,
-0.1605028808,
0.0950323865,
-0.0373121314,
0.3016238213,
0.1328444928,
0.1292183995,
0.1982299089,
0.3423752785,
0.0227590576,
0.1644464433,
-0.057766173,
-0.1283720136,
0.125824213,
-0.0418086722,
-0.1127709746,
-0.0404157713,
-0.1151962429,
0.2561211884,
0.0040573515,
0.121282272,
0.0737477243,
0.082449168,
-0.1730270535,
-0.1732714176,
0.0891194046,
0.3258434534,
-0.0751115307,
-0.3365179598,
-0.498696804,
0.5371439457,
0.0989009589,
-0.0740022063,
0.4017360508,
-0.0626513511,
-0.3506155014,
0.5301766396,
-0.0509454496,
0.8433184624,
-0.3003082275,
0.1955538988,
-0.0325544327,
0.2135912776,
0.5460359454,
0.1363722831,
-0.0340170078,
-0.2968797684,
0.4123266637,
-0.00955889,
0.0540637374,
-0.0665574968,
0.1078644916,
-0.1792192012,
0.3505461812,
-0.1377652287,
0.0290450063,
0.0280303583,
0.5921817422,
0.2505411506,
-0.3041811287,
-0.387724638,
0.0808079466,
-0.2429033518,
0.3783742487,
-0.0661634952,
0.0325784907,
-0.0062681064,
-0.0077963993,
-0.2571873963,
0.1725371331,
-0.0441095158,
0.186119929,
-0.1785834134,
-0.2531177402,
0.3200861812,
0.4341289401,
0.1388462335,
0.1387244761,
-0.151546523,
0.1208060533,
-0.2238150686,
-0.2308396846,
-0.0556876287,
0.0883088112,
0.092578195,
-0.043849241,
-0.2501571178,
0.0243035108,
-0.059195783,
-0.0486496873,
-0.3381798267,
0.2409886718,
-0.0426512361,
0.0921945274,
-0.3808486164,
0.0418867916,
-0.1134272963,
-0.009474203,
0.1064320281,
0.0402996838,
0.2752074599,
0.2200920582,
0.3731344938,
-0.2605766654,
-0.1001690328,
0.2874046564,
-0.0598814078,
-0.0850010216,
0.7583626509,
0.4538275599,
-0.3374438286,
-0.2229511589,
-0.2997131348,
-0.4052827954,
-0.4894790053,
-0.2000895441,
0.1830615103,
0.3600776792,
-0.1642892361,
0.1142181009,
-0.1042965427,
-0.2964012027,
-0.0067339092,
-0.6568120718,
0.0636315271,
0.3400776982,
0.0016579209,
-0.0359529033,
0.0945210457,
-0.3717745543,
0.1721283793,
0.0088422131,
-0.2004662901,
-0.0362885557,
-0.1256378889,
0.2398033887,
0.3705245852,
-0.0229531303,
-0.0745552778,
-0.1351185888,
0.0613337755,
-0.2670683563,
0.1481944025,
-0.1666262746,
-0.02884629,
0.1589288116,
0.1681954265,
-0.1475321054,
-0.2843914032,
-0.4324958622,
0.3609696925,
-0.3049424291,
-0.1883180737,
0.0068974346,
-0.0595956296,
0.386870712,
-0.0060751755,
0.202257514,
0.026469633,
0.1902355552,
-0.007873293,
0.0779848173,
0.3223073184,
0.0898682773,
0.5199075937,
0.1076181903,
-0.546900332,
-0.0673875287,
-0.1689729095,
0.0710251555,
0.2085429579,
-0.1280461848,
0.3231799603,
0.3659039438,
-0.0152708627,
0.5192856789,
-0.0380618609,
-0.0792633295,
0.2815093696,
0.0779809877,
-0.1643523574,
0.0108779967,
0.5853739977,
0.1379612386,
-0.1037729084,
0.1389348805,
0.1245171204,
-0.1958183646,
0.1128975675,
-0.0578759536,
0.2018037587,
-0.1756237447,
0.2006062269,
-0.0206895005,
0.2387912571,
0.052091632,
0.0392182395,
0.144744277,
-0.1465449929,
0.0226370525,
0.1288398951,
0.1109214574,
0.1582997292,
0.2208361775,
-0.0854601115,
-0.2918561995,
0.0891617835,
0.37812832,
0.0758496374,
0.202557683,
0.0398825034,
0.3129252493,
0.1630696654,
-0.1602582335,
-0.0606190227,
0.2834801078,
0.0812045261,
-0.1401232481,
-0.1251838803,
-0.0776647776,
-0.1392725706,
-0.0548248328,
-0.1382766515,
-0.2507812083,
0.1163998246,
0.4009123147,
-0.0892098248,
-0.5508674383,
0.4159518778,
-0.1234513968,
0.1944138706,
-0.0673287958,
0.1497878581,
-0.0897911191,
-0.0602516308,
0.0327402018,
0.0837595314,
0.1919620484,
0.2536798418,
-0.4132230878,
0.1657592952,
-0.0451346338,
0.0403788313,
-0.0329475105,
0.4479346275,
0.1915173829,
0.2907918692,
0.2841599584,
0.106336996,
-0.0679251328,
-0.1725247502,
0.1881707609,
0.024355825,
-0.0728111118,
0.1567289531,
-0.0406156257,
0.0821506158,
-0.3063540459,
-0.0368804075,
-0.3028012514,
0.1123582125,
0.2028052509,
0.3187677264,
0.0568406284,
-0.1566621661,
0.0131231658,
-0.1144661605,
0.585165143,
0.4834053218,
0.4793631136,
-0.2608210444,
-0.2378097177,
-0.6629289985,
-0.1066001803,
-0.3062574267,
-0.0114742666,
0.2245663702,
0.1316021383,
0.1677432805,
0.0255719796,
0.0976624638,
-0.1200582683,
-0.0860859901,
0.1423184723,
-0.4113577604,
-0.1409543157,
-0.1071037948,
0.1539101601,
-0.2812532783,
-0.0558899269,
0.159066081,
-0.3587041199,
-0.0662564561,
-0.1490183771,
-0.2088914216,
-0.0200847276,
0.3185060918,
0.2466908246,
0.0142016504,
0.6204611063,
0.0284751207,
0.025763765,
-0.277007848,
-0.216400966,
0.1149387956,
0.3116106391,
-0.3073529601,
0.2667536139,
-0.2045120746,
0.1605454236,
-0.3032951951,
0.2787632644,
-0.297760278,
0.0285213068,
0.0239705741,
-0.1633803248,
0.0643669814,
-0.1165075749,
-0.0065042265,
0.2059289515,
-0.0801011175,
0.0806957111,
-0.3005992174,
-0.1112570912,
0.2099957317,
-0.3703150749,
-0.3459797502,
0.0325196087,
0.0496532321,
0.0896234438,
0.0790833384,
-0.4082630277,
-0.0019555688,
0.4207474887,
0.0091001242,
0.1098378301,
0.1136833429,
0.0344163142,
-0.1132742614,
-0.1676359773,
-0.1806418896,
0.1454226524,
-0.0464872196,
-0.0824348554,
-0.1141449362
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Hey @gaguilar ,
I just found the ["char2subword" paper](https://arxiv.org/pdf/2010.12730.pdf) and I'm really interested in trying it out on my own vocabs/datasets, e.g. for historical texts (I've already [trained some lms](https://github.com/stefan-it/europeana-bert) on newspaper articles with OCR errors).
Do you plan to release the code for your paper, or is it possible to get the implementation? 🤔 Many thanks :hugs: | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 57 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Hey @gaguilar ,
I just found the ["char2subword" paper](https://arxiv.org/pdf/2010.12730.pdf) and I'm really interested in trying it out on my own vocabs/datasets, e.g. for historical texts (I've already [trained some lms](https://github.com/stefan-it/europeana-bert) on newspaper articles with OCR errors).
Do you plan to release the code for your paper, or is it possible to get the implementation? 🤔 Many thanks :hugs: | [
0.3906110227,
-0.2458590269,
-0.0904507786,
0.4170150757,
0.0234595276,
0.1634710729,
0.1849507391,
0.1755525619,
0.511061728,
-0.2657082081,
0.015799284,
-0.177120626,
-0.0163382478,
-0.1042660475,
0.0836660862,
-0.0257930867,
0.1556574702,
-0.0064993799,
0.0507142246,
-0.2967503071,
-0.2989608645,
0.3291719258,
-0.3816178441,
0.0590655804,
-0.2614991367,
0.2898675203,
0.0583655909,
-0.135697946,
-0.0120732374,
-0.2747807503,
0.2623198628,
0.2832200527,
0.4896441996,
0.1554793566,
-0.0001210404,
-0.1048501432,
0.5296109319,
-0.2100766003,
-0.434851259,
-0.0876915753,
0.1109088883,
-0.2407890856,
0.2297365963,
-0.1705433577,
0.0351690352,
-0.0798511952,
0.3073666096,
-0.4942202866,
0.1781290323,
0.028484609,
0.1619758308,
-0.2675949931,
0.0858380347,
0.117609255,
0.3122583032,
0.066657871,
0.1999562383,
-0.0564157888,
0.1693620384,
-0.2023468018,
-0.0613850541,
0.2206848413,
-0.1636947691,
-0.1409706026,
0.1139509678,
-0.031828437,
-0.2651315629,
-0.7127399445,
0.2629501224,
0.1425904632,
0.6615582108,
0.0197380669,
-0.2775623798,
-0.0204802155,
0.0560814701,
0.1707983315,
0.2110822052,
0.2694199681,
-0.1647177637,
0.0527460165,
0.05251389,
-0.2845050693,
-0.0732933432,
0.5155889988,
0.0218740702,
0.3449907601,
0.0481324308,
0.1534673423,
0.0292204265,
-0.306663543,
-0.2476731092,
-0.1166794002,
0.3090468645,
0.441865772,
-0.3259525597,
0.4467230439,
-0.0831786245,
0.3020557165,
0.1693537682,
-0.2793387473,
-0.226931259,
0.2168735862,
-0.1235039681,
0.1648906767,
0.1029396057,
-0.1278944314,
0.1356343031,
-0.0819737613,
0.4372631311,
0.0055118278,
-0.283595264,
0.0510566458,
-0.3815603852,
-0.271969676,
-0.1230664179,
-0.1756995022,
0.0755282715,
-0.5338535905,
0.1250911653,
0.1443606764,
-0.4663890898,
-0.2009460628,
-0.1746475995,
0.3171273172,
-0.1507845968,
0.1184733212,
0.3016983867,
0.2242137939,
-0.5199455023,
0.0049656667,
0.0979949832,
-0.1183571219,
-0.2367059141,
0.152015835,
0.4170896411,
0.0140931532,
0.3155950308,
-0.0823282301,
0.2053256929,
-0.1119858176,
-0.1444023997,
-0.2095375508,
0.0242703613,
-0.0174171776,
0.1545066237,
0.4832893908,
-0.0132777477,
-0.2494759113,
-0.1152050421,
0.1660110801,
-0.1901400387,
0.4341742992,
-0.0041948874,
0.0465076342,
-0.3614524603,
-0.1916786432,
-0.1078750491,
0.3816646039,
0.2352101654,
-0.165778935,
0.0189093612,
-0.3135446012,
-0.2109050751,
-0.0437980592,
0.5165929794,
0.5619416237,
-0.2702685297,
-0.1021605656,
0.1821820885,
0.2397643179,
0.2302525789,
0.3319923878,
0.0419324227,
0.1244718358,
-0.1256992817,
0.0838079304,
0.3810040951,
-0.0728206486,
-0.0444148146,
-0.1370719224,
0.2496326417,
0.174049139,
0.0639344826,
-0.0021650717,
0.143586576,
-0.0364138484,
0.0722804368,
0.4770645201,
0.2560551167,
-0.045161631,
-0.3485658765,
-0.1867232919,
0.6346894503,
0.2341766655,
0.1699862331,
-0.2361822128,
-0.0794720054,
0.2613797188,
0.2289857417,
-0.1427282989,
0.1414876878,
0.2766787708,
-0.2538864017,
0.2242306471,
0.4858231843,
0.03579887,
-0.3258080184,
0.2015610933,
0.0289694741,
0.2578412294,
-0.0120716132,
0.115736194,
-0.2651407123,
-0.2199256718,
-0.2003050148,
0.0093616936,
0.0058346577,
0.1158889756,
-0.0417717956,
0.5396043658,
-0.1149115115,
-0.2178822905,
-0.061249271,
-0.0855938345,
-0.4906998277,
0.4045854211,
0.009036433,
-0.1398371607,
-0.0711567476,
0.3765221834,
0.2453238964,
-0.0602580197,
-0.1392200738,
-0.2804967165,
0.202695787,
0.0408364609,
0.228862524,
0.0024846718,
0.2272122204,
-0.5271454453,
0.3992442787,
0.3866474628,
0.1387700289,
-0.1339145899,
0.0012375107,
0.2392910123,
-0.0441635661,
0.4396704435,
0.0859870315,
-0.0385915861,
0.4223068655,
0.0905775875,
-0.1323938817,
-0.0435805693,
0.485247761,
0.0732426941,
0.1626707911,
0.1913672239,
-0.1442709118,
-0.1936851889,
0.5177626014,
0.0909316689,
-0.0187759213,
0.3626835644,
-0.2295611501,
-0.0533269644,
-0.1021809429,
-0.2229628712,
-0.1273010969,
0.1584007442,
0.0243468061,
-0.4226489067,
0.039483726,
-0.1722596139,
-0.0449524745,
0.0495823026,
0.1654258966,
0.0041071121,
0.1203465015,
0.0365822837,
-0.3161319494,
0.0196043327,
-0.0089465305,
0.1602906436,
-0.2447583079,
0.0077649951,
-0.2301368564,
-0.5507709384,
-0.2677682638,
-0.0249453355,
-0.3480506837,
-0.194313705,
-0.0272616874,
-0.1914625317,
0.1517438293,
0.1324219704,
-0.0981024727,
0.1305911243,
-0.1925239265,
-0.0035213251,
-0.296541065,
-0.3732477427,
-0.2692625225,
-0.0616009459,
0.5387208462,
0.1602940261,
0.0685502142,
0.018986702,
-0.2335123718,
-0.1795652509,
-0.575345695,
-0.1952230334,
-0.2593545616,
0.1230404228,
-0.2514227331,
0.2690454721,
-0.1243938059,
-0.2165592611,
0.3017779887,
-0.0531302318,
-0.0528696813,
0.2159950435,
0.2227417827,
0.0537781604,
-0.0508999899,
-0.4319813848,
-0.1463400275,
-0.1911933422,
0.0201559868,
0.2135229111,
-0.0891559198,
0.1526997089,
-0.2282403409,
0.1837088764,
0.1528961509,
0.1669876277,
-0.0652984008,
0.3019634187,
0.4984689951,
0.0652891397,
-0.4329939783,
0.051956445,
-0.0042818859,
-0.0170333013,
0.0980362445,
-0.4875887036,
0.1571111232,
-0.1949493736,
-0.1083962917,
0.2408304811,
-0.0894412175,
-0.0174039304,
-0.102652505,
0.1484975815,
0.0643686354,
-0.2411847413,
-0.0574179366,
0.0351137221,
0.1684246361,
0.1630058736,
0.4710476398,
-0.2816253304,
0.5095039606,
0.2555079758,
0.0217599608,
0.2340550274,
0.312794745,
-0.0030243183,
-0.2253609896,
-0.3853046,
0.2120364904,
-0.1305787712,
-0.2007104456,
0.4930436313,
0.1017164141,
-0.4706681073,
-0.1700116098,
0.1370810419,
-0.4081181884,
-0.2867389321,
0.1878869832,
-0.3789226115,
0.0918691158,
-0.1119488925,
0.0144184753,
0.0737814903,
-0.4300968349,
0.0140182935,
0.1315656602,
0.4339996874,
0.0714244395,
-0.3578451276,
-0.1737830937,
-0.4213222861,
0.2797152996,
-0.0169851147,
0.5153198838,
-0.2730412185,
-0.0953854173,
0.2180935889,
0.089982897,
0.1950145364,
-0.1942403913,
-0.0948113948,
0.3745183945,
0.0353588313,
-0.542188704,
-0.0446794629,
-0.1850261092,
0.0787378699,
0.4650759995,
0.2198140025,
-0.3708887696,
-0.2298642248,
0.0967989117,
0.3545047641,
-0.1607289016,
-0.0920627564,
-0.4532788694,
-0.263035059,
-0.4781174362,
-0.1487604082,
-0.1451950222,
0.1708307117,
-0.2001549304,
-0.0280960649,
-0.0539930165,
0.1814185381,
0.2792244554,
0.1019651145,
0.041267287,
0.3717176318,
0.0310330242,
0.2131711394,
-0.0054113967,
-0.2380087227,
0.267273128,
-0.1106414869,
-0.2622879148,
-0.0459534489,
-0.2169791907,
0.500957489,
0.0594170615,
0.0803889781,
0.2558211982,
0.0931422412,
-0.171389848,
-0.1147556007,
0.0068124123,
0.372396946,
0.1456339508,
-0.4133529663,
-0.4464229643,
0.4641005695,
0.0144127011,
-0.1129807383,
0.2672718167,
0.1448345035,
-0.4239816964,
0.5985948443,
0.1311941147,
0.9971318245,
-0.3668914437,
0.2018921673,
-0.2075109035,
0.2028150409,
0.3728064895,
0.0160653591,
-0.1227189749,
-0.38411659,
0.2088966966,
-0.0226502977,
0.1443276703,
0.0302235503,
-0.0453521088,
-0.3547634482,
0.187781617,
0.1019594222,
0.2176282704,
0.0884160548,
0.3984997272,
0.2086005211,
-0.4390073419,
-0.3299593031,
0.006199602,
-0.1620274782,
0.2169493586,
-0.0881795213,
0.0771138743,
-0.0005549714,
-0.0640531927,
-0.3029725552,
0.2212538123,
-0.2876313031,
0.1381064653,
-0.1255612373,
-0.3206406236,
0.4721293747,
0.5965862274,
0.3510194421,
0.2904990911,
-0.1735069007,
0.0105505157,
-0.1729169041,
-0.2807389498,
0.0505276136,
0.0023191608,
0.0547880456,
-0.0329856426,
-0.4186166227,
0.1099377573,
0.15391469,
-0.0091172457,
-0.3345568776,
0.0581937917,
0.0520101599,
-0.0574333072,
-0.2535777092,
0.0690898374,
-0.183730647,
-0.0002329201,
0.0812404156,
0.0987494364,
-0.021315515,
0.2303001434,
0.5099387765,
-0.2384016812,
-0.0022813752,
0.2756130099,
-0.0330423675,
0.0459994823,
0.8242175579,
0.3531858325,
-0.2382497936,
-0.1604871154,
-0.2077345252,
-0.3506247699,
-0.3734710217,
-0.2744708955,
0.1705955863,
0.1926057339,
-0.1020285562,
0.1096994057,
0.0579545982,
-0.4136629999,
-0.0560124777,
-0.6720861793,
-0.1262618005,
0.3311508596,
-0.1153272018,
0.0324969143,
-0.0142593831,
-0.5099300742,
0.1948367059,
0.1946292073,
-0.1821058691,
-0.0208348352,
-0.2921030521,
0.2459237725,
0.2934971154,
-0.1317526847,
-0.0726342723,
-0.018871624,
0.0475854352,
-0.1795154214,
0.1140777096,
-0.1495864987,
-0.0031642243,
0.1556257904,
0.1842787117,
-0.1620334983,
-0.2362392098,
-0.3010373116,
0.2912216187,
-0.3832780719,
-0.1784517169,
-0.1011086926,
-0.0268250331,
0.3944607675,
0.0486963019,
0.2048485875,
-0.1685722917,
0.1366663426,
-0.1157345772,
0.0159161687,
0.4320170879,
0.0235157907,
0.4140546024,
0.1401092708,
-0.2484563589,
0.0704443902,
-0.0075612627,
0.0395553261,
0.2657287717,
-0.244988665,
0.3485543728,
0.1776705533,
0.0376953632,
0.4427364469,
0.0143102929,
-0.0834107921,
0.4294578433,
0.0554973707,
-0.272662878,
0.2773705125,
0.7467556,
0.0200000554,
-0.2252028286,
0.2666770518,
0.1365983337,
-0.2904506028,
0.1645984501,
-0.0856709927,
0.1493193805,
-0.1826092303,
0.0764396042,
0.0454645306,
0.2334898561,
0.0234494321,
-0.0325810537,
0.0579325967,
-0.0885805562,
0.0422124155,
0.048065532,
0.2361266911,
-0.1614204347,
0.3138410747,
-0.0651849508,
-0.2827222943,
0.159536168,
0.423085779,
0.0679792091,
0.0578720644,
-0.0029424168,
0.5064712763,
0.2450963706,
-0.1024083346,
-0.2211092114,
0.3179285526,
-0.0194962714,
-0.3253655434,
-0.2263768613,
-0.0569536462,
-0.1090922579,
-0.0277841277,
-0.0417715944,
-0.2031986862,
0.0063875988,
0.4082835913,
-0.1870368421,
-0.4960331619,
0.3953443766,
-0.211919412,
0.2540354431,
-0.2087826431,
0.2732509971,
0.0641728789,
0.0511594638,
0.0361327343,
0.3524007797,
0.2060097605,
0.2829678059,
-0.5382404327,
0.1005953252,
0.023157429,
0.1314055771,
0.0017923564,
0.4927105606,
0.1881441623,
0.3580557406,
0.2191221714,
0.045040071,
-0.0004442222,
-0.1535633802,
0.2608041763,
-0.1749699116,
-0.1246765107,
-0.0879454762,
-0.0300964173,
0.031368576,
-0.3653178215,
-0.0056127049,
-0.3547832072,
0.1193558574,
0.2188401073,
0.0083385743,
-0.0003708787,
0.0117874779,
-0.0110135674,
-0.0026242901,
0.6466361284,
0.602710247,
0.3966040909,
-0.2353250533,
-0.1963378936,
-0.6136159897,
-0.0775847882,
-0.1869960427,
-0.0533822849,
0.1000553668,
0.1207705289,
0.214269489,
-0.0674329326,
0.0162819177,
-0.1170124039,
-0.1347687244,
0.2401025742,
-0.3762450814,
-0.0697816685,
-0.0522628278,
0.1122819036,
-0.4041320384,
-0.0911910385,
0.0363357365,
-0.2400301546,
-0.1073956415,
-0.0076478906,
-0.0945884883,
-0.0707324743,
0.1332759857,
0.2698801756,
0.0466824397,
0.6554788351,
-0.040615581,
0.0991714001,
-0.1925158054,
-0.2858042419,
0.0517743044,
0.2309684455,
-0.0677306727,
0.2449162155,
-0.1834987998,
0.0795510113,
-0.145066753,
0.3292112648,
-0.1423067003,
-0.0384933949,
-0.0850029215,
-0.1523202807,
-0.1048033684,
-0.1643171906,
-0.0506856777,
0.1647797823,
-0.156388104,
0.014332138,
-0.3151752353,
-0.0461163074,
0.2633312345,
-0.4905761778,
-0.3067493737,
0.0222112387,
0.2202166319,
0.16707623,
0.1308673471,
-0.5824355483,
0.0124707669,
0.2963414192,
0.1136174127,
0.0200182199,
0.2618849874,
0.202668786,
-0.1604252458,
-0.2549133301,
-0.1259837449,
0.0105355382,
0.0325700417,
-0.1665250808,
-0.251254797
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Hi @stefan-it! Thanks for your interest in our work! We do plan to release the code, but we will make it available once the paper has been published at a conference. Sorry for the inconvenience!
Hi @lhoestq, do you have any insights for this issue by any chance? Thanks! | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 49 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Hi @stefan-it! Thanks for your interest in our work! We do plan to release the code, but we will make it available once the paper has been published at a conference. Sorry for the inconvenience!
Hi @lhoestq, do you have any insights for this issue by any chance? Thanks! | [
0.2403037846,
-0.2484168708,
-0.1631049812,
0.4403243065,
0.1737350076,
0.2488846481,
0.1497556269,
0.1210620701,
0.7601823807,
-0.2220044285,
0.0222222134,
0.0713402927,
0.0954832882,
-0.1848980039,
0.1068776473,
-0.1616458893,
0.0440615863,
-0.1030345708,
0.0729286298,
-0.3541831374,
-0.3202170432,
0.2861115336,
-0.3100569844,
0.0425743125,
-0.2333231121,
0.2517734766,
0.0728948861,
-0.1765369177,
0.1209297702,
-0.3202479184,
0.1578958631,
0.231483981,
0.325101912,
0.085267745,
-0.0001190338,
-0.0872880369,
0.6546163559,
-0.1592823565,
-0.4487918913,
-0.1851799786,
-0.0973319933,
-0.3896653354,
0.2533378303,
-0.1280305386,
0.0163539499,
-0.2522680461,
0.2647598088,
-0.5146148801,
0.1419246793,
0.0559384562,
0.2139524519,
-0.1379256248,
0.0227348432,
0.1068887264,
0.2769063413,
0.0649671406,
0.1694056392,
0.1623744071,
0.0854749829,
-0.2234980464,
-0.0387616977,
0.113703087,
-0.1003309563,
-0.1278005093,
0.1507623196,
-0.1206090599,
-0.1666322798,
-0.6842164397,
0.1447825432,
0.2033782303,
0.6822655797,
0.0585356988,
-0.2979769707,
-0.1612643898,
0.0933202207,
0.0761077404,
0.1409388185,
0.3160326779,
-0.0881595835,
-0.0372595266,
-0.0245254338,
-0.2631929219,
-0.0459628552,
0.5973466039,
-0.0228504017,
0.3351398706,
0.0907269865,
0.1467965394,
0.0506305806,
-0.1951537579,
-0.278349489,
-0.1947201937,
0.4301373959,
0.3810805976,
-0.3087663949,
0.4801421463,
-0.0456434526,
0.3292754591,
0.0133884624,
-0.0930942595,
-0.1806411594,
0.2417507172,
-0.1000084803,
0.1367214918,
0.1078263968,
-0.0790934861,
0.2475861907,
-0.1791077703,
0.3294418752,
0.0208847933,
-0.1913853735,
0.0133498851,
-0.3254073858,
-0.1911403984,
-0.1798247397,
-0.1045292541,
-0.0958528891,
-0.2829457223,
0.1404172629,
0.1610738188,
-0.2561334968,
-0.286463201,
-0.106617026,
0.2845074832,
-0.0664204508,
0.2138199508,
0.1953395903,
0.3847075403,
-0.4297504425,
-0.0301164221,
0.0409816951,
-0.0935205147,
-0.2020631135,
0.129285723,
0.3580398858,
0.1207704246,
0.3291495144,
-0.0694360361,
0.0810365751,
-0.155700475,
-0.0373727679,
-0.2140842974,
-0.0087237135,
0.0356199071,
0.1680489033,
0.5185455084,
0.0808481574,
-0.2173631638,
-0.1059135869,
0.1191342026,
-0.1258129328,
0.3667226136,
0.0681523532,
0.1092631817,
-0.2834847569,
-0.1489903629,
-0.2708299458,
0.2827831805,
0.1446894556,
-0.1428504288,
0.0068382435,
-0.3387594819,
-0.1361777484,
-0.093292132,
0.3333098292,
0.6020078063,
-0.2379500419,
-0.1239937469,
-0.1097204164,
0.0625885874,
0.268011421,
0.2264482677,
-0.0461873449,
0.2239890099,
-0.156692028,
0.0619399101,
0.3468514979,
-0.046979636,
0.0503413901,
-0.06758973,
0.2577610314,
0.2884016037,
-0.0052857995,
0.0463493727,
0.1783653647,
0.1373413503,
0.1054223925,
0.5019159913,
0.2646088302,
-0.0576130375,
-0.2766193151,
0.0062877685,
0.5969916582,
0.3590584695,
0.1414650381,
-0.1886009276,
0.0241509993,
0.2261441648,
0.195943594,
-0.0028576758,
0.0716464072,
0.350929141,
-0.3852471709,
0.2675403059,
0.2548682094,
-0.0209834017,
-0.5135543942,
0.2265411019,
0.0287270993,
0.1059743911,
0.0159476418,
0.1545716524,
-0.3214332163,
-0.1303370893,
-0.2276049107,
0.219058454,
0.0747415572,
0.1921157986,
-0.121871978,
0.4916855693,
0.012967743,
-0.0874284804,
-0.0117389709,
-0.1381869912,
-0.5985739231,
0.4235672951,
0.0482075065,
-0.056223616,
0.0593153238,
0.2330971658,
0.3044796884,
-0.2093189806,
-0.0807319283,
-0.2575536072,
0.3645226657,
0.2046220601,
0.1982114017,
-0.0947814286,
0.2405408174,
-0.4418723583,
0.2254379988,
0.2603182793,
0.1516066939,
-0.2144681066,
0.0301999357,
0.2297265083,
-0.1430053413,
0.5375186205,
0.0430867374,
0.0495939627,
0.3477124274,
0.0680930465,
-0.1327031702,
-0.0794157013,
0.5763515234,
0.0573303737,
0.0827739984,
0.2758431435,
-0.2210944891,
-0.1473475248,
0.5132132769,
0.0761644989,
-0.0223132595,
0.2689855099,
-0.167255193,
-0.0592590496,
-0.0539216623,
-0.2574970722,
-0.1739652753,
0.15620251,
0.0415367186,
-0.4466872513,
0.0504267961,
-0.196802035,
-0.0201259851,
0.1313179582,
0.232264176,
0.1344465017,
0.1571953595,
-0.0975151658,
-0.3240647316,
-0.0681779981,
0.0862874463,
0.1361801773,
-0.1638787538,
0.0931760445,
-0.3838434219,
-0.4041523039,
-0.2602179945,
-0.25427562,
-0.4711833,
-0.3847305477,
-0.1355744898,
-0.0394926332,
0.1254094988,
0.1541991532,
-0.0528745428,
-0.0324827582,
-0.2601850331,
-0.1441572607,
-0.385861814,
-0.4119677246,
-0.3634985089,
-0.0191758089,
0.4967759252,
0.1376647949,
0.1182280332,
-0.0453858823,
-0.2144780904,
-0.1219035015,
-0.2909274697,
-0.2819657326,
-0.2903446853,
0.1524648815,
-0.2907550633,
0.4180138409,
-0.1580181718,
-0.0648813099,
0.3832048476,
0.0132899359,
0.0320149735,
0.2098768353,
0.2292373925,
0.1488261074,
-0.0189769864,
-0.4778476059,
-0.2821603715,
-0.2814056873,
-0.0662080348,
0.1745468378,
-0.0539821796,
0.0485929959,
-0.1069308668,
0.0735312104,
0.0665919036,
0.0155836381,
-0.0805285275,
0.3945895433,
0.4707880616,
-0.0173794869,
-0.3659877479,
0.0674861223,
0.074989967,
0.1151346564,
0.0657226294,
-0.5204167962,
0.1854236126,
-0.2083956748,
0.0451810583,
0.0980082452,
-0.0894571394,
-0.0342033319,
-0.1984019727,
0.1101851165,
0.028505899,
-0.2111651301,
-0.1685478985,
-0.0847698674,
0.1542384923,
0.0776108131,
0.3722037077,
-0.4040285349,
0.6679965258,
0.1800067872,
0.0724655241,
0.263615489,
0.2328598201,
0.0247049518,
-0.1845100075,
-0.3040194511,
0.0815445185,
-0.0401311815,
-0.0343115702,
0.3433052003,
0.0115958489,
-0.4260085225,
-0.1549558342,
0.1568579674,
-0.3191000521,
-0.2304742336,
0.2014162838,
-0.3361046612,
0.1445451826,
-0.0662021786,
-0.0007366091,
0.0227242261,
-0.3679700494,
-0.0963785946,
0.2318518609,
0.4085186124,
0.1300842613,
-0.3945160508,
-0.4773105383,
-0.2277917862,
0.2020530701,
0.0023552515,
0.5626488328,
-0.3096439838,
0.0228686333,
0.2210662663,
0.167770505,
0.3177934885,
-0.240072459,
-0.1420289725,
0.4882228374,
0.0156562477,
-0.638060987,
-0.0204226151,
-0.162219435,
0.0833282694,
0.3625947833,
0.1878808737,
-0.4364580512,
-0.2926008999,
-0.0732095242,
0.5856535435,
-0.1869585514,
-0.0772840828,
-0.50575459,
-0.1394652873,
-0.4415779114,
-0.1582533866,
-0.1779349148,
0.1417429745,
-0.276198864,
0.0048869178,
-0.0101856608,
0.2405769229,
0.1153623909,
0.1174698472,
0.1481310278,
0.3268493116,
-0.0886295065,
0.1726896167,
0.0501055829,
-0.175848037,
0.2731842697,
-0.0112893758,
-0.223464787,
-0.0317073613,
-0.1399450749,
0.3491229713,
-0.0059755594,
0.1135571152,
0.0953233838,
0.0818392187,
-0.1781452298,
-0.028683383,
0.0135914572,
0.278360039,
-0.0382185355,
-0.3387449384,
-0.4012159705,
0.5765024424,
0.0070207864,
-0.0892080292,
0.2786206603,
-0.0551361181,
-0.3789117932,
0.5338392258,
-0.0165800489,
0.8492920399,
-0.3891260624,
0.1554400772,
-0.143234387,
0.1330959499,
0.4436137378,
0.0589538515,
-0.0184577666,
-0.3015090823,
0.2767418921,
-0.0101543479,
0.0423013419,
-0.0207286216,
0.0980929136,
-0.1530328691,
0.2524522543,
-0.1435233355,
-0.0090275481,
0.0044083372,
0.4702905118,
0.2433362603,
-0.3317689896,
-0.3470062017,
0.0545625463,
-0.2036555856,
0.3451608419,
-0.0672629625,
0.1444679797,
-0.0432107076,
0.037894316,
-0.2116959691,
0.1651281863,
-0.1171960384,
0.2133904696,
-0.134457022,
-0.2575913668,
0.3556827903,
0.5074009895,
0.0215181336,
0.2731774449,
-0.1743642092,
0.1470683217,
-0.1626449525,
-0.2417932749,
-0.0147239827,
0.0709898621,
0.0129226465,
0.0283666402,
-0.3950638771,
0.0442971289,
0.0037672073,
0.0431340262,
-0.3927105367,
0.1890948266,
0.1144264713,
0.0836998075,
-0.3093761802,
-0.0286049992,
-0.151761964,
0.0350538641,
0.1089377701,
0.0696151853,
0.2089782953,
0.1458032429,
0.4594575763,
-0.2960388064,
-0.0933228284,
0.3001583219,
0.0638606474,
-0.0011978,
0.7996011972,
0.3861700296,
-0.3594212532,
-0.2298437953,
-0.2924444377,
-0.3995827436,
-0.4833539128,
-0.2949714065,
0.2098214924,
0.3583871424,
-0.2310599685,
-0.0729671121,
-0.0307940003,
-0.4497262537,
-0.0161585361,
-0.7090859413,
0.0165187865,
0.3147457242,
-0.0000941034,
0.0260792933,
-0.0276996195,
-0.4851160645,
0.2010199726,
0.0916263834,
-0.1756436974,
-0.0473134927,
-0.2248127609,
0.1954901814,
0.2992232442,
-0.0352142565,
-0.1304015964,
-0.0882099271,
0.0713465288,
-0.2821353376,
0.1325381249,
-0.1703801453,
0.0527290553,
0.1434515864,
0.1451937109,
-0.1037201434,
-0.2014232874,
-0.3925798833,
0.4189735055,
-0.3449261487,
-0.1525971144,
-0.1066155359,
-0.0266447812,
0.3194226027,
-0.0140297431,
0.1901138872,
-0.0774290711,
0.1578540206,
-0.0850619972,
-0.082388863,
0.3056703806,
-0.0068960031,
0.5810574889,
0.1099859625,
-0.5075942874,
-0.0799194276,
-0.0467281789,
0.1994894147,
0.2299569845,
-0.1565626264,
0.3871648312,
0.3396727741,
-0.0693702176,
0.5151049495,
0.0073438026,
-0.0528737232,
0.2545532882,
0.0867434889,
-0.2415275127,
0.092068702,
0.6816081405,
0.1594069451,
-0.1839641035,
0.1778036952,
0.1591593027,
-0.3290594816,
0.138338238,
-0.0923919305,
0.1674385369,
-0.1172066927,
0.1580106765,
-0.0585480817,
0.2875806093,
0.0745178834,
0.0432787165,
0.1063265353,
-0.1758419573,
0.0457459465,
0.0872723609,
0.1847973168,
-0.0610205233,
0.1015652791,
-0.2019760907,
-0.3225004375,
0.0728745386,
0.4022851288,
0.0096957684,
0.2069606781,
-0.0112264082,
0.2832613587,
0.1767341495,
-0.2089666426,
-0.1305616796,
0.2014624178,
0.0201624837,
-0.2395564765,
-0.1525936425,
-0.0021777302,
-0.1380705386,
-0.0669317916,
-0.0068494966,
-0.2023609579,
0.0999130458,
0.4635054171,
-0.1395789385,
-0.5846255422,
0.4334032536,
-0.2078954875,
0.1689866632,
-0.1067217886,
0.167897284,
0.031039428,
0.0004861336,
0.0151001345,
0.127366066,
0.1169175953,
0.2159250379,
-0.3526304066,
0.0818491578,
-0.0694540441,
0.1497201174,
-0.0708672628,
0.5127889514,
0.1474295557,
0.3634584248,
0.260546416,
0.1120830178,
-0.0764614195,
-0.1830802858,
0.2877778411,
-0.0758491755,
-0.1495366991,
0.1175030842,
-0.1551468223,
0.0124112368,
-0.3886644244,
-0.0512387007,
-0.321736455,
0.2560249269,
0.1722008437,
0.1620510817,
0.0402545594,
-0.1091504395,
0.0207823142,
-0.1413677782,
0.666307807,
0.506669879,
0.3194197416,
-0.2758909762,
-0.1868965775,
-0.565941453,
-0.0348335393,
-0.2592557669,
-0.0048587173,
0.2345605493,
0.1528182626,
0.1511087716,
0.0310096145,
0.0640196726,
-0.1681589782,
-0.1787694395,
0.2347788215,
-0.3656272292,
-0.1523028314,
-0.200296849,
0.1293934435,
-0.3538476825,
-0.0602659993,
0.1703131497,
-0.2967703342,
-0.0597510487,
-0.0768338144,
-0.0956022516,
0.0310841352,
0.224775821,
0.2009019107,
0.025307091,
0.6212069988,
0.0268291831,
0.065591611,
-0.2723121941,
-0.2518820763,
0.1372407526,
0.3818349242,
-0.2070985734,
0.2753728628,
-0.1499735117,
0.1425305903,
-0.2214933932,
0.3350883722,
-0.1331694722,
0.0281582735,
0.0193345621,
-0.2070405483,
0.0183898062,
-0.1782642007,
0.0649187714,
0.2715235651,
-0.0717396289,
0.0544043854,
-0.2975776792,
-0.1280846149,
0.1845304072,
-0.3519734144,
-0.4541133344,
0.0557492152,
0.1558383703,
0.1332851797,
0.1779457927,
-0.3800543547,
0.0439231992,
0.4062697887,
-0.0453389809,
0.1094524041,
0.1945028752,
0.0544759929,
-0.1300241351,
-0.1876984835,
-0.1143444553,
0.0784825236,
-0.0117284954,
-0.0870876163,
-0.143393293
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | This is an issue on the `mwparserfromhell` side. You could try to update `mwparserfromhell` and see if it fixes the issue. If it doesn't we'll have to create an issue on their repo for them to fix it.
But first let's see if the latest version of `mwparserfromhell` does the job. | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 51 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
This is an issue on the `mwparserfromhell` side. You could try to update `mwparserfromhell` and see if it fixes the issue. If it doesn't we'll have to create an issue on their repo for them to fix it.
But first let's see if the latest version of `mwparserfromhell` does the job. | [
0.081005007,
-0.1919703633,
-0.1455878615,
0.3570192158,
0.141710639,
0.2244729698,
0.1284713149,
0.1698519289,
0.7591431141,
-0.1323024631,
0.1258178949,
0.0651474893,
0.0723556876,
-0.2256408632,
0.1120554879,
-0.1160824075,
0.0668615773,
-0.0926694721,
-0.1197490692,
-0.3306937814,
-0.3787724972,
0.2551675439,
-0.3586250544,
0.0707234368,
-0.2304443866,
0.2964782119,
0.0954792053,
-0.0933474526,
0.0668037385,
-0.3543367088,
0.1113954708,
0.1501277685,
0.3143436909,
0.0501734912,
-0.0001221014,
-0.0832126662,
0.7176686525,
-0.1283558607,
-0.417483598,
-0.1124235243,
-0.1387167573,
-0.4225859046,
0.1965475678,
-0.0338455439,
0.1579807997,
-0.0946852118,
0.254113555,
-0.4594912529,
0.1522707343,
0.1381664276,
0.1686459631,
-0.0437430069,
0.0378890336,
0.1722896397,
0.2732974291,
0.0850298703,
0.1379062235,
0.0558568239,
0.1328399777,
-0.2986983061,
-0.0256937146,
0.1416163892,
-0.1255059838,
-0.0742882788,
0.1094743311,
-0.0571213849,
0.0959101766,
-0.6544167399,
0.1758660674,
0.1662171185,
0.7025107741,
0.1253196597,
-0.25368011,
-0.1987036765,
0.0069874041,
0.1365218312,
0.2370205969,
0.4124925733,
-0.1284553707,
0.0225705002,
-0.0646740422,
-0.1853044182,
0.0156959966,
0.6050243378,
-0.055666659,
0.5664604306,
0.1225822568,
0.1940300465,
0.0965862572,
-0.2046297491,
-0.247741133,
-0.2428138852,
0.3923796117,
0.3300985694,
-0.1336006075,
0.4396108985,
-0.0657655597,
0.3765958548,
-0.0172559582,
-0.1677453816,
-0.3217782974,
0.3225260973,
-0.0982337892,
0.1468281001,
0.224214524,
-0.1507670283,
0.3737663627,
-0.3395673335,
0.3315448165,
0.0707734376,
-0.3030893505,
0.0902167261,
-0.2847906351,
-0.2116479278,
-0.2382558882,
-0.0894371122,
-0.0229717381,
-0.24581936,
0.2052040696,
0.1451723874,
-0.142918855,
-0.4003151953,
-0.1844203174,
0.3283185959,
-0.0951183289,
0.3528808653,
0.2199088335,
0.279997766,
-0.4712411761,
-0.1743669808,
0.0938560367,
-0.1050389409,
-0.3094646037,
0.2047818601,
0.3118846714,
0.2557213008,
0.2917594314,
-0.0396319404,
0.0920541286,
-0.2491995543,
-0.1176884919,
-0.0920444056,
-0.0362549014,
0.0806822851,
0.2162984908,
0.5053889751,
0.1520206481,
-0.2430996895,
-0.1731179953,
0.1904239655,
-0.1428011209,
0.3200358748,
0.0128070731,
0.070048444,
-0.2752063274,
-0.0271424446,
-0.3407326043,
0.2371086478,
0.0794990584,
-0.1422310472,
-0.0680596158,
-0.4404706359,
-0.2845093608,
-0.0630102605,
0.2751733661,
0.6824001074,
-0.2337111831,
-0.0756818503,
-0.1166579723,
0.0289645605,
0.2386088222,
0.1880315393,
-0.1038098037,
0.2789823115,
-0.1744205058,
0.0498724729,
0.3225486875,
-0.0450439826,
0.0905494615,
-0.0659109205,
0.2943430543,
0.2915301919,
-0.0669318587,
-0.0763113201,
0.0243999586,
0.0208585858,
0.0509527698,
0.4110233188,
0.1989033371,
0.0261735171,
-0.2342590541,
-0.1510099769,
0.5229262114,
0.3386921585,
0.0948800743,
-0.1500118673,
0.0498551652,
0.340354532,
0.2524160147,
-0.0655559301,
0.1375988871,
0.3369293213,
-0.3582299054,
0.2677420676,
0.2269892097,
-0.125304237,
-0.5547102094,
0.2306484878,
0.0456745476,
0.2032353282,
0.0385235399,
0.2390528023,
-0.2087599933,
-0.0982532799,
-0.2538188398,
0.3036789894,
0.0383169502,
0.2109608054,
-0.104460448,
0.3884802461,
0.0119995922,
-0.0699290112,
-0.0308243595,
-0.0951843858,
-0.4704375267,
0.3934471011,
0.0150188096,
-0.135568887,
0.0250267088,
0.2128003836,
0.3387267292,
-0.1139988303,
-0.121291548,
-0.2969062924,
0.4140079021,
0.2411862314,
0.1252677143,
-0.0791358054,
0.2526992857,
-0.4128448367,
0.2862430215,
0.194645673,
0.1356677711,
-0.2306517363,
-0.0101968236,
0.2143627703,
-0.0276602861,
0.5676330924,
0.0566373169,
0.0776869878,
0.340313971,
0.019382827,
-0.0210545547,
-0.1676113904,
0.54621315,
0.0987182781,
0.0150503889,
0.2127049565,
-0.0319071561,
-0.0564373173,
0.544849515,
0.0253494978,
0.0208528135,
0.2844620943,
-0.2476341873,
-0.0999410599,
-0.0892348588,
-0.2863973081,
-0.0896073654,
0.1175358295,
0.0390073322,
-0.2837414443,
0.0367041342,
-0.1309303045,
0.0369655937,
0.214071691,
0.2218250185,
0.0949608982,
0.1513844728,
-0.196806252,
-0.2821566164,
0.0594322681,
0.0114387125,
0.1479181051,
-0.1853642762,
0.0757365674,
-0.4506167173,
-0.1568676531,
-0.3748693764,
-0.2476454079,
-0.5724465847,
-0.37827757,
-0.188910529,
0.072367996,
0.0509034544,
0.0925706998,
-0.0886129588,
0.0059154779,
-0.317831248,
0.0326353684,
-0.3865585029,
-0.4102073312,
-0.4526663125,
-0.1088002175,
0.5771565437,
0.1519236416,
0.0683281794,
-0.0507629253,
-0.2250637561,
-0.1655671597,
-0.3020946383,
-0.1577260345,
-0.2327003479,
0.2592413425,
-0.2609624267,
0.3641051948,
-0.0438749194,
-0.1165325716,
0.395242095,
-0.1275029927,
0.0709458515,
0.1280754209,
0.1700386107,
0.1128407866,
0.0035245717,
-0.4678865671,
-0.3583432138,
-0.1989179254,
-0.1360439807,
0.2233171165,
-0.0345680863,
-0.1036566198,
-0.149871856,
0.0007478148,
-0.0653175861,
-0.0321479738,
-0.1435093433,
0.3579944968,
0.4484447241,
-0.0049898736,
-0.3743656278,
0.1497587562,
0.0967430547,
0.0119668823,
-0.0661529675,
-0.5379255414,
0.0361199751,
-0.1953773648,
-0.0451237224,
0.08570683,
-0.0456170291,
-0.0016663708,
-0.1290930659,
0.1255060434,
-0.0732009783,
-0.3776084781,
-0.2028998733,
-0.0218625627,
0.195078522,
-0.00382578,
0.3918197155,
-0.368539989,
0.6188602448,
0.1327213049,
0.0883561075,
0.1921494603,
0.1955505461,
-0.0770061985,
-0.1292403787,
-0.2143441141,
0.1045694053,
-0.1352510601,
-0.030681219,
0.4353137612,
0.0935670361,
-0.4703202248,
-0.1284869611,
0.2265396416,
-0.3547601104,
-0.3016400635,
0.2073978782,
-0.0769027919,
0.0912336782,
-0.0251528099,
0.0043481514,
-0.0949113145,
-0.4156619906,
-0.0764417872,
0.2262275964,
0.3356663287,
0.1157967597,
-0.4246569574,
-0.47828269,
-0.2039899826,
0.1658689976,
0.1202295125,
0.5696464181,
-0.3223561347,
-0.0003042892,
0.2053593695,
0.0649918094,
0.4050049186,
-0.2661749125,
-0.0219338983,
0.4310588837,
0.0408876687,
-0.5106998086,
-0.0938522816,
-0.1408033222,
-0.0567446761,
0.4831958115,
0.1177916825,
-0.3134798706,
-0.2617877424,
-0.101522997,
0.5601981282,
-0.2487788796,
-0.0578354523,
-0.5154816508,
-0.1347946078,
-0.527618587,
-0.1725112498,
-0.1721641123,
0.1211989,
-0.1933708489,
0.033469677,
0.1754015088,
0.2532632351,
0.1501695216,
0.0883514583,
0.1000318229,
0.3204566836,
-0.0820175931,
0.0797393024,
0.1786416322,
-0.1024800539,
0.320413202,
0.045911897,
-0.1989247501,
-0.0307990424,
-0.1371487379,
0.3870813251,
0.1025648266,
0.1812933385,
-0.0358676501,
0.1762813032,
-0.0651324242,
-0.1640778035,
0.1738251746,
0.2553320229,
-0.0520868674,
-0.3193428814,
-0.3537043631,
0.6563942432,
-0.0665211082,
-0.0533948317,
0.2429166734,
-0.0126090646,
-0.4147314429,
0.4613977671,
0.0934209079,
0.9715637565,
-0.3171392679,
0.1448173523,
-0.1763693839,
0.0304250196,
0.5069678426,
0.0396604538,
-0.0675017089,
-0.2740965486,
0.3756840825,
-0.0274236687,
0.0040097311,
0.0358709842,
0.174750492,
-0.1739990711,
0.2709542215,
-0.0112915635,
0.0670561194,
0.0594732538,
0.4678329825,
0.2532625794,
-0.3550416827,
-0.3557908833,
0.0304089412,
-0.2388122678,
0.3524135649,
-0.0693886653,
0.0893614069,
0.0727595389,
0.0014336184,
-0.2758074105,
0.2437410951,
-0.0760526955,
0.16767326,
-0.1628612131,
-0.0867025852,
0.392252624,
0.3192016482,
0.0237108059,
0.1971579194,
-0.2213932127,
0.2150330544,
-0.2453537583,
-0.1812201142,
-0.0596248545,
0.1299698353,
-0.0255759861,
-0.0059780777,
-0.394752115,
-0.0175611228,
0.0111834779,
-0.0713634789,
-0.4252784252,
0.130371362,
0.1867263615,
0.0967244953,
-0.3776079416,
0.032780204,
-0.069438152,
-0.0047399029,
0.0795038342,
0.0087580252,
0.148255676,
0.305105567,
0.4470952749,
-0.2960820794,
-0.0706328377,
0.3325312734,
-0.1022140086,
-0.1241697222,
0.7613789439,
0.5131793618,
-0.3496615291,
-0.2417657375,
-0.2614116073,
-0.4295881093,
-0.4725288749,
-0.2641902864,
0.2634415925,
0.339651823,
-0.143969804,
0.0455077216,
-0.0183111187,
-0.4574541152,
-0.0471401326,
-0.6610635519,
0.0393412486,
0.2092494518,
-0.0146015007,
-0.0092409812,
0.0129329935,
-0.3130425513,
0.1598882079,
0.1986603141,
-0.1436172426,
-0.0569278151,
-0.2126782686,
0.1778739989,
0.3161409199,
-0.0280056186,
-0.206829682,
-0.0318055525,
-0.0100409314,
-0.2475093603,
0.0706068352,
-0.1326599419,
0.0354801193,
0.1784732938,
0.1666119993,
-0.1664201617,
-0.2140846103,
-0.332883805,
0.4437121749,
-0.3661095202,
-0.1377568543,
-0.1608034372,
-0.0988248289,
0.2900401354,
0.111506626,
0.1816404909,
-0.0278571546,
0.1566686928,
-0.0921216607,
0.055249922,
0.3821150661,
0.0441886596,
0.5658998489,
0.1548044831,
-0.2928232551,
-0.090357475,
0.0087496787,
0.2369122207,
0.2314941585,
-0.1619160324,
0.2521120012,
0.3678915501,
-0.1160195917,
0.480548054,
0.0019994639,
-0.0912088081,
0.2682614625,
0.0529296622,
-0.2074798048,
0.0183698311,
0.7501775026,
0.1219281852,
-0.2131170034,
0.1073691696,
0.2034206986,
-0.3128160536,
0.1306406409,
-0.0431425869,
0.3513777554,
-0.1563561112,
0.0493196547,
-0.0115220025,
0.2406991124,
0.1322971433,
0.0122130569,
0.0604779162,
-0.0349897705,
0.190199852,
0.145440653,
0.1794830263,
-0.0096063726,
0.1245274395,
-0.2170789838,
-0.3147488832,
0.0829715133,
0.4683622718,
0.1482069045,
0.16157794,
-0.0154722743,
0.1307286918,
0.0450581759,
-0.2188173532,
-0.2239313126,
0.2758843601,
0.0937470049,
-0.2819697559,
-0.2536246181,
0.0600824505,
-0.105129227,
0.0900981277,
0.0291756205,
-0.2653275728,
0.0694411919,
0.4912323654,
-0.1592134833,
-0.4413054883,
0.1000304222,
-0.2558703423,
0.0785364881,
-0.1379112303,
0.140327394,
0.055933293,
-0.0345931761,
-0.019260481,
-0.0161286406,
0.2129902542,
0.3521134555,
-0.4400801957,
0.062866807,
-0.0654184371,
0.1685571373,
-0.1672759056,
0.5017226934,
0.1632209867,
0.465002656,
0.2088685483,
0.0879562348,
-0.0324950106,
-0.2866949737,
0.269980371,
-0.1593204439,
-0.0804592669,
0.120146364,
-0.0552681424,
0.0659205168,
-0.3422129154,
-0.0335798375,
-0.2467506081,
0.293782413,
0.2424667776,
0.077846311,
0.0621774346,
-0.0159027353,
-0.0126866419,
-0.108187288,
0.6601423621,
0.5105105639,
0.2919297814,
-0.2291160524,
-0.2408703268,
-0.608042717,
-0.0933293477,
-0.3303569257,
-0.0876864046,
0.2293174565,
0.1256251931,
0.1672123522,
0.0515520871,
0.09771429,
-0.07652051,
-0.1580455452,
0.2019281387,
-0.3592011333,
-0.198122263,
-0.0247906968,
0.1949091256,
-0.385938853,
-0.1294590831,
0.2386542857,
-0.2436130196,
-0.1144220084,
-0.1381229162,
-0.1011750773,
0.0008641183,
0.2453067005,
0.2526299357,
0.1153824925,
0.6777518988,
-0.0509352908,
0.0873865411,
-0.3224084973,
-0.2137747258,
0.1348678321,
0.1828359365,
-0.3542960286,
0.2720740139,
-0.1901827753,
0.3042676449,
-0.3099836409,
0.3408548236,
-0.0733304396,
0.143373847,
-0.0161368996,
-0.2343813777,
-0.0079792291,
-0.1627302766,
0.1087228879,
0.2059751749,
-0.0367442742,
0.0858358666,
-0.2553946376,
-0.0138556734,
0.2228537649,
-0.3553269506,
-0.6054434776,
-0.0821304843,
0.1580117643,
-0.0303068124,
0.1140199825,
-0.4135736823,
-0.0624737591,
0.3374498487,
0.0369988345,
0.0575544685,
0.2023040205,
0.0482079983,
-0.2381666154,
-0.1790002584,
-0.1499948204,
0.1198800355,
-0.10011255,
-0.1120221391,
-0.1738822162
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | I think the workaround suggested in issue [#886] is not working for several languages, such as `id`. For example, I tried all the dates to download the dataset for the `id` language from the following link: (https://github.com/huggingface/datasets/pull/886) [https://dumps.wikimedia.org/idwiki/](https://dumps.wikimedia.org/idwiki/)
> >>> dataset = load_dataset('wikipedia', language='id', date="20210501", beam_runner='DirectRunner')
WARNING:datasets.builder:Using custom data configuration 20210501.id-date=20210501,language=id
Downloading and preparing dataset wikipedia/20210501.id (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /Users/.cache/huggingface/datasets/wikipedia/20210501.id-date=20210501,language=id/0.0.0/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/load.py", line 745, in load_dataset
builder_instance.download_and_prepare(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/builder.py", line 574, in download_and_prepare
self._download_and_prepare(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/builder.py", line 1139, in _download_and_prepare
super(BeamBasedBuilder, self)._download_and_prepare(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/builder.py", line 630, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/Users/.cache/huggingface/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 420, in _split_generators
downloaded_files = dl_manager.download_and_extract({"info": info_url})
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/download_manager.py", line 287, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/download_manager.py", line 195, in download
downloaded_path_or_paths = map_nested(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 203, in map_nested
mapped = [
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 204, in <listcomp>
_single_map_nested((function, obj, types, None, True)) for obj in tqdm(iterable, disable=disable_tqdm)
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 142, in _single_map_nested
return function(data_struct)
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/download_manager.py", line 218, in _download
return cached_path(url_or_filename, download_config=download_config)
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/file_utils.py", line 281, in cached_path
output_path = get_from_cache(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/file_utils.py", line 623, in get_from_cache
raise ConnectionError("Couldn't reach {}".format(url))
ConnectionError: Couldn't reach https://dumps.wikimedia.org/idwiki/20210501/dumpstatus.json
Moreover, the download speed for `non-en` languages is very slow, and interestingly the download stopped after approximately a couple of minutes due to a read time-out. I tried numerous times and the result is the same. Is there any feasible way to download non-en languages using huggingface?
> File "/Users/miislamg/opt/anaconda3/envs/proj-semlm/lib/python3.9/site-packages/requests/models.py", line 760, in generate
raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='dumps.wikimedia.org', port=443): Read timed out.
Downloading: 7%|████████▎ | 10.2M/153M [03:35<50:07, 47.4kB/s] | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 274 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
I think the workaround suggested in issue [#886] is not working for several languages, such as `id`. For example, I tried all the dates to download the dataset for the `id` language from the following link: (https://github.com/huggingface/datasets/pull/886) [https://dumps.wikimedia.org/idwiki/](https://dumps.wikimedia.org/idwiki/)
> >>> dataset = load_dataset('wikipedia', language='id', date="20210501", beam_runner='DirectRunner')
WARNING:datasets.builder:Using custom data configuration 20210501.id-date=20210501,language=id
Downloading and preparing dataset wikipedia/20210501.id (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /Users/.cache/huggingface/datasets/wikipedia/20210501.id-date=20210501,language=id/0.0.0/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/load.py", line 745, in load_dataset
builder_instance.download_and_prepare(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/builder.py", line 574, in download_and_prepare
self._download_and_prepare(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/builder.py", line 1139, in _download_and_prepare
super(BeamBasedBuilder, self)._download_and_prepare(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/builder.py", line 630, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/Users/.cache/huggingface/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 420, in _split_generators
downloaded_files = dl_manager.download_and_extract({"info": info_url})
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/download_manager.py", line 287, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/download_manager.py", line 195, in download
downloaded_path_or_paths = map_nested(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 203, in map_nested
mapped = [
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 204, in <listcomp>
_single_map_nested((function, obj, types, None, True)) for obj in tqdm(iterable, disable=disable_tqdm)
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 142, in _single_map_nested
return function(data_struct)
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/download_manager.py", line 218, in _download
return cached_path(url_or_filename, download_config=download_config)
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/file_utils.py", line 281, in cached_path
output_path = get_from_cache(
File "/Users/opt/anaconda3/envs/proj/lib/python3.9/site-packages/datasets/utils/file_utils.py", line 623, in get_from_cache
raise ConnectionError("Couldn't reach {}".format(url))
ConnectionError: Couldn't reach https://dumps.wikimedia.org/idwiki/20210501/dumpstatus.json
Moreover, the download speed for `non-en` languages is very slow, and interestingly the download stopped after approximately a couple of minutes due to a read time-out. I tried numerous times and the result is the same. Is there any feasible way to download non-en languages using huggingface?
> File "/Users/miislamg/opt/anaconda3/envs/proj-semlm/lib/python3.9/site-packages/requests/models.py", line 760, in generate
raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='dumps.wikimedia.org', port=443): Read timed out.
Downloading: 7%|████████▎ | 10.2M/153M [03:35<50:07, 47.4kB/s] | [
0.1800149381,
-0.1523415446,
-0.09894903,
0.4668174684,
0.1006341353,
0.2423329949,
0.1486940235,
0.1840969026,
0.7133536935,
-0.2562915087,
-0.0293330774,
0.0879318863,
0.1379879266,
-0.0704497248,
0.0778177902,
-0.2036104202,
0.0833219066,
-0.1366673708,
0.0640653074,
-0.2263312936,
-0.32865116,
0.3770776987,
-0.3152111471,
-0.0339933895,
-0.3056197464,
0.2438494414,
0.0005958751,
-0.1118373498,
0.1123137772,
-0.4247995615,
0.2647846341,
0.3021339774,
0.3125078082,
0.175404489,
-0.0001204848,
-0.0556482747,
0.704850018,
-0.1893925071,
-0.5230682492,
-0.2352165133,
-0.0554582961,
-0.4046286941,
0.2281315327,
-0.1170848906,
-0.0360486358,
-0.2035436332,
0.1830866784,
-0.3900808096,
0.1351652741,
0.0376098566,
0.1968803108,
0.0330658033,
0.1856202781,
0.034865845,
0.272990346,
0.1124170423,
0.1574663818,
0.1152703613,
-0.0042904615,
-0.1470837295,
0.006008815,
0.1632695049,
-0.0448037274,
-0.1061686128,
0.2156064957,
-0.0627297759,
-0.1983723491,
-0.651352644,
0.2012461722,
0.2551748455,
0.6670199037,
-0.0362071097,
-0.4080485106,
-0.2918709517,
0.008493267,
0.0092043057,
0.2186989039,
0.2262768149,
-0.1332880706,
-0.0248332992,
-0.0666921809,
-0.2193876803,
-0.0538425632,
0.6153123975,
-0.0989302471,
0.372101903,
0.0412183478,
0.1525869519,
0.1164748967,
-0.2843741477,
-0.3300949931,
-0.1842394918,
0.3082188368,
0.4228442609,
-0.2161459327,
0.3507114053,
-0.0911563784,
0.3997014463,
0.085133791,
-0.1119419634,
-0.2023399621,
0.2832680643,
-0.0947816223,
0.1178442389,
0.2353040427,
-0.0526314303,
0.2513491511,
-0.0842843354,
0.3670354486,
0.155224219,
-0.1794158816,
0.0032225009,
-0.2408721745,
-0.2111485302,
-0.2467988729,
-0.1010614336,
-0.0562511869,
-0.270691812,
0.2040661573,
0.1117737293,
-0.2330822796,
-0.2889910638,
-0.0754253194,
0.2926602066,
-0.1363919526,
0.1686061323,
0.121771045,
0.3821803331,
-0.4291240573,
-0.0510993525,
0.0246102884,
-0.083572641,
-0.1841305196,
0.1816867739,
0.4236652553,
-0.011211399,
0.3521848917,
-0.0009735711,
-0.0124726929,
-0.1143767908,
-0.1391679943,
-0.1302518249,
-0.0192520469,
0.0520947948,
0.1551323831,
0.5195740461,
0.0761608332,
-0.2763778567,
-0.22868523,
0.0425433069,
-0.1405331194,
0.2529542446,
0.0347803161,
0.0855044872,
-0.4118612707,
-0.157736659,
-0.220283702,
0.3061147332,
0.161356613,
-0.1992474496,
-0.0212133676,
-0.2634349167,
-0.1758727431,
-0.0303867422,
0.3723707795,
0.7429202199,
-0.2238101214,
-0.1922197938,
-0.0357125662,
-0.0061539235,
0.1993001103,
0.246270895,
-0.0146878064,
0.2034336478,
-0.1720423549,
0.065737009,
0.2481042296,
-0.1104742736,
-0.0709447786,
0.0185720026,
0.2829234004,
0.3354869485,
-0.0245476291,
0.0082482956,
0.1751098782,
0.0824776143,
0.0755902082,
0.4516780376,
0.2592275143,
-0.0364522934,
-0.2878524959,
-0.095509477,
0.4473000169,
0.3634182215,
0.111195758,
-0.1489386708,
0.0520697683,
0.3142111897,
0.20286946,
-0.017319018,
0.1274421811,
0.3553067744,
-0.4120816588,
0.2675914466,
0.1979928017,
-0.0266609639,
-0.6482115984,
0.3039933443,
0.0955560952,
0.1926102936,
-0.0523335859,
0.0743493065,
-0.3088148236,
-0.1769931018,
-0.277630657,
0.1526446491,
0.0550502315,
0.198576197,
0.0841119289,
0.4623351097,
-0.0035470352,
-0.0708871037,
0.0051470026,
-0.0730245113,
-0.6251958013,
0.3269505799,
0.0571065918,
-0.007304946,
0.0225015208,
0.2638174593,
0.3264118433,
-0.1936039627,
-0.05406297,
-0.2034283727,
0.2910214663,
0.341198355,
0.1482166201,
-0.0214506388,
0.3269129097,
-0.3324073255,
0.2147026956,
0.2711622715,
0.1093294472,
-0.2314001918,
-0.0771376789,
0.3427500725,
-0.1766520143,
0.5679540038,
0.0784942359,
0.0279700309,
0.3493210971,
0.0560685247,
-0.1829763055,
-0.2069451064,
0.4769164324,
0.0130712539,
0.1207298338,
0.25237602,
-0.2548974156,
-0.0791640133,
0.4247353077,
0.056790188,
-0.0763746798,
0.30219841,
-0.1847208589,
-0.0454659611,
0.0024954155,
-0.1456357837,
-0.1451689005,
0.1742580235,
0.1809597909,
-0.3056588769,
0.0288484693,
-0.2651313245,
0.0159962028,
0.1129487455,
0.0938935354,
0.1430918276,
0.144743681,
-0.0515068658,
-0.3176006973,
0.0371385738,
0.0587169752,
0.1647081673,
-0.2192393541,
0.0716494694,
-0.4835643172,
-0.4850091636,
-0.2407977134,
-0.2680954337,
-0.506727457,
-0.41045928,
-0.1513092667,
0.0539762937,
0.0409825072,
0.1727265716,
-0.009034887,
-0.0408731848,
-0.252690345,
-0.2420944422,
-0.4165678024,
-0.4249740243,
-0.3751246929,
-0.0479992777,
0.5167557001,
0.1541213393,
0.1876807213,
-0.1379989833,
-0.2960051894,
-0.2453895807,
-0.3184577823,
-0.1335679442,
-0.2716217339,
0.2574394047,
-0.2281762213,
0.3630480766,
-0.1546149105,
-0.0545477234,
0.4395361841,
0.0442843959,
0.0154842064,
0.1544585079,
0.1573006809,
0.2080965191,
-0.051299952,
-0.3745373189,
-0.3137007654,
-0.3135612011,
0.0784838796,
0.1661368012,
-0.012398269,
0.1932969391,
-0.1125323027,
0.0462636016,
-0.0381771214,
0.0103718638,
-0.1560000479,
0.1697085649,
0.4793509543,
-0.1100021303,
-0.3932743967,
0.0673183724,
0.0674257129,
0.1897399873,
0.0910400301,
-0.5468358397,
0.1703420281,
-0.150428012,
0.1138712987,
0.2040822059,
0.0073746778,
-0.0466769412,
-0.2249494195,
0.1329429448,
0.0290108323,
-0.2154765129,
-0.0998133719,
-0.1156411022,
0.1483949721,
0.1870358437,
0.3458721638,
-0.3495886922,
0.7946067452,
0.3428250551,
0.0824462622,
0.3724908531,
0.169915244,
0.1285422742,
-0.1782814711,
-0.3313497603,
0.0166985914,
-0.060523279,
0.0615107119,
0.3438504636,
0.0552458391,
-0.386154592,
-0.2321020067,
0.044662565,
-0.2911090851,
-0.3418125808,
0.1781961173,
-0.1911976486,
0.2100483775,
-0.0755980164,
0.0897875279,
-0.0583663061,
-0.3991449177,
-0.0373492539,
0.2391899973,
0.3913218379,
0.2014875561,
-0.4114879966,
-0.4465773702,
-0.3870900869,
0.2356890887,
-0.0073546171,
0.482668519,
-0.2531714439,
0.0031653568,
0.1816847771,
0.1617054343,
0.319250226,
-0.2490256578,
-0.1349230409,
0.4210759997,
-0.1227575541,
-0.7164521813,
0.0497483909,
-0.0690234601,
0.080678314,
0.3638059795,
0.3140455782,
-0.5319975019,
-0.3132553101,
-0.0475542992,
0.5816975832,
-0.1379835904,
-0.1933636069,
-0.3705927432,
-0.1753131151,
-0.5105838776,
-0.1583567411,
-0.1646461785,
0.1698282957,
-0.1786432266,
0.0079972669,
-0.018815808,
0.2356261015,
0.077411145,
0.1168572009,
0.1895965338,
0.3470455706,
0.0704396293,
0.2878076136,
0.0471600108,
-0.0871028453,
0.3934617341,
-0.0413501672,
-0.2258592099,
-0.0686271638,
-0.0754133314,
0.2024582326,
0.1237273589,
0.180600673,
0.0705814511,
0.134158209,
-0.1330599636,
-0.1399548352,
0.0249320716,
0.1640876532,
-0.0207959712,
-0.3369775414,
-0.4289215207,
0.629591763,
0.051032193,
-0.0160409361,
0.2564947605,
0.0874257609,
-0.4474817812,
0.3799234033,
-0.0563121662,
0.842880547,
-0.247880131,
0.1902818531,
-0.0803550929,
0.0971008167,
0.5161350369,
0.13196145,
-0.0354963616,
-0.2667683959,
0.37174052,
0.0381696597,
0.097698465,
-0.0479689054,
0.0650866777,
-0.1759567857,
0.341398865,
-0.1395684332,
0.0184767209,
-0.0198477805,
0.5373399258,
0.048616156,
-0.4129087329,
-0.3865795434,
0.0466350093,
-0.1613311619,
0.4640794098,
-0.0780521706,
0.0684453771,
-0.0245863944,
-0.0271554142,
-0.2905389071,
0.0905471593,
-0.1602772474,
0.123157084,
-0.1205274761,
-0.3054321706,
0.319547832,
0.4627718627,
0.1373739094,
0.125913471,
-0.2365531027,
0.1082671881,
-0.2404055446,
-0.2273079753,
-0.0074580126,
0.0549566373,
0.0817352161,
-0.0056702271,
-0.3299393654,
-0.0615575425,
-0.0077631176,
0.0784516931,
-0.4299234152,
0.1066841483,
-0.0029674331,
0.0635944679,
-0.3002034724,
-0.0961659104,
-0.0863901973,
-0.0379307792,
0.0927071124,
0.0291345064,
0.1981612891,
0.0762012154,
0.3280578256,
-0.258569479,
-0.1356019825,
0.3534422815,
-0.0675864369,
-0.0445448868,
0.7959244251,
0.375174135,
-0.3174499273,
-0.1682406813,
-0.2172056437,
-0.3076891899,
-0.5152921677,
-0.2519539893,
0.1604456306,
0.330883503,
-0.2394683659,
0.0721555054,
0.0040703565,
-0.4090077281,
0.0556102842,
-0.6211546659,
-0.0471360981,
0.3892133236,
0.019845333,
0.0419544652,
-0.09817601,
-0.3809344769,
0.1713947505,
0.0565866604,
-0.1696271896,
-0.0085771717,
-0.1524817944,
0.1862352192,
0.4146293104,
0.0372424945,
-0.1246513501,
-0.1072085947,
0.0359360613,
-0.1875634938,
0.0902869627,
-0.1480580568,
-0.0431226641,
0.1577671915,
0.1764921844,
-0.145552367,
-0.1077942103,
-0.3282789588,
0.3568215072,
-0.4021496475,
-0.0671137869,
-0.0220960826,
-0.0244816802,
0.3344536722,
-0.0031858748,
0.2768302262,
-0.0731009319,
0.1625114828,
0.0563128963,
0.0489858799,
0.2679499984,
0.0181406923,
0.3847284615,
0.1170411259,
-0.3176272511,
-0.0271483064,
-0.0381817967,
0.1088416576,
0.2414455712,
-0.1534074098,
0.2959812284,
0.41751495,
-0.0091352798,
0.5071923733,
0.001461044,
-0.0576083846,
0.3259931803,
0.0661522448,
-0.1879683584,
0.0828083977,
0.6814029217,
0.0829468966,
-0.118481569,
0.220780611,
0.2282132804,
-0.2920017838,
0.1522595137,
-0.0557174087,
0.2191085815,
-0.113162443,
0.2034055442,
0.0151085686,
0.2112384588,
0.1067376137,
0.0779900178,
0.1231912822,
-0.1127574071,
0.05201355,
0.0141875036,
0.1695987731,
-0.0638835132,
0.2166302055,
-0.1179408878,
-0.3135273457,
0.0741369277,
0.3286196291,
-0.0034966543,
0.2168740183,
-0.0233670138,
0.3721562922,
0.0895648077,
-0.1539682895,
-0.1557576209,
0.2957777977,
0.0512534305,
-0.2696963549,
-0.2077618688,
-0.1345244348,
-0.1193177029,
0.023681514,
-0.0822672546,
-0.3273008466,
0.0912086517,
0.3711729646,
-0.1423127055,
-0.6611459255,
0.3607363999,
-0.196574524,
0.2174196541,
-0.1351705939,
0.2060372829,
0.0114979818,
-0.043363478,
0.0537621863,
0.163427487,
0.2086668909,
0.2028916925,
-0.2874932587,
0.0840261579,
-0.0873385146,
0.1300367117,
-0.1065585241,
0.5063301325,
0.2059765309,
0.2298506349,
0.3066174984,
0.0775055587,
-0.065345265,
-0.3077971637,
0.2216906846,
0.0170844123,
-0.2135255337,
0.1237167493,
-0.241434589,
0.0527080446,
-0.2354444712,
-0.1045848206,
-0.3741670549,
0.2458757758,
0.1488330066,
0.2014047205,
0.1116239429,
-0.1091586053,
0.0167852007,
-0.1219195426,
0.6659026742,
0.4709484279,
0.2420376837,
-0.3118557334,
-0.2013340592,
-0.682638526,
-0.1007699072,
-0.2332902849,
0.0286015347,
0.2069205642,
0.1498174816,
0.1325330138,
0.0392778963,
0.0202886388,
-0.2368140072,
-0.0766947716,
0.1923871934,
-0.3165415823,
-0.1501757801,
-0.2111923993,
0.0761068389,
-0.305655688,
-0.0772879049,
0.2012036741,
-0.2959721982,
-0.1056287512,
-0.0678513646,
0.008313857,
0.019056201,
0.1428830773,
0.1978752464,
0.0781712011,
0.6389297247,
0.0099744946,
0.019310765,
-0.3615390658,
-0.2315557748,
0.030694671,
0.4513468146,
-0.3157904446,
0.3092701435,
-0.1534179151,
-0.0080641955,
-0.2956069708,
0.3202162683,
-0.1158283502,
0.0076613165,
0.0380926877,
-0.1818341315,
0.0075738654,
-0.1475877762,
0.1352751404,
0.3294769824,
-0.0790999681,
0.0743840262,
-0.3277178407,
-0.1718484163,
0.2044258416,
-0.4374213219,
-0.3669038117,
0.0459500588,
0.0716148317,
0.2126588821,
0.0860102624,
-0.4164582789,
0.0903614908,
0.3953001797,
-0.0862372965,
0.0810389966,
0.1426230669,
0.03734079,
-0.1932458878,
-0.2294643521,
-0.1520403624,
0.1861824989,
-0.1080211699,
-0.0250991955,
-0.1390414536
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Hi ! The link https://dumps.wikimedia.org/idwiki/20210501/dumpstatus.json seems to be working fine for me.
Regarding the time-outs, they must come either from an issue on the Wikimedia host side or from your internet connection.
Feel free to try again several times. | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 40 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Hi ! The link https://dumps.wikimedia.org/idwiki/20210501/dumpstatus.json seems to be working fine for me.
Regarding the time-outs, they must come either from an issue on the Wikimedia host side or from your internet connection.
Feel free to try again several times. | [
0.192068845,
-0.2602577806,
-0.1416168809,
0.3762247562,
0.1157504767,
0.2121074796,
0.1645561606,
0.1551885605,
0.7062771916,
-0.2246859521,
0.1179802343,
0.0496409163,
0.2343846411,
-0.2096702754,
-0.0094012264,
-0.221013099,
0.0925962552,
-0.1953918487,
0.0648474991,
-0.2729017735,
-0.2977041602,
0.2671945393,
-0.3978960812,
0.0271711163,
-0.185713768,
0.1696381718,
0.0080056265,
-0.1815735847,
0.0773586109,
-0.3585592508,
0.1663144231,
0.2452062219,
0.3375380039,
0.1511195153,
-0.0001189297,
-0.1920973808,
0.6491460204,
-0.1044017375,
-0.4709958732,
-0.0881614983,
-0.1650995761,
-0.3635074496,
0.2519366145,
-0.1299187243,
0.0853193402,
-0.1647309065,
0.2627396584,
-0.4774170816,
0.1545998752,
0.0544664562,
0.1764226109,
-0.044277139,
0.0150868148,
0.1050192565,
0.13926588,
0.0897765383,
0.1414279938,
0.0952253789,
0.1480287611,
-0.3328266442,
0.0266099367,
0.1536878049,
-0.1374326199,
-0.1619032919,
0.2353245616,
-0.1162594408,
-0.2261555791,
-0.6714558601,
0.1671419442,
0.2753605843,
0.6968306303,
0.1771869212,
-0.3633897603,
-0.2033319175,
0.0303314142,
0.1108859107,
0.2197709978,
0.3188805878,
-0.1332757026,
-0.0756758749,
-0.0223537106,
-0.1838999838,
-0.0571159311,
0.6162974834,
-0.032754235,
0.4338471889,
0.1757820547,
0.2188184559,
0.0114604328,
-0.2223478556,
-0.2297369242,
-0.2822296619,
0.2777376771,
0.2912100852,
-0.229287982,
0.3861021698,
-0.0092941821,
0.3389608562,
-0.0212208331,
-0.1730935872,
-0.170312658,
0.2720247507,
-0.0565526858,
0.150032118,
0.2194151878,
-0.1049372777,
0.2648622096,
-0.0503604077,
0.3448390067,
0.0464346707,
-0.2778553069,
-0.0219870359,
-0.2163527012,
-0.1501249373,
-0.1527250409,
-0.1844338924,
-0.1040599719,
-0.3015194237,
0.2841691077,
0.1173973382,
-0.1959317178,
-0.2869795561,
-0.189568907,
0.2224100679,
-0.1789708436,
0.3431456089,
0.2166810483,
0.3182796836,
-0.4698387384,
-0.0308832321,
0.0581999086,
-0.0295044631,
-0.20479092,
0.2661013603,
0.2671019435,
0.0934593529,
0.3063891232,
-0.0349105373,
0.0334959142,
-0.1906964481,
0.0378604531,
-0.1908875555,
0.008627288,
-0.0087973066,
0.1492285579,
0.6243290305,
0.10693492,
-0.2406967431,
-0.1143353134,
0.0173428711,
-0.1332241744,
0.3218245506,
-0.0038700309,
0.1082377881,
-0.2380399406,
-0.0695932806,
-0.2211725414,
0.2241665721,
0.1904153824,
-0.2009399533,
-0.0108834207,
-0.3347891271,
-0.1034783721,
-0.0116113313,
0.2275259644,
0.5787906647,
-0.2940458655,
0.0078720301,
-0.0782736391,
0.246021688,
0.2944016457,
0.2770062387,
-0.1376060247,
0.224449724,
-0.1103301793,
0.105093658,
0.230057776,
-0.0963964239,
0.0633382946,
-0.0674740598,
0.2769451737,
0.2506722808,
-0.1018456519,
0.0715541318,
0.3002308011,
0.0578250736,
0.1811819375,
0.5470145941,
0.1700083166,
-0.0198374204,
-0.3321274519,
0.0105104223,
0.4842860699,
0.2271985412,
0.0975400433,
-0.2832227647,
-0.0301947556,
0.3478218913,
0.1292705238,
0.0006160904,
0.1187986061,
0.3935267329,
-0.4654512405,
0.2700215578,
0.3104886413,
-0.0189499483,
-0.4321422577,
0.2205508798,
0.1768898368,
0.2310066223,
-0.0370153822,
0.1332352459,
-0.2768266797,
-0.0912003368,
-0.2021670043,
0.2402666956,
0.078774102,
0.11181584,
0.01467219,
0.550321281,
0.0871952921,
-0.2343343943,
-0.023013182,
-0.0929097086,
-0.5333669782,
0.4093144536,
0.0253340956,
-0.0747518539,
0.0875981301,
0.1141731068,
0.3228400648,
-0.1566503793,
-0.1230947375,
-0.2445737123,
0.2439672947,
0.342117399,
0.1769212186,
-0.0512001663,
0.2956597209,
-0.4270627499,
0.2064128816,
0.2302022874,
0.2040230334,
-0.2941741347,
-0.0031862864,
0.2778056264,
-0.1314049959,
0.579092741,
0.0430004001,
0.0254935846,
0.3273768127,
0.0562184975,
-0.1040595397,
-0.1395439357,
0.4879277349,
0.0925179124,
-0.1234053373,
0.2163669318,
-0.1414019167,
0.0244224127,
0.5701660514,
0.0576846749,
-0.141188845,
0.1743887067,
-0.1842638403,
-0.2130208164,
0.045165658,
-0.1988900006,
-0.1464825273,
0.1640645266,
0.1196082011,
-0.4188326895,
0.0040634908,
-0.1981057525,
0.0572084971,
0.0680594966,
0.1505714804,
0.1231385618,
0.1096358225,
-0.098469615,
-0.2919632792,
-0.0995786339,
0.0575047508,
0.1817236692,
-0.1256028861,
0.1521683335,
-0.3066745996,
-0.3357039392,
-0.1427483857,
-0.2836947143,
-0.4758935869,
-0.5971002579,
-0.131085217,
0.0254995767,
0.0646240413,
0.1711786836,
-0.0614617988,
-0.0896035731,
-0.10607934,
-0.102960825,
-0.4092003703,
-0.4708699286,
-0.3191872835,
-0.0478072315,
0.4986443222,
0.2129052877,
0.0927865654,
-0.0574296154,
-0.3174841404,
-0.195725888,
-0.2670000792,
-0.1926907152,
-0.2869165242,
0.1864595115,
-0.3020750284,
0.4951708317,
-0.1321395338,
-0.1355846226,
0.4025030434,
-0.0661303923,
0.1005681008,
0.1314281225,
0.1966308653,
0.1588289291,
-0.0201146938,
-0.3963413239,
-0.2530011535,
-0.221023947,
-0.1711594909,
0.0928544477,
-0.0977480561,
-0.057378687,
-0.029981697,
-0.0126927942,
0.0298291408,
-0.0714878738,
-0.0779577643,
0.220189184,
0.5501048565,
-0.0635477155,
-0.4144962728,
0.1688748896,
0.0828102455,
0.1014464349,
0.0397736095,
-0.5010534525,
0.1527960449,
-0.2344358265,
0.0599974245,
0.1821861714,
-0.0837003142,
-0.056918595,
-0.2782071829,
0.0634087026,
0.075475499,
-0.1482954323,
-0.1136239618,
-0.0790368915,
0.1974340826,
0.091221422,
0.3415471315,
-0.3307110667,
0.6799038053,
0.160436824,
0.0958408713,
0.1546343565,
0.2151066661,
-0.0109757595,
-0.1354507208,
-0.2767878175,
0.0811415762,
-0.0431490541,
-0.021531757,
0.2655890882,
0.0784380212,
-0.4907025099,
-0.1525413692,
0.145276621,
-0.321346432,
-0.3399423957,
0.1406853497,
-0.2062226981,
0.1402598321,
0.0673844367,
0.0754929259,
-0.0615368336,
-0.4176481664,
-0.0513379797,
0.1821240336,
0.3251452446,
0.0753134415,
-0.4876291156,
-0.4883401096,
-0.3488649428,
0.1095597148,
0.0329042189,
0.5476911068,
-0.2390421629,
0.0497858524,
0.2448568344,
0.1036564112,
0.2784719169,
-0.2613420784,
-0.0725744069,
0.3867154717,
0.0527335703,
-0.5512060523,
0.0222634301,
-0.0696460605,
0.076198861,
0.3663046956,
0.1582710743,
-0.4561686516,
-0.2515118718,
-0.1542204916,
0.593349576,
-0.1971589923,
-0.1010305136,
-0.3331769705,
-0.2064866722,
-0.4048654437,
-0.1718538851,
-0.1741794795,
0.1794888973,
-0.2179679275,
-0.0510047376,
0.056990169,
0.3927876949,
0.128698796,
0.1028496623,
0.0724303499,
0.1700818837,
-0.0082414299,
0.1224795207,
0.0387488194,
-0.0678741857,
0.246794343,
0.1005031019,
-0.0703676343,
-0.0824479014,
-0.0939302221,
0.2078877836,
0.0814424753,
0.2122065127,
0.1155674011,
0.0986323357,
0.0036281645,
-0.0799863338,
-0.011969883,
0.2462151647,
0.0859594271,
-0.3830393255,
-0.3961808681,
0.4086367786,
0.0116264224,
-0.0380274653,
0.4489440322,
0.028225245,
-0.3888244331,
0.4838346243,
0.0946416929,
0.9694379568,
-0.3626307249,
0.2214490771,
-0.0437713936,
0.2138671577,
0.4991882741,
0.1092955023,
-0.0625864565,
-0.329461813,
0.3004904091,
-0.0175756365,
0.0838003904,
-0.0148680117,
0.028921159,
-0.1912823915,
0.2490025759,
-0.0175567195,
0.0332378596,
-0.1363825798,
0.4640839398,
0.1043734252,
-0.3234972954,
-0.410351336,
0.0503704734,
-0.1101785451,
0.2641598284,
-0.0269806683,
0.136090368,
-0.0234081596,
-0.0133633837,
-0.2455649674,
0.1273057014,
0.0046776608,
0.1741745621,
-0.2429159433,
-0.1915962994,
0.3354976475,
0.4963086545,
-0.0401246138,
0.2873746455,
-0.137369439,
0.1501462311,
-0.3548802435,
-0.2688719332,
-0.072710529,
0.0108714849,
-0.0164028052,
0.0468469486,
-0.4290374815,
0.1894696355,
-0.0311859977,
0.0892991275,
-0.4014639854,
0.1624377668,
-0.0486571454,
0.0198917352,
-0.3122404218,
-0.0576992407,
-0.1851564348,
-0.0049887449,
0.096187748,
0.0825843289,
0.1815801561,
0.1937353313,
0.4425463676,
-0.2791810632,
-0.0839968771,
0.325494349,
-0.0386319458,
0.0233749151,
0.8597972393,
0.5083205104,
-0.409045577,
-0.2138402164,
-0.199401781,
-0.4488829076,
-0.3732962012,
-0.198515147,
0.1887005717,
0.4621221125,
-0.1589324474,
-0.0379788801,
0.0002928199,
-0.4264068604,
0.0035268366,
-0.7153944969,
0.077302739,
0.3578206599,
-0.0380159207,
-0.032780733,
-0.1383405477,
-0.5434530973,
0.1512973607,
0.1065286025,
-0.1819632202,
-0.0730728805,
-0.3117237687,
0.2369903773,
0.2586437464,
-0.0669716373,
-0.1493397653,
-0.1213871092,
0.0419023968,
-0.2143434286,
0.1584364474,
-0.1586949229,
0.0270054638,
0.1492963433,
0.1782793254,
-0.2000199705,
-0.112138629,
-0.4700894654,
0.4432992041,
-0.3254801631,
-0.1196746901,
-0.1218462437,
-0.1160029769,
0.2512050271,
0.0406893902,
0.2366197705,
-0.0806230754,
0.1934553385,
-0.0462983772,
0.1295408607,
0.2762294114,
0.0621566735,
0.6145498157,
0.1069129854,
-0.5067752004,
0.0503825024,
-0.1780495793,
0.1562203318,
0.4244709611,
-0.1226939112,
0.3545632362,
0.3840478659,
-0.0265257023,
0.5288015604,
-0.0064137131,
-0.0363181122,
0.2458726466,
0.0774638355,
-0.1324832439,
0.0445875712,
0.6764922142,
-0.021214515,
-0.1660684049,
0.1263346374,
0.0910315812,
-0.474735409,
0.0532795973,
-0.0454300568,
0.2685056329,
-0.1508268565,
0.1641149968,
0.0040682345,
0.2596120238,
0.129816696,
0.0720229521,
0.133285135,
-0.1690905988,
-0.0071402453,
0.0926574767,
0.1842785776,
-0.0161301438,
0.2660955191,
-0.2436092496,
-0.279232502,
0.0486121997,
0.385207653,
-0.0197932273,
0.1365385652,
0.0286119785,
0.1218475997,
0.0750942528,
-0.1625016928,
-0.0949183255,
0.3073095977,
0.0666258484,
-0.2399258614,
-0.2406445295,
-0.094600305,
-0.0292490721,
0.0449005961,
0.079350993,
-0.1638033986,
0.0914225727,
0.44560045,
-0.0483416244,
-0.5520179272,
0.4726902843,
-0.2859684527,
0.225538522,
-0.2370163202,
0.0579248965,
0.0388136953,
-0.0223018713,
0.007335525,
0.0700336546,
0.1823275685,
0.2420695126,
-0.3882777095,
0.1533539146,
-0.0922933966,
0.0992064476,
0.0095500574,
0.5086567998,
0.1092300564,
0.2271666974,
0.2142224014,
0.0890799165,
-0.1114998162,
-0.3165867627,
0.2277706265,
-0.0068158098,
-0.073211886,
-0.0116785504,
-0.116115734,
0.038386941,
-0.2987957597,
-0.0176454484,
-0.2505204678,
0.1779466867,
0.1882173121,
0.1119512469,
0.0408330336,
-0.2005803138,
0.012509156,
-0.1838636994,
0.7236379385,
0.5859831572,
0.28359586,
-0.2090543509,
-0.2439520508,
-0.4555476904,
-0.0817558467,
-0.4325774908,
-0.0857657567,
0.2726195753,
0.1955885291,
0.1392156482,
0.0434404761,
0.1211519092,
-0.0834993348,
-0.1928898394,
0.331289202,
-0.3818549812,
-0.2281170785,
-0.1718607843,
0.1640769094,
-0.4031392336,
-0.1043486372,
0.256142199,
-0.3159589469,
-0.0706807896,
-0.0782779977,
-0.0037423,
-0.0126480609,
0.2914524078,
0.1636821628,
0.1555775553,
0.5637648702,
0.1061394513,
0.0176238045,
-0.2532347739,
-0.2525509298,
0.1693686694,
0.3212293983,
-0.2114065588,
0.3466904163,
-0.1603902578,
0.0920934081,
-0.1886447966,
0.3205467761,
-0.2373629808,
0.0067880303,
0.04996223,
-0.2436492741,
0.0094699636,
-0.0830180719,
0.0681336671,
0.2913365662,
-0.0288167819,
0.0511912555,
-0.1973847449,
-0.1326449215,
0.1089984179,
-0.3755275011,
-0.5066684484,
0.054143995,
-0.001311034,
0.1312771738,
0.1871964633,
-0.3093902469,
-0.0325961113,
0.3663621247,
-0.006638227,
0.0827039182,
0.1522185951,
0.093211174,
-0.0943487212,
-0.2015012801,
-0.1099101901,
0.0467753857,
-0.0614158101,
-0.1624552757,
-0.1030108631
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | I was trying to download the dataset for the `es` language; however, I am getting the following error:
```
dataset = load_dataset('wikipedia', language='es', date="20210320", beam_runner='DirectRunner')
```
```
Downloading and preparing dataset wikipedia/20210320.es (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /scratch/user_name/datasets/wikipedia/20210320.es-date=20210320,language=es/0.0.0/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1...
Traceback (most recent call last):
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1368, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 492, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 548, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/wikicode.py", line 639, in strip_code
stripped = node.__strip__(**kwargs)
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 60, in __strip__
return self.normalize()
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 150, in normalize
return chr(htmlentities.name2codepoint[self.value])
KeyError: '000nbsp'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "download_dataset_all.py", line 8, in <module>
dataset = load_dataset('wikipedia', language=language, date="20210320", beam_runner='DirectRunner')
File "/opt/conda/lib/python3.7/site-packages/datasets/load.py", line 748, in load_dataset
use_auth_token=use_auth_token,
File "/opt/conda/lib/python3.7/site-packages/datasets/builder.py", line 575, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/opt/conda/lib/python3.7/site-packages/datasets/builder.py", line 1152, in _download_and_prepare
pipeline_results = pipeline.run()
File "/opt/conda/lib/python3.7/site-packages/apache_beam/pipeline.py", line 564, in run
return self.runner.run_pipeline(self, self._options)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/direct/direct_runner.py", line 131, in run_pipeline
return runner.run_pipeline(pipeline, options)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 190, in run_pipeline
pipeline.to_runner_api(default_environment=self._default_environment))
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 200, in run_via_runner_api
return self.run_stages(stage_context, stages)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 366, in run_stages
bundle_context_manager,
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 562, in _run_stage
bundle_manager)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 602, in _run_bundle
data_input, data_output, input_timers, expected_timer_output)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 903, in process_bundle
result_future = self._worker_handler.control_conn.push(process_bundle_req)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/worker_handlers.py", line 378, in push
response = self.worker.do_instruction(request)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 610, in do_instruction
getattr(request, request_type), request.instruction_id)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 647, in process_bundle
bundle_processor.process_bundle(instruction_id))
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1001, in process_bundle
element.data)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 229, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 356, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "/opt/conda/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
raise exc.with_traceback(traceback)
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1368, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 492, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 548, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/wikicode.py", line 639, in strip_code
stripped = node.__strip__(**kwargs)
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 60, in __strip__
return self.normalize()
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 150, in normalize
return chr(htmlentities.name2codepoint[self.value])
KeyError: "000nbsp [while running 'train/Clean content']"
``` | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 481 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
I was trying to download the dataset for the `es` language; however, I am getting the following error:
```
dataset = load_dataset('wikipedia', language='es', date="20210320", beam_runner='DirectRunner')
```
```
Downloading and preparing dataset wikipedia/20210320.es (download: Unknown size, generated: Unknown size, post-processed: Unknown size, total: Unknown size) to /scratch/user_name/datasets/wikipedia/20210320.es-date=20210320,language=es/0.0.0/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1...
Traceback (most recent call last):
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1368, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 492, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 548, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/wikicode.py", line 639, in strip_code
stripped = node.__strip__(**kwargs)
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 60, in __strip__
return self.normalize()
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 150, in normalize
return chr(htmlentities.name2codepoint[self.value])
KeyError: '000nbsp'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "download_dataset_all.py", line 8, in <module>
dataset = load_dataset('wikipedia', language=language, date="20210320", beam_runner='DirectRunner')
File "/opt/conda/lib/python3.7/site-packages/datasets/load.py", line 748, in load_dataset
use_auth_token=use_auth_token,
File "/opt/conda/lib/python3.7/site-packages/datasets/builder.py", line 575, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/opt/conda/lib/python3.7/site-packages/datasets/builder.py", line 1152, in _download_and_prepare
pipeline_results = pipeline.run()
File "/opt/conda/lib/python3.7/site-packages/apache_beam/pipeline.py", line 564, in run
return self.runner.run_pipeline(self, self._options)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/direct/direct_runner.py", line 131, in run_pipeline
return runner.run_pipeline(pipeline, options)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 190, in run_pipeline
pipeline.to_runner_api(default_environment=self._default_environment))
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 200, in run_via_runner_api
return self.run_stages(stage_context, stages)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 366, in run_stages
bundle_context_manager,
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 562, in _run_stage
bundle_manager)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 602, in _run_bundle
data_input, data_output, input_timers, expected_timer_output)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 903, in process_bundle
result_future = self._worker_handler.control_conn.push(process_bundle_req)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/portability/fn_api_runner/worker_handlers.py", line 378, in push
response = self.worker.do_instruction(request)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 610, in do_instruction
getattr(request, request_type), request.instruction_id)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 647, in process_bundle
bundle_processor.process_bundle(instruction_id))
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1001, in process_bundle
element.data)
File "/opt/conda/lib/python3.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 229, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 356, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
File "/opt/conda/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
raise exc.with_traceback(traceback)
File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
File "apache_beam/runners/common.py", line 1368, in apache_beam.runners.common._OutputProcessor.process_outputs
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 492, in _clean_content
text = _parse_and_clean_wikicode(raw_content, parser=mwparserfromhell)
File "/scratch/user_name/modules/datasets_modules/datasets/wikipedia/2fe8db1405aef67dff9fcc51e133e1f9c5b0106f9d9e9638188176d278fd5ff1/wikipedia.py", line 548, in _parse_and_clean_wikicode
section_text.append(section.strip_code().strip())
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/wikicode.py", line 639, in strip_code
stripped = node.__strip__(**kwargs)
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 60, in __strip__
return self.normalize()
File "/opt/conda/lib/python3.7/site-packages/mwparserfromhell/nodes/html_entity.py", line 150, in normalize
return chr(htmlentities.name2codepoint[self.value])
KeyError: "000nbsp [while running 'train/Clean content']"
``` | [
0.1763544977,
-0.1677631736,
-0.1718391776,
0.4323387742,
0.1707597971,
0.2831905484,
0.1634617448,
0.2103997022,
0.7203698158,
-0.2296730578,
0.0827717111,
0.0324950889,
0.0824815035,
-0.1710960269,
0.1047611982,
-0.1762410402,
0.0409066677,
-0.1311677545,
0.0619522706,
-0.3332502246,
-0.3264748156,
0.2989556789,
-0.2558050156,
0.0192325395,
-0.2063503265,
0.2564131021,
0.0552222058,
-0.1168860048,
0.0501077548,
-0.3417121768,
0.1863568723,
0.209916994,
0.3298727274,
0.1184751689,
-0.0001155041,
-0.0839762241,
0.646073401,
-0.1828086972,
-0.477791369,
-0.2485989481,
-0.1580173969,
-0.4117630422,
0.1903593242,
-0.1931241006,
0.0858210325,
-0.24235861,
0.2660349011,
-0.5352804065,
0.1104983389,
0.1375033855,
0.2382596135,
-0.1910076439,
0.0760798529,
0.0719040036,
0.3408458829,
0.0599357896,
0.1818887591,
0.1272433847,
0.0240779929,
-0.1747233868,
0.0576144606,
0.1426582038,
-0.1027080566,
-0.0544826165,
0.1305571496,
-0.0746287555,
-0.0935018361,
-0.6982359886,
0.1663735807,
0.248355329,
0.752777338,
0.0537604503,
-0.343313843,
-0.1573727429,
0.0807393566,
0.054195486,
0.1600307226,
0.2955704331,
-0.1713261604,
-0.0859520137,
-0.0020555146,
-0.2566791475,
-0.0361365974,
0.6079903841,
-0.0604825616,
0.3638173342,
0.1059381887,
0.148415342,
0.0372278653,
-0.2130876034,
-0.202787891,
-0.2086807638,
0.3671195507,
0.388355881,
-0.2663076222,
0.4091641903,
-0.0272481516,
0.2662192583,
0.0512343347,
-0.1118467003,
-0.1712980568,
0.2166760713,
-0.1110824645,
0.1603291035,
0.2141034305,
-0.1045852378,
0.2297863662,
-0.0994357318,
0.2828103304,
0.0523057729,
-0.1058019251,
0.0235036723,
-0.2767425478,
-0.2106364965,
-0.1899594367,
-0.0080815665,
-0.0186594017,
-0.3135660887,
0.1164931059,
0.1050033569,
-0.2387090325,
-0.2927421629,
-0.1104370058,
0.2609948516,
-0.140469119,
0.2598823905,
0.1735316366,
0.3614091873,
-0.4104596376,
-0.0088009872,
-0.0065339059,
-0.0349616557,
-0.2180784196,
0.1077686325,
0.3512763679,
0.1387449801,
0.3635691106,
-0.0446703881,
-0.0229276307,
-0.1345914006,
-0.0369592607,
-0.2337093651,
-0.071212925,
-0.0027836813,
0.1963966042,
0.5106025338,
0.068972446,
-0.2143215686,
-0.1292958409,
0.1026895791,
-0.1362960786,
0.3337814212,
-0.0152215417,
0.1500986665,
-0.3008663058,
-0.1998068094,
-0.262598604,
0.2575985789,
0.1306097209,
-0.1939143538,
-0.0293586142,
-0.3026608527,
-0.1415427476,
-0.0959850997,
0.3209594488,
0.5890485644,
-0.2810036838,
-0.1207377687,
-0.136845842,
0.0293475613,
0.2669730783,
0.2324832827,
-0.074437812,
0.2620213032,
-0.1818812191,
0.0912394226,
0.3925757408,
-0.0914892033,
-0.0333309062,
0.0226025023,
0.2543192506,
0.2409900725,
-0.0208703354,
0.0407413505,
0.1852903217,
0.1152123213,
0.0883844197,
0.5103181005,
0.1995158494,
-0.0357517079,
-0.2663413286,
-0.02220238,
0.588986814,
0.3162770271,
0.1695823967,
-0.186586231,
0.016465826,
0.3192101121,
0.2271891534,
-0.0213695765,
0.103580229,
0.3533835411,
-0.3805097342,
0.3051335216,
0.1942701638,
-0.0205477066,
-0.5447940826,
0.2211419791,
0.0312380791,
0.1221075207,
-0.0230358914,
0.1085034907,
-0.3356635571,
-0.1956944019,
-0.2104382217,
0.2059177458,
0.115301013,
0.2285693139,
-0.0890173018,
0.4337513447,
0.0359379202,
-0.0744439438,
0.0314514413,
-0.0982373133,
-0.5547196269,
0.443115443,
0.0008210996,
-0.0520538799,
0.0469285846,
0.1857424378,
0.2999596,
-0.1909609884,
-0.1066002399,
-0.2020516694,
0.348793596,
0.1899759769,
0.1270588785,
-0.1278115958,
0.2536049783,
-0.3870402873,
0.1879011542,
0.3023719788,
0.1857381016,
-0.1931845546,
-0.0406669863,
0.1949960589,
-0.120480068,
0.4849775136,
0.0450719446,
0.0869901106,
0.3448006213,
0.0223696977,
-0.0912766457,
-0.11437729,
0.5354328752,
0.0885522813,
0.0355971903,
0.21500507,
-0.1734351665,
-0.1120483354,
0.5037203431,
0.1128967106,
0.0008996241,
0.22619389,
-0.1032249629,
-0.0716238618,
-0.0189748742,
-0.1701520532,
-0.1271853894,
0.1564019769,
0.083888635,
-0.3795365691,
0.0664771944,
-0.195032537,
0.0010229051,
0.1105646044,
0.2348873317,
0.1291823834,
0.0971045867,
-0.0754141659,
-0.270868063,
-0.1090858951,
0.0891005322,
0.2237174362,
-0.1484309733,
0.0937628597,
-0.431865871,
-0.4758824706,
-0.26407215,
-0.2109763473,
-0.4392777085,
-0.4192176163,
-0.1311878115,
-0.023821475,
0.0547447056,
0.1354448497,
-0.0787020251,
-0.1223684102,
-0.2312017977,
-0.1154662892,
-0.3434457779,
-0.4542333186,
-0.3919793069,
0.0169906765,
0.495080024,
0.141469568,
0.1429106146,
-0.0854894146,
-0.2279533148,
-0.1821109056,
-0.253644824,
-0.2276082039,
-0.3089285493,
0.1341109872,
-0.2234797478,
0.4357999861,
-0.1184873581,
-0.1121195331,
0.3430632055,
0.005208537,
0.0267197415,
0.1757971346,
0.235289827,
0.1736937314,
0.0021610186,
-0.5106554031,
-0.3253026605,
-0.3045635223,
-0.1049588248,
0.1949893683,
-0.0025411472,
0.1089144722,
-0.1016928852,
0.0883664489,
0.0876667798,
-0.0099651795,
-0.0963254124,
0.2964365184,
0.4350727797,
-0.0457886234,
-0.3770375252,
0.1137982905,
0.0461982563,
0.1506628096,
0.1252918541,
-0.5083813071,
0.2113196552,
-0.1774634719,
0.0534329861,
0.0646805018,
-0.0603529587,
0.0183730815,
-0.1751562953,
0.0705563799,
0.048605442,
-0.180352658,
-0.1996345222,
-0.0730891451,
0.2446131855,
0.0437165461,
0.3491377831,
-0.3166741133,
0.647099793,
0.1936469972,
0.0915277898,
0.3157170415,
0.1844846159,
0.0257806201,
-0.1023906842,
-0.3294160068,
0.0942494571,
-0.0175164863,
-0.0070649013,
0.3261170983,
-0.05401453,
-0.4404800832,
-0.1798975766,
0.1821162254,
-0.3427730799,
-0.2643785179,
0.1805137545,
-0.3352517188,
0.1539509594,
-0.0801702663,
0.0648974031,
-0.0291188955,
-0.3887671232,
-0.1041919291,
0.2153417915,
0.3641253412,
0.1478044838,
-0.4153712392,
-0.4672617912,
-0.2456070185,
0.2328765392,
0.0414018556,
0.5650765896,
-0.2956768572,
0.0696781874,
0.1657710224,
0.1735633612,
0.3264966309,
-0.326179713,
-0.1534574628,
0.4508714676,
0.0462361127,
-0.7034319639,
-0.004207395,
-0.2033416629,
0.1663325429,
0.3576460481,
0.1388453096,
-0.4681186676,
-0.2272913158,
-0.0402059555,
0.5781701207,
-0.1646544337,
-0.1139072031,
-0.4263756275,
-0.2196539342,
-0.4382929504,
-0.1795292795,
-0.1568226516,
0.1889146566,
-0.2022856623,
0.011307165,
-0.0283576939,
0.2243599147,
0.0859734714,
0.0765161812,
0.1680492759,
0.3729920983,
-0.1032435149,
0.1669538766,
0.0377361998,
-0.1140469238,
0.3032961488,
0.0118588451,
-0.2465834618,
-0.0403532088,
-0.1105408296,
0.2490497231,
0.0052945688,
0.1370441914,
0.0612599216,
0.1077450663,
-0.1814014465,
0.0132249072,
0.0830998421,
0.2614064217,
-0.0501082614,
-0.3356151581,
-0.5175666213,
0.542670846,
0.049294427,
-0.0429766178,
0.2938234508,
-0.1385830492,
-0.3869177401,
0.4827124476,
-0.0211483948,
0.8443027139,
-0.2888145447,
0.1401821971,
-0.1172741652,
0.1775506437,
0.5058168769,
0.0213002563,
0.021809604,
-0.2752047777,
0.2812977135,
-0.0359647945,
0.076836586,
-0.0160987806,
0.1130458117,
-0.2303788066,
0.3406593204,
-0.1429937929,
-0.0339906514,
-0.0085052196,
0.5127865672,
0.1584012657,
-0.3336580694,
-0.3512772322,
0.0931841284,
-0.2316276133,
0.3745653629,
-0.0766454041,
0.1081151664,
-0.0638465732,
-0.0180389807,
-0.2965824306,
0.1969094723,
-0.1289566755,
0.2717767358,
-0.2028960288,
-0.2525973022,
0.314938128,
0.4847461283,
0.079706192,
0.2703461647,
-0.1776283383,
0.1221790016,
-0.1631235927,
-0.2654574513,
-0.0193036143,
0.0598704815,
0.0388401821,
0.0270221978,
-0.4331542552,
0.1020947248,
0.0171885118,
0.0461761653,
-0.3746178448,
0.1316767931,
0.0650542229,
0.0574658923,
-0.2938774228,
-0.0191368759,
-0.1510985196,
0.0080398321,
0.1236543357,
0.0664099306,
0.1730908006,
0.1505157948,
0.3648106456,
-0.2949020863,
-0.083943516,
0.3448216319,
0.0631322339,
-0.0334383585,
0.7968847156,
0.3733591437,
-0.3334772289,
-0.237900719,
-0.275855273,
-0.4379781485,
-0.407941103,
-0.2622452974,
0.1215663701,
0.4168596268,
-0.2348002791,
-0.0616233721,
-0.004354775,
-0.3975145519,
0.0172492489,
-0.6908321977,
0.019819811,
0.3004772067,
-0.0004533615,
0.0362746045,
-0.0524351448,
-0.428868413,
0.2243537903,
0.0289954338,
-0.2083336115,
-0.0463952497,
-0.2517011166,
0.1999659538,
0.3432351947,
-0.0210953411,
-0.1768623888,
-0.1202900112,
0.0963071734,
-0.2485561073,
0.1086609811,
-0.199216485,
0.0105912462,
0.1252684444,
0.1598917544,
-0.1354903579,
-0.1576244831,
-0.4065167904,
0.4006990492,
-0.2869610488,
-0.1482428312,
-0.0836087987,
0.0140242586,
0.372094959,
0.0178924408,
0.201733157,
-0.093744643,
0.1883347481,
-0.0662292615,
-0.0049582981,
0.2702947259,
0.0871795043,
0.6029509902,
0.0909567624,
-0.4829335511,
-0.0918361545,
-0.0459738523,
0.1904548407,
0.1906241477,
-0.1822913438,
0.402369082,
0.366453737,
-0.058333952,
0.5254052281,
-0.0139513724,
-0.0432626717,
0.2768574059,
0.1318715513,
-0.2207181901,
0.0970372409,
0.6333002448,
0.1795686036,
-0.1412868053,
0.1656075716,
0.1306850463,
-0.2902123928,
0.1639253944,
-0.0739220828,
0.1292051375,
-0.0764230639,
0.2255380899,
-0.0071147662,
0.2125225961,
0.0758071095,
0.0114294942,
0.0831692368,
-0.1004324332,
0.0805988908,
0.0960418656,
0.1570515037,
-0.079093352,
0.1025842726,
-0.2248826474,
-0.3368077874,
0.0790257454,
0.3334818482,
0.0219753124,
0.1849647611,
0.0201576445,
0.241878897,
0.1360519677,
-0.1676201224,
-0.1336250007,
0.2248079181,
0.0342130214,
-0.1987566352,
-0.2119289935,
-0.0630900115,
-0.1566894352,
-0.0681770444,
-0.0568804666,
-0.2547960877,
0.0527103022,
0.4515687227,
-0.1733063608,
-0.6086341143,
0.4393875301,
-0.1677063555,
0.0924091488,
-0.1355501264,
0.1694072634,
0.0415966436,
-0.0366638154,
0.007900767,
0.1931376755,
0.1218986511,
0.2390929461,
-0.3821932077,
0.066704087,
-0.020277543,
0.1204772368,
-0.0778522789,
0.4424695671,
0.1526799202,
0.3361575902,
0.308671236,
0.1398468763,
-0.092934832,
-0.1569002271,
0.2654615343,
-0.0708076209,
-0.1257315874,
0.157500416,
-0.1317601055,
0.0889851153,
-0.3516997993,
-0.0725110322,
-0.4017902315,
0.1466519833,
0.1726343632,
0.1591142565,
0.0643571615,
-0.1085579768,
0.0359785445,
-0.1587149501,
0.6551539898,
0.4563870728,
0.3284858465,
-0.2881633043,
-0.1525838077,
-0.5963298082,
-0.0118082426,
-0.3060023189,
-0.0847930089,
0.2159912586,
0.199280411,
0.1125137955,
0.103335008,
0.0917910337,
-0.1117608696,
-0.1631115377,
0.2765767276,
-0.3883312345,
-0.2146693021,
-0.1558445692,
0.0969344303,
-0.3057293594,
-0.1047947705,
0.1783331186,
-0.291610837,
-0.0240954161,
-0.1326547563,
-0.1322905719,
0.0373814404,
0.1854147464,
0.2569070756,
0.0480231494,
0.6660433412,
-0.0019647405,
0.0365837738,
-0.334002018,
-0.2727921009,
0.0568978377,
0.4294121265,
-0.1816427559,
0.2869806886,
-0.1499211043,
0.1362273544,
-0.2817759812,
0.3479063511,
-0.1118229851,
0.042360276,
0.0446398705,
-0.2241067141,
0.0207613558,
-0.1619875729,
0.057032533,
0.2371428609,
-0.0722227842,
0.070083797,
-0.2826006413,
-0.1661402434,
0.1918408573,
-0.2951718867,
-0.4527609348,
0.021107927,
0.1114524975,
0.1165711284,
0.1614441872,
-0.3354123235,
0.1012779772,
0.4468927085,
-0.041003406,
0.0854315609,
0.144568488,
0.0365343094,
-0.1233104393,
-0.1775791049,
-0.144780919,
0.0762891024,
-0.0864646956,
-0.0554756448,
-0.1618866622
] |
https://github.com/huggingface/datasets/issues/577 | Some languages in wikipedia dataset are not loading | Hi ! This looks related to this issue: https://github.com/huggingface/datasets/issues/1994
Basically the parser that is used (mwparserfromhell) has some issues for some pages in `es`.
We already reported some issues for `es` on their repo at https://github.com/earwig/mwparserfromhell/issues/247, but it looks like there are still a few issues. It might be a good idea to open a new issue on the mwparserfromhell repo. | Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks! | 60 | Some languages in wikipedia dataset are not loading
Hi,
I am working with the `wikipedia` dataset and I have a script that goes over 92 of the available languages in that dataset. So far I have detected that `ar`, `af`, `an` are not loading. Other languages like `fr` and `en` are working fine. Here's how I am loading them:
```
import nlp
langs = ['ar', 'af', 'an']
for lang in langs:
data = nlp.load_dataset('wikipedia', f'20200501.{lang}', beam_runner='DirectRunner', split='train')
print(lang, len(data))
```
Here's what I see for 'ar' (it gets stuck there):
```
Downloading and preparing dataset wikipedia/20200501.ar (download: Unknown size, generated: Unknown size, post-processed: Unknown sizetotal: Unknown size) to /home/gaguilar/.cache/huggingface/datasets/wikipedia/20200501.ar/1.0.0/7be7f4324255faf70687be8692de57cf79197afdc33ff08d6a04ed602df32d50...
```
Note that those languages are indeed in the list of expected languages. Any suggestions on how to work around this? Thanks!
Hi ! This looks related to this issue: https://github.com/huggingface/datasets/issues/1994
Basically the parser that is used (mwparserfromhell) has some issues for some pages in `es`.
We already reported some issues for `es` on their repo at https://github.com/earwig/mwparserfromhell/issues/247, but it looks like there are still a few issues. It might be a good idea to open a new issue on the mwparserfromhell repo. | [
0.1454113126,
-0.3527688086,
-0.1282090545,
0.4522043467,
0.1266885698,
0.2579192221,
0.0820912421,
0.1208299249,
0.6645289063,
-0.2317671478,
0.0490373746,
0.0963199139,
0.1007605046,
-0.1433961987,
0.1095914021,
-0.1608641297,
0.0873806104,
-0.0652696639,
-0.0717892945,
-0.2942489982,
-0.3200693429,
0.3018417954,
-0.3130694032,
0.0875175744,
-0.2290616333,
0.2680262625,
0.0768611953,
-0.0958317071,
0.1281858087,
-0.3754604459,
0.1456997991,
0.1750693619,
0.3053929806,
0.1023047119,
-0.0001208771,
-0.0671265349,
0.6932803392,
-0.1155171692,
-0.4495870769,
-0.1662026048,
-0.0600183941,
-0.3945031464,
0.2241930664,
-0.0844037682,
0.0897280946,
-0.2101302445,
0.2128774226,
-0.4753665924,
0.2147086114,
0.0394960716,
0.183995977,
-0.0195653252,
0.0651313812,
0.1096661687,
0.2768371701,
0.1622899026,
0.1529469937,
0.0777906179,
0.0675138533,
-0.1927975118,
-0.0473748967,
0.1472342461,
-0.075901784,
-0.16036731,
0.1972973496,
-0.0518289916,
-0.0422015302,
-0.5822982192,
0.2540655732,
0.171851486,
0.6346392632,
0.0478737429,
-0.3067271411,
-0.2932398319,
-0.0394681506,
0.1125656366,
0.2497185022,
0.3170331419,
-0.1362359226,
0.0208641477,
-0.0583339594,
-0.278362453,
-0.0473204777,
0.55742383,
-0.0775972009,
0.4045907855,
0.0481628589,
0.1386423111,
0.1338664591,
-0.2032583356,
-0.3406943083,
-0.2172073722,
0.3812316656,
0.3883212507,
-0.1246938705,
0.4484201968,
-0.004627414,
0.3969625235,
0.0827115327,
-0.1514821947,
-0.3183686137,
0.2936991751,
-0.1225307956,
0.1603397727,
0.2655623555,
-0.1362958103,
0.3388786018,
-0.2905007303,
0.3430871964,
0.1240721643,
-0.180046618,
0.0506780222,
-0.2567568123,
-0.1705680937,
-0.23667638,
-0.1381655931,
-0.0410399064,
-0.3294738531,
0.1968506575,
0.2522234619,
-0.1545378566,
-0.3296502829,
-0.1120169759,
0.3492592275,
-0.0588777475,
0.2394420952,
0.1553756148,
0.3203842342,
-0.4523282647,
-0.1098878458,
0.0471827276,
-0.1493368745,
-0.2431428432,
0.1791197807,
0.2842285633,
0.0920018852,
0.3260173798,
-0.0165744759,
0.0259937849,
-0.2473273575,
-0.1506368816,
-0.1562273651,
-0.0563949868,
0.0323725939,
0.1693222225,
0.5196841955,
0.1437485814,
-0.2705616355,
-0.1778962165,
0.0902567953,
-0.1203156263,
0.3405405879,
0.0907865912,
0.0816246718,
-0.2965572476,
-0.0704908967,
-0.3920732141,
0.2990906835,
0.0731614381,
-0.1684184372,
-0.0265510157,
-0.3151745796,
-0.2208945155,
-0.037468385,
0.3430620432,
0.6845704317,
-0.2111521661,
-0.1792088449,
-0.0998366475,
0.0061340146,
0.2455129027,
0.2722978294,
-0.0244781133,
0.2045750022,
-0.1845207959,
0.1233400404,
0.210726589,
0.0210102499,
0.1327969134,
-0.0070710257,
0.2727641463,
0.368707329,
-0.0597573854,
-0.1002426222,
0.0652203113,
0.0331742205,
0.0405871011,
0.4009352922,
0.2103780806,
0.0254880786,
-0.274068296,
-0.0816093236,
0.5256816149,
0.3201317787,
0.0770763457,
-0.2085173279,
-0.0055629909,
0.2584120333,
0.1945171803,
-0.1087041795,
0.1237083301,
0.4082092941,
-0.351331085,
0.3550538719,
0.2252846062,
-0.0785183087,
-0.5690240264,
0.1859128475,
0.0625605509,
0.1939191818,
-0.0308890082,
0.1396826208,
-0.2773542404,
-0.0942794457,
-0.2347110212,
0.197390154,
0.0538189858,
0.2056228966,
-0.0723794177,
0.5458546281,
0.0506740287,
-0.1096694842,
-0.0347334966,
-0.0858459324,
-0.5625826716,
0.3580296338,
0.0369768292,
-0.0827884078,
0.0142246708,
0.2626747191,
0.267306,
-0.1898435801,
-0.0969690084,
-0.2472344041,
0.4029218853,
0.2461488545,
0.1654467732,
-0.0673916116,
0.2773105502,
-0.4852218926,
0.1915935427,
0.2021673322,
0.1328283995,
-0.1937076151,
-0.0900983363,
0.2346539348,
-0.1538482904,
0.5538973808,
0.0743624419,
0.0472387746,
0.3715313077,
0.0450374633,
-0.0550124347,
-0.1765599847,
0.5834849477,
0.0442727469,
0.113614127,
0.2516684234,
-0.1351591349,
-0.1290348172,
0.4401015341,
0.0481501222,
-0.0139510296,
0.3489094675,
-0.2245785296,
-0.0870224684,
-0.0800374448,
-0.2888081074,
-0.1773390323,
0.1297318935,
0.0773879141,
-0.2715655863,
0.0271353461,
-0.1570094973,
0.0504054651,
0.1857003272,
0.1723350734,
0.1115914285,
0.1352158636,
-0.1144850254,
-0.330660373,
0.0014808998,
-0.0210892782,
0.1773557365,
-0.2405600548,
0.1249642,
-0.4194023609,
-0.273673445,
-0.3203495443,
-0.1592350304,
-0.597266078,
-0.3544650376,
-0.1591811478,
0.0280092191,
0.0243868381,
0.0893844143,
-0.0452943742,
0.017703101,
-0.2964749634,
-0.0144439423,
-0.4688630402,
-0.4974045455,
-0.4261676073,
-0.0808672756,
0.6184805632,
0.1411380023,
0.1409072727,
-0.0959097892,
-0.2744811177,
-0.1694308221,
-0.3183215857,
-0.1569318771,
-0.3113231957,
0.2710371315,
-0.280746758,
0.3934482336,
-0.1340006888,
-0.0908793285,
0.4661806822,
-0.0452606454,
0.0351129398,
0.1680601984,
0.2521027327,
0.0955850556,
0.0170568526,
-0.4298453629,
-0.2396095544,
-0.2278457284,
0.0001123901,
0.1121791601,
-0.0594213679,
-0.0182179324,
-0.1800002605,
0.0478012823,
-0.0729809999,
0.059004873,
-0.1639424711,
0.3648957014,
0.4119185507,
-0.0110468678,
-0.3786582649,
0.0539663956,
0.147371918,
0.1111443192,
-0.0056711957,
-0.5760471225,
0.0858578831,
-0.1520607322,
-0.0183307827,
0.1165748239,
-0.0038576145,
-0.0666898191,
-0.1485967189,
0.1208327413,
0.022958748,
-0.3062511981,
-0.2113111615,
-0.1515654773,
0.1288843304,
0.0644818544,
0.4482480288,
-0.3527322114,
0.6243929267,
0.2692994773,
0.1014879942,
0.287414372,
0.2123980075,
-0.0006007822,
-0.1808136553,
-0.2832008898,
0.0454874896,
-0.102077052,
-0.0436692014,
0.4483748078,
0.0932997763,
-0.4209011197,
-0.1743417531,
0.0814677477,
-0.3827197552,
-0.3302598596,
0.1829968244,
-0.0985629112,
0.1848843545,
-0.1032558382,
0.0673331469,
-0.0198077038,
-0.3953724802,
-0.0184429958,
0.2780786455,
0.377998054,
0.1212864816,
-0.4214285612,
-0.4577126801,
-0.2371248752,
0.1816521287,
0.0365825593,
0.5437669754,
-0.3557376564,
-0.0216220878,
0.2409394383,
0.0927203596,
0.3838035762,
-0.1757915467,
-0.0598698743,
0.3640908003,
-0.0865459368,
-0.579693675,
0.0129829869,
-0.126201421,
0.0152769126,
0.4794661403,
0.2567055225,
-0.4715680182,
-0.3081723452,
-0.1061526686,
0.5398080349,
-0.2224414647,
-0.0887349695,
-0.4245487154,
-0.1331249774,
-0.4907981753,
-0.1154403687,
-0.1945580095,
0.1101857573,
-0.1219688132,
0.0694767907,
0.0580265447,
0.2612292171,
0.1675080657,
0.1626873612,
0.1256118417,
0.236199528,
-0.0506516024,
0.2241119146,
0.159833461,
-0.0596054643,
0.3734051883,
0.0134114437,
-0.3509369791,
-0.0528818145,
-0.1363409609,
0.3402990103,
0.052517876,
0.1857913882,
-0.0086663291,
0.2165903598,
-0.0618351325,
-0.1990318298,
0.0680580586,
0.2766778469,
0.0226412192,
-0.3399434388,
-0.3216896355,
0.6216529012,
-0.0022296831,
-0.046379298,
0.1910455972,
0.1186401919,
-0.4334768653,
0.4841288328,
-0.0268211849,
0.9151915312,
-0.3280842304,
0.1606158763,
-0.1644025743,
0.0693102106,
0.5103272796,
0.0728497505,
-0.1079435349,
-0.281508863,
0.3742110729,
0.0037615299,
0.0704971477,
0.0337761901,
0.1312718093,
-0.17262353,
0.249428913,
-0.0444497764,
-0.0069418452,
0.0265836343,
0.527349472,
0.2180224359,
-0.3456303775,
-0.3993363678,
0.0495019034,
-0.226079762,
0.3657529056,
-0.0629483834,
0.1194541603,
-0.0033381954,
-0.0138053894,
-0.3177853823,
0.1582015008,
-0.0728239417,
0.1525751203,
-0.0260324161,
-0.2204855829,
0.4314936399,
0.357804656,
0.0291127283,
0.1886188388,
-0.2568200827,
0.1398095787,
-0.294570595,
-0.2012759447,
-0.0186440144,
0.0743671358,
0.036181584,
0.0337959528,
-0.3190801144,
-0.0103382394,
0.0058312118,
0.0496032387,
-0.3906801343,
0.0969274193,
0.0281795524,
0.0357886888,
-0.30123353,
-0.0508097894,
-0.037819095,
-0.0096782744,
0.0933458433,
0.0314239375,
0.1742431074,
0.1721737385,
0.4657343328,
-0.2537181675,
-0.0828505903,
0.2836851478,
-0.0905739963,
-0.1029884815,
0.7523554564,
0.4931761026,
-0.3861751556,
-0.2160489112,
-0.2555178702,
-0.3863667846,
-0.5297489762,
-0.2308186293,
0.2378481925,
0.3199223578,
-0.2337436378,
-0.048727382,
0.0475785322,
-0.476485312,
-0.0238330662,
-0.6159592271,
0.0629193485,
0.3354688585,
0.016578516,
0.0593026131,
0.0062833652,
-0.352532804,
0.1037406176,
0.1478775144,
-0.1578248441,
-0.0258047823,
-0.1869843602,
0.1717970669,
0.3937838078,
-0.0427255109,
-0.1469961107,
-0.107294485,
0.0563527234,
-0.2338118702,
0.0880921483,
-0.1381166577,
0.045675166,
0.1547646672,
0.1228262931,
-0.1809512824,
-0.2163014859,
-0.2853407264,
0.4532651901,
-0.4488104582,
-0.100592345,
-0.1114417836,
-0.0505845211,
0.2476321757,
0.078530848,
0.1711520702,
-0.0566083789,
0.1552757472,
-0.0537958369,
0.014577996,
0.3479131162,
0.030118078,
0.4769910276,
0.128231883,
-0.3364943564,
0.0329932161,
0.0736093372,
0.1639913321,
0.2181761414,
-0.142911762,
0.2568368316,
0.4702875018,
-0.0622325353,
0.5298508406,
0.0320642218,
-0.0739814714,
0.3156302869,
0.0666026175,
-0.1692602932,
0.0446890518,
0.7771179676,
0.1248705685,
-0.1889657974,
0.1635432839,
0.149710983,
-0.2596125603,
0.1042394191,
-0.0925645754,
0.3295483887,
-0.0656357408,
0.1110401973,
-0.0727935955,
0.2612843215,
0.1104627252,
0.0010077581,
0.0843653083,
-0.0359287374,
0.0308818109,
0.070720911,
0.0810005069,
-0.1426285356,
0.2048093826,
-0.1930040717,
-0.359395951,
-0.0145361079,
0.3893149495,
0.1058752537,
0.1884558946,
-0.0210110843,
0.3243284822,
0.1207646281,
-0.2275567055,
-0.2272688001,
0.3058837056,
0.0331073664,
-0.3350082934,
-0.1665450633,
-0.0003768206,
-0.1447979659,
0.0586105064,
-0.0320914462,
-0.2932662368,
0.0943525136,
0.438400507,
-0.1160319895,
-0.5470774174,
0.2070731372,
-0.2495486736,
0.1478571594,
-0.0930465311,
0.2113570124,
0.0408198871,
-0.0305346027,
-0.0404399075,
0.0705893412,
0.2195031047,
0.2346307486,
-0.3896166086,
0.0523685776,
-0.1126590967,
0.2038097978,
-0.1255105585,
0.5091901422,
0.0827483386,
0.4134191573,
0.2292459905,
0.0984182954,
-0.0372114554,
-0.3240754604,
0.2217306793,
-0.0853826478,
-0.0885720551,
0.0381480455,
-0.1624500602,
0.1032265425,
-0.3157380819,
-0.1482565701,
-0.2848926187,
0.2957911491,
0.2464161515,
0.1293192953,
0.040048521,
0.0059816539,
0.0081702806,
-0.0770005733,
0.6391991973,
0.6014127135,
0.2962318957,
-0.2728564143,
-0.2316266745,
-0.6162998676,
-0.0720525831,
-0.2881962657,
-0.074129425,
0.2476620972,
0.1127099097,
0.1472311914,
0.0724601969,
0.0799300671,
-0.1244611889,
-0.1621922255,
0.2225467861,
-0.3230171204,
-0.1215389147,
-0.1404102892,
0.0720487982,
-0.3405056596,
-0.0398607925,
0.2299978733,
-0.2287095338,
-0.1217886731,
-0.0672964826,
0.0563754812,
-0.1051209047,
0.1474518478,
0.2411970347,
0.0859881788,
0.5995777845,
0.001179941,
0.0553468093,
-0.2797991037,
-0.2594977617,
0.1524226367,
0.4133251905,
-0.2980041504,
0.230302617,
-0.2063947618,
0.077889055,
-0.2899841666,
0.3602119684,
-0.0527695529,
0.0901127458,
-0.0611299872,
-0.1782128811,
-0.0139322132,
-0.1953758001,
0.0780582875,
0.2493197322,
-0.0704434961,
0.1113267019,
-0.2750208378,
-0.1183608919,
0.2797541618,
-0.3547445238,
-0.4361908138,
0.0273806751,
0.1413471997,
0.167613104,
0.0812710524,
-0.4344431162,
-0.0146422684,
0.3558732569,
-0.0144728767,
0.0073267892,
0.1548234969,
0.0178532712,
-0.2843690515,
-0.1955551207,
-0.1332449615,
0.1044573635,
-0.0845369622,
0.0304156318,
-0.1153531447
] |
https://github.com/huggingface/datasets/issues/575 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading. | Update:
The imdb download completed after a long time (about 45 mins). Of course, once downloaded, loading was instantaneous. Also, the loaded object was of type `arrow_dataset`.
The URLs for glue still don't work though. | Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, except that mrpc in the URL was replaced with cola or sst2.
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
| 34 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading.
Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, except that mrpc in the URL was replaced with cola or sst2.
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
Update:
The imdb download completed after a long time (about 45 mins). Of course, once downloaded, loading was instantaneous. Also, the loaded object was of type `arrow_dataset`.
The URLs for glue still don't work though. | [
-0.1126361117,
-0.0148183554,
-0.051447019,
0.2499043196,
0.2150653601,
-0.1080485284,
-0.0023044972,
0.1171094924,
0.144177109,
-0.1348839998,
-0.4256687164,
-0.0689157769,
0.1447226703,
0.1289690882,
0.2742245495,
-0.0121177407,
-0.1593030542,
-0.0744766146,
-0.0916164368,
0.0177008733,
-0.2575626075,
0.3606528938,
-0.2036086917,
0.1764501333,
-0.1922913343,
-0.0935443714,
-0.1606315374,
0.1398070753,
-0.1229681,
-0.3245213628,
0.0947915614,
0.3048552871,
0.0481999665,
0.3857812881,
-0.0001219456,
-0.0564450845,
0.5723122358,
-0.0555413514,
-0.3910083175,
-0.4353300929,
-0.2779271305,
-0.0589924008,
0.3559851348,
-0.0667514801,
0.1307542026,
0.1634608656,
0.0878986865,
-0.2419316173,
0.2720349729,
0.2631810606,
0.1377919018,
0.0733034015,
0.135902822,
0.0714743882,
0.3174402416,
-0.2632397711,
0.0046694875,
0.3708162308,
0.3326880336,
-0.2914192677,
0.0472084209,
0.0577221401,
-0.2056468874,
0.0583761223,
0.1841482967,
0.1635259986,
-0.2578425407,
-0.5462061167,
0.0292285308,
0.3213653862,
0.2748624086,
0.0140512735,
-0.4840654135,
-0.3165414333,
-0.0286721066,
0.117361702,
0.2789926827,
0.1334595382,
-0.2538383007,
0.0309729986,
-0.5450455546,
-0.1321875602,
0.0033701584,
0.5532462001,
0.110110417,
-0.0098114721,
0.0639422238,
0.1689393818,
0.3128623068,
-0.039525412,
0.1148342639,
-0.0232362803,
0.2994756401,
0.1872626394,
-0.0450580791,
-0.0265388787,
0.0512052402,
0.423658222,
0.1454074085,
0.1372273564,
0.1992810965,
0.0405514576,
-0.1392972916,
0.1321616471,
0.154836446,
-0.0140558481,
-0.0127520002,
0.0798757747,
0.6293035746,
0.3603570759,
-0.1680347174,
0.0711007714,
-0.1136870608,
-0.0785522014,
-0.3989925683,
-0.0077087712,
-0.1199620143,
-0.2840939462,
-0.1622229815,
-0.0771392286,
-0.1368773729,
0.015072003,
0.0852004364,
0.3928339183,
-0.3108936548,
0.2066409141,
-0.087643221,
0.3793321848,
-0.1213533953,
0.03193371,
-0.093083322,
0.1856426001,
-0.1889153719,
0.2517305613,
0.4859229922,
0.0784360468,
0.3209572732,
-0.3183522522,
-0.1498349011,
-0.2291524708,
0.0177800283,
-0.3440530598,
0.0556666888,
0.0532135479,
0.3014577925,
0.3389766812,
-0.0610681996,
-0.1870908141,
-0.1006211489,
-0.210953325,
-0.2240745276,
-0.1662693322,
0.3674654663,
0.1121519357,
-0.4041665196,
-0.1559496522,
-0.1344942153,
-0.0048293001,
-0.0961962938,
-0.09488298,
-0.1139310896,
-0.0595752336,
-0.1584252119,
-0.2009519637,
0.3062125444,
0.6965600848,
-0.1322717071,
-0.2766488791,
-0.1275887787,
0.0412126295,
-0.0782897323,
0.5020376444,
-0.0070994124,
0.1006360203,
-0.4256359339,
0.2437321395,
0.5672341585,
-0.2894795835,
-0.3902561367,
0.5320960879,
-0.0154578015,
0.0364814922,
0.1849974543,
0.0225607976,
0.1312482357,
-0.0723037198,
0.4729861319,
0.5410935283,
-0.0449327826,
-0.097043097,
-0.2779183388,
-0.1605926156,
0.1289731562,
0.1257044673,
0.2235037386,
0.0212897565,
-0.0032731071,
-0.07970649,
0.2332623005,
0.2292819172,
0.0552492663,
0.2484000623,
-0.1013189852,
0.1788351536,
-0.1047973037,
-0.138165012,
-0.3142167926,
0.2160551846,
-0.0629846156,
0.1218498349,
-0.3327452242,
0.2367784679,
-0.1883799136,
-0.1930933297,
0.0038399529,
-0.0187658556,
0.0700897947,
0.2044758946,
0.1313028187,
0.0753204897,
-0.252273351,
0.5734855533,
0.0272470787,
0.17247729,
-0.3160077929,
0.6796264648,
-0.1542851031,
-0.1917633861,
0.1090582535,
0.3169533014,
0.2635818422,
-0.2635530233,
0.0134694465,
0.302254647,
-0.1289991438,
0.1917851418,
0.2702764273,
0.0180152878,
0.3637426794,
-0.2720224559,
0.112630181,
0.0491892844,
0.1340865195,
-0.1252142191,
0.1797027588,
0.2376715988,
-0.2186419964,
0.6319780946,
0.4755755067,
-0.0091404244,
0.1447476745,
-0.1831095517,
-0.2134259641,
-0.2173276842,
0.6007688642,
0.2402254045,
0.2317632884,
-0.1197274774,
-0.1488307267,
0.1870428473,
0.3490747213,
0.0156642795,
0.0556957573,
0.2178223431,
-0.1637913585,
-0.1998849511,
-0.1010186076,
0.3804473877,
0.2209592462,
0.1598925442,
0.1135151312,
0.0212801415,
-0.2521316707,
-0.3306075037,
0.2030387223,
0.1683369875,
-0.0889686048,
0.0819642395,
0.2019194812,
0.1494821161,
-0.0947705805,
-0.4382835031,
0.0053284764,
-0.0233715661,
-0.3714634776,
-0.0647936165,
-0.4457282126,
-0.6604163051,
0.0467653386,
-0.1721000075,
-0.2792550027,
-0.6464509964,
-0.1037923098,
0.2034946233,
0.0090117678,
0.0356712416,
-0.2063029557,
-0.0062602162,
-0.2111669779,
-0.0129493354,
-0.1801104844,
-0.3260892034,
-0.0629034564,
0.0395990461,
0.3534009755,
0.0574422888,
0.1485337168,
-0.2441106141,
-0.1611089408,
-0.1612909138,
-0.1101249978,
0.1928355992,
-0.1821730584,
0.3608712554,
0.210953027,
0.4903082252,
0.2029088438,
-0.2494545579,
0.3121626675,
-0.1433432698,
0.0774956197,
-0.0519494228,
0.00776859,
-0.0471087135,
0.1043554097,
-0.009999685,
-0.4713087976,
-0.4650454223,
0.2359145433,
-0.0598597676,
0.2176339924,
0.1203419566,
-0.0410525948,
0.2929947376,
-0.0266636126,
0.1280007064,
-0.0411785506,
-0.335452199,
0.5650700927,
-0.2066883445,
-0.200068891,
-0.0150569454,
0.0398595184,
-0.2016784847,
0.0807285681,
-0.706532836,
-0.2792834044,
-0.4299278259,
0.0273612291,
0.2272491008,
-0.0665317029,
-0.0501207896,
-0.2217651308,
-0.0760566369,
-0.0140880942,
-0.4060665965,
-0.002945669,
0.3619812429,
0.34752509,
0.1065039933,
0.5122762918,
-0.0753892884,
0.2725412846,
-0.0739140138,
0.074788481,
0.6189817786,
0.0821447149,
0.3144125044,
-0.1913373172,
-0.2690410614,
0.1089695394,
-0.3142777085,
0.0521756932,
0.135019213,
0.2109739184,
-0.315036118,
-0.5289623737,
0.1727685779,
-0.4613903463,
-0.1889806092,
0.0283964258,
-0.0316219777,
-0.2145556957,
0.2021682113,
0.0356191248,
-0.0524336472,
-0.2252570987,
-0.0457175002,
0.1494724751,
0.2521129847,
0.0396240354,
-0.1105960086,
-0.1379540563,
-0.4256442189,
0.2749572992,
0.1958888471,
0.3104602396,
-0.2696571052,
-0.0499456562,
0.0475214384,
-0.0248349123,
0.5041224957,
-0.413585037,
0.2516334951,
0.1580863446,
0.0192974806,
-0.6474224925,
0.0887409523,
-0.1247082651,
0.140357852,
0.6630887389,
0.2826696932,
-0.1967945695,
0.1195999384,
-0.0139654195,
0.1940147579,
0.1074986011,
0.074658528,
-0.0067793205,
-0.2686882615,
-0.258589834,
-0.2295415699,
-0.0385957509,
0.0707758367,
-0.0640728846,
-0.3239623904,
0.313452512,
-0.4003783166,
-0.0395803563,
0.0364647582,
0.1761944294,
-0.1073655561,
0.0827874094,
0.2307302058,
0.1615782678,
-0.1797794253,
0.4915142059,
-0.3460823894,
0.0986249447,
0.5306717753,
-0.0260188431,
0.1837768108,
0.6057763696,
0.1359436363,
0.1512894332,
0.0835238099,
0.1909215301,
0.2960534692,
0.3395085335,
0.2108196318,
-0.0150642507,
-0.2821374536,
-0.3250032663,
0.3341209888,
-0.0021029636,
-0.0231160447,
0.203830868,
0.0800836086,
-0.2553892732,
0.2971126139,
-0.3381646872,
1.1482989788,
0.0942293257,
0.1294937432,
-0.1617402732,
-0.2525260448,
0.664984405,
-0.1067687273,
0.1237173378,
-0.0469338596,
-0.1145090163,
-0.1157740727,
-0.1703817248,
-0.2114433497,
0.0383758545,
-0.2149141729,
0.5331194997,
0.0755661726,
-0.011315844,
0.1618145555,
0.4351915419,
-0.1023056209,
-0.0456139371,
-0.5257045627,
0.0877487957,
-0.0442998372,
0.251855433,
-0.0140865408,
-0.1485860944,
-0.2059696317,
0.0053139403,
-0.460152626,
0.195518598,
-0.6179571152,
0.2456761748,
-0.0609877743,
-0.3103727996,
-0.1943773627,
-0.0575053133,
0.1287862659,
-0.0164992549,
-0.3511246145,
0.0874211118,
0.2061313093,
-0.1411282271,
0.1736899614,
0.044138521,
0.164144069,
-0.1359500438,
-0.0259153843,
-0.0478294007,
0.0252407752,
-0.327275902,
0.0321034938,
-0.1285141706,
0.0396088324,
0.0648923367,
-0.1876661777,
-0.138992995,
-0.1790035963,
-0.2322731316,
0.0341175534,
-0.0430923775,
0.1417648941,
-0.0858425647,
0.3707483709,
-0.2654457688,
-0.1623957753,
0.514253974,
-0.3053885996,
0.0012645051,
0.339337945,
0.3020928502,
-0.3583813012,
-0.0925877392,
-0.1114679947,
-0.182228446,
-0.3933392763,
-0.0003346093,
-0.2642870843,
0.0666862428,
-0.0891283527,
0.0259226672,
0.1505149603,
0.1860915571,
-0.1181162968,
-0.6274638176,
-0.1790272892,
0.4015604258,
-0.0270391516,
0.0485014804,
-0.2964763045,
0.1652343869,
0.4207265377,
-0.038622357,
-0.2305992842,
0.1560417414,
-0.4415417314,
0.0104380976,
0.3247362375,
-0.3346687555,
0.0485882014,
-0.0424531065,
0.0563485101,
0.0833297819,
-0.1790659428,
-0.0608193502,
-0.1101867408,
0.1974409223,
0.0511797369,
-0.3207356334,
-0.0964924246,
-0.0979489833,
0.2066011727,
-0.0818445981,
-0.0759879872,
-0.0132813677,
0.0211342238,
-0.2951014936,
0.1266243756,
-0.0999268442,
-0.1372245848,
0.3212112784,
-0.0065162778,
0.2825661004,
0.31738621,
0.1064765155,
0.0883072764,
-0.0717145056,
-0.0931004435,
-0.1241107881,
-0.0829005092,
0.1949428916,
0.4736435711,
-0.0916404426,
0.0512026548,
0.056989029,
0.3979812264,
0.5058016181,
0.0556126162,
-0.1075793058,
0.1179757714,
0.0908085853,
0.0191554651,
0.2518128753,
0.2627198696,
0.1216955855,
0.0234896839,
0.0976518095,
0.0498040468,
-0.072479248,
0.0851857364,
0.0754873529,
0.658939302,
0.2015366554,
-0.0219140425,
-0.1459541917,
0.0476531833,
-0.052735962,
0.0410785601,
-0.0996242315,
0.0687502399,
-0.1266630888,
0.0534233898,
-0.0639401898,
-0.1313263625,
0.3249319494,
0.2482801676,
-0.2003667057,
-0.0873380154,
0.2849597335,
0.0983348638,
0.1369966418,
-0.2204144299,
0.3333669603,
-0.1611332744,
0.0659119487,
-0.1380517781,
0.4353381991,
0.1926470995,
-0.3216385543,
0.0582482442,
-0.3349764943,
-0.4865925312,
0.0540139154,
-0.1999975294,
-0.1715221256,
0.1560162902,
0.2826381624,
0.0135861393,
-0.7253263593,
-0.1404954493,
-0.191246748,
0.2329851389,
-0.3082933724,
0.4661636353,
-0.1038244218,
0.0777672082,
-0.0195024386,
0.1526184827,
0.3781438768,
0.3050532639,
-0.4279477298,
0.0964921266,
0.2019782364,
-0.0506127886,
-0.0096545927,
0.2302980721,
0.4857645333,
-0.1341397464,
0.49281317,
0.0891052559,
-0.1842340231,
0.268684566,
-0.0354274102,
0.1412736028,
-0.075104326,
0.2763310373,
-0.0875300691,
-0.0513292328,
-0.2288813144,
-0.1396887898,
-0.5556664467,
-0.1096579731,
-0.0335210338,
-0.597343564,
0.0323540643,
-0.0905735493,
0.0411371812,
0.0053060539,
0.6298922896,
0.569093585,
0.5683694482,
-0.3809540272,
-0.3364818692,
-0.3252818882,
0.0403119996,
-0.1251098514,
-0.0124701113,
-0.1701543033,
0.2908909619,
-0.1745342016,
0.2047265768,
-0.0137992352,
0.674814105,
-0.1523850262,
0.174304381,
-0.2713821828,
-0.0484786294,
-0.0313814692,
0.0573393479,
-0.1030780375,
-0.0273056477,
0.0516433083,
-0.5177119374,
-0.0516983978,
0.275902003,
-0.2825131118,
0.1339704096,
0.2366518974,
0.3119562268,
0.1032112837,
0.762863338,
-0.1227935404,
-0.1470052153,
-0.4071493149,
-0.6882560849,
-0.024067834,
-0.1192929298,
0.0928618908,
0.1546038091,
-0.0933203995,
0.202130571,
-0.1746086925,
0.190074563,
-0.0879269838,
0.4516708553,
0.1424891651,
-0.3678413332,
0.0650075525,
-0.0495071746,
0.0439471081,
-0.0063380934,
-0.2057439983,
0.1007040143,
-0.1159255207,
-0.3511935472,
0.5208381414,
-0.0540382303,
-0.0175135434,
-0.1145697683,
0.2055764198,
0.0688287169,
0.212995559,
-0.3332383037,
-0.2800841928,
0.3444099128,
0.0583391972,
-0.43530792,
0.2324450016,
-0.0068449657,
-0.0778454095,
-0.0766063109,
-0.0418283679,
0.2216901779,
-0.2184318751,
-0.192987904,
-0.067809172
] |
https://github.com/huggingface/datasets/issues/575 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading. | I am also seeing a similar error when running the following:
```
import nlp
dataset = nlp.load_dataset('cola')
```
Error:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/load.py", line 509, in load_dataset
module_path = prepare_module(path, download_config=download_config, dataset=True)
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/load.py", line 248, in prepare_module
local_path = cached_path(file_path, download_config=download_config)
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 191, in cached_path
output_path = get_from_cache(
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 356, in get_from_cache
raise ConnectionError("Couldn't reach {}".format(url))
ConnectionError: Couldn't reach https://s3.amazonaws.com/datasets.huggingface.co/nlp/datasets/cola/cola.py
``` | Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, just instead of mrpc in the URL, it was replaced with cola and sst2.
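As a side note, per a maintainer reply later in this thread, `cola` and `sst2` are configurations of the `glue` loader rather than standalone dataset names, so once the URLs are reachable they would presumably be requested like this:
```
from nlp import load_dataset

# cola and sst2 are GLUE configurations, not top-level dataset identifiers
cola = load_dataset('glue', 'cola', split='train')
sst2 = load_dataset('glue', 'sst2', split='train')
```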
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
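A rough sketch of one way to check whether the blocking step is still making progress is to total up the size of the extraction folder from Python; the `<id>` part of the path is a placeholder for the actual hash directory created in the cache:
```
import os

# Placeholder path: replace <id> with the hash directory that appears under the cache.
extracted = os.path.expandvars("$HF_HOME/datasets/downloads/extracted/<id>/aclImdb")

total = 0
for root, _dirs, files in os.walk(extracted):
    for name in files:
        total += os.path.getsize(os.path.join(root, name))

print("{:.1f} MiB extracted so far".format(total / 1024 / 1024))
```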
| 76 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading.
Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, just instead of mrpc in the URL, it was replaced with cola and sst2.
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
I am also seeing a similar error when running the following:
```
import nlp
dataset = nlp.load_dataset('cola')
```
Error:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/load.py", line 509, in load_dataset
module_path = prepare_module(path, download_config=download_config, dataset=True)
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/load.py", line 248, in prepare_module
local_path = cached_path(file_path, download_config=download_config)
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 191, in cached_path
output_path = get_from_cache(
File "/home/js11133/.conda/envs/jiant/lib/python3.8/site-packages/nlp/utils/file_utils.py", line 356, in get_from_cache
raise ConnectionError("Couldn't reach {}".format(url))
ConnectionError: Couldn't reach https://s3.amazonaws.com/datasets.huggingface.co/nlp/datasets/cola/cola.py
``` | [
-0.1126361117,
-0.0148183554,
-0.051447019,
0.2499043196,
0.2150653601,
-0.1080485284,
-0.0023044972,
0.1171094924,
0.144177109,
-0.1348839998,
-0.4256687164,
-0.0689157769,
0.1447226703,
0.1289690882,
0.2742245495,
-0.0121177407,
-0.1593030542,
-0.0744766146,
-0.0916164368,
0.0177008733,
-0.2575626075,
0.3606528938,
-0.2036086917,
0.1764501333,
-0.1922913343,
-0.0935443714,
-0.1606315374,
0.1398070753,
-0.1229681,
-0.3245213628,
0.0947915614,
0.3048552871,
0.0481999665,
0.3857812881,
-0.0001219456,
-0.0564450845,
0.5723122358,
-0.0555413514,
-0.3910083175,
-0.4353300929,
-0.2779271305,
-0.0589924008,
0.3559851348,
-0.0667514801,
0.1307542026,
0.1634608656,
0.0878986865,
-0.2419316173,
0.2720349729,
0.2631810606,
0.1377919018,
0.0733034015,
0.135902822,
0.0714743882,
0.3174402416,
-0.2632397711,
0.0046694875,
0.3708162308,
0.3326880336,
-0.2914192677,
0.0472084209,
0.0577221401,
-0.2056468874,
0.0583761223,
0.1841482967,
0.1635259986,
-0.2578425407,
-0.5462061167,
0.0292285308,
0.3213653862,
0.2748624086,
0.0140512735,
-0.4840654135,
-0.3165414333,
-0.0286721066,
0.117361702,
0.2789926827,
0.1334595382,
-0.2538383007,
0.0309729986,
-0.5450455546,
-0.1321875602,
0.0033701584,
0.5532462001,
0.110110417,
-0.0098114721,
0.0639422238,
0.1689393818,
0.3128623068,
-0.039525412,
0.1148342639,
-0.0232362803,
0.2994756401,
0.1872626394,
-0.0450580791,
-0.0265388787,
0.0512052402,
0.423658222,
0.1454074085,
0.1372273564,
0.1992810965,
0.0405514576,
-0.1392972916,
0.1321616471,
0.154836446,
-0.0140558481,
-0.0127520002,
0.0798757747,
0.6293035746,
0.3603570759,
-0.1680347174,
0.0711007714,
-0.1136870608,
-0.0785522014,
-0.3989925683,
-0.0077087712,
-0.1199620143,
-0.2840939462,
-0.1622229815,
-0.0771392286,
-0.1368773729,
0.015072003,
0.0852004364,
0.3928339183,
-0.3108936548,
0.2066409141,
-0.087643221,
0.3793321848,
-0.1213533953,
0.03193371,
-0.093083322,
0.1856426001,
-0.1889153719,
0.2517305613,
0.4859229922,
0.0784360468,
0.3209572732,
-0.3183522522,
-0.1498349011,
-0.2291524708,
0.0177800283,
-0.3440530598,
0.0556666888,
0.0532135479,
0.3014577925,
0.3389766812,
-0.0610681996,
-0.1870908141,
-0.1006211489,
-0.210953325,
-0.2240745276,
-0.1662693322,
0.3674654663,
0.1121519357,
-0.4041665196,
-0.1559496522,
-0.1344942153,
-0.0048293001,
-0.0961962938,
-0.09488298,
-0.1139310896,
-0.0595752336,
-0.1584252119,
-0.2009519637,
0.3062125444,
0.6965600848,
-0.1322717071,
-0.2766488791,
-0.1275887787,
0.0412126295,
-0.0782897323,
0.5020376444,
-0.0070994124,
0.1006360203,
-0.4256359339,
0.2437321395,
0.5672341585,
-0.2894795835,
-0.3902561367,
0.5320960879,
-0.0154578015,
0.0364814922,
0.1849974543,
0.0225607976,
0.1312482357,
-0.0723037198,
0.4729861319,
0.5410935283,
-0.0449327826,
-0.097043097,
-0.2779183388,
-0.1605926156,
0.1289731562,
0.1257044673,
0.2235037386,
0.0212897565,
-0.0032731071,
-0.07970649,
0.2332623005,
0.2292819172,
0.0552492663,
0.2484000623,
-0.1013189852,
0.1788351536,
-0.1047973037,
-0.138165012,
-0.3142167926,
0.2160551846,
-0.0629846156,
0.1218498349,
-0.3327452242,
0.2367784679,
-0.1883799136,
-0.1930933297,
0.0038399529,
-0.0187658556,
0.0700897947,
0.2044758946,
0.1313028187,
0.0753204897,
-0.252273351,
0.5734855533,
0.0272470787,
0.17247729,
-0.3160077929,
0.6796264648,
-0.1542851031,
-0.1917633861,
0.1090582535,
0.3169533014,
0.2635818422,
-0.2635530233,
0.0134694465,
0.302254647,
-0.1289991438,
0.1917851418,
0.2702764273,
0.0180152878,
0.3637426794,
-0.2720224559,
0.112630181,
0.0491892844,
0.1340865195,
-0.1252142191,
0.1797027588,
0.2376715988,
-0.2186419964,
0.6319780946,
0.4755755067,
-0.0091404244,
0.1447476745,
-0.1831095517,
-0.2134259641,
-0.2173276842,
0.6007688642,
0.2402254045,
0.2317632884,
-0.1197274774,
-0.1488307267,
0.1870428473,
0.3490747213,
0.0156642795,
0.0556957573,
0.2178223431,
-0.1637913585,
-0.1998849511,
-0.1010186076,
0.3804473877,
0.2209592462,
0.1598925442,
0.1135151312,
0.0212801415,
-0.2521316707,
-0.3306075037,
0.2030387223,
0.1683369875,
-0.0889686048,
0.0819642395,
0.2019194812,
0.1494821161,
-0.0947705805,
-0.4382835031,
0.0053284764,
-0.0233715661,
-0.3714634776,
-0.0647936165,
-0.4457282126,
-0.6604163051,
0.0467653386,
-0.1721000075,
-0.2792550027,
-0.6464509964,
-0.1037923098,
0.2034946233,
0.0090117678,
0.0356712416,
-0.2063029557,
-0.0062602162,
-0.2111669779,
-0.0129493354,
-0.1801104844,
-0.3260892034,
-0.0629034564,
0.0395990461,
0.3534009755,
0.0574422888,
0.1485337168,
-0.2441106141,
-0.1611089408,
-0.1612909138,
-0.1101249978,
0.1928355992,
-0.1821730584,
0.3608712554,
0.210953027,
0.4903082252,
0.2029088438,
-0.2494545579,
0.3121626675,
-0.1433432698,
0.0774956197,
-0.0519494228,
0.00776859,
-0.0471087135,
0.1043554097,
-0.009999685,
-0.4713087976,
-0.4650454223,
0.2359145433,
-0.0598597676,
0.2176339924,
0.1203419566,
-0.0410525948,
0.2929947376,
-0.0266636126,
0.1280007064,
-0.0411785506,
-0.335452199,
0.5650700927,
-0.2066883445,
-0.200068891,
-0.0150569454,
0.0398595184,
-0.2016784847,
0.0807285681,
-0.706532836,
-0.2792834044,
-0.4299278259,
0.0273612291,
0.2272491008,
-0.0665317029,
-0.0501207896,
-0.2217651308,
-0.0760566369,
-0.0140880942,
-0.4060665965,
-0.002945669,
0.3619812429,
0.34752509,
0.1065039933,
0.5122762918,
-0.0753892884,
0.2725412846,
-0.0739140138,
0.074788481,
0.6189817786,
0.0821447149,
0.3144125044,
-0.1913373172,
-0.2690410614,
0.1089695394,
-0.3142777085,
0.0521756932,
0.135019213,
0.2109739184,
-0.315036118,
-0.5289623737,
0.1727685779,
-0.4613903463,
-0.1889806092,
0.0283964258,
-0.0316219777,
-0.2145556957,
0.2021682113,
0.0356191248,
-0.0524336472,
-0.2252570987,
-0.0457175002,
0.1494724751,
0.2521129847,
0.0396240354,
-0.1105960086,
-0.1379540563,
-0.4256442189,
0.2749572992,
0.1958888471,
0.3104602396,
-0.2696571052,
-0.0499456562,
0.0475214384,
-0.0248349123,
0.5041224957,
-0.413585037,
0.2516334951,
0.1580863446,
0.0192974806,
-0.6474224925,
0.0887409523,
-0.1247082651,
0.140357852,
0.6630887389,
0.2826696932,
-0.1967945695,
0.1195999384,
-0.0139654195,
0.1940147579,
0.1074986011,
0.074658528,
-0.0067793205,
-0.2686882615,
-0.258589834,
-0.2295415699,
-0.0385957509,
0.0707758367,
-0.0640728846,
-0.3239623904,
0.313452512,
-0.4003783166,
-0.0395803563,
0.0364647582,
0.1761944294,
-0.1073655561,
0.0827874094,
0.2307302058,
0.1615782678,
-0.1797794253,
0.4915142059,
-0.3460823894,
0.0986249447,
0.5306717753,
-0.0260188431,
0.1837768108,
0.6057763696,
0.1359436363,
0.1512894332,
0.0835238099,
0.1909215301,
0.2960534692,
0.3395085335,
0.2108196318,
-0.0150642507,
-0.2821374536,
-0.3250032663,
0.3341209888,
-0.0021029636,
-0.0231160447,
0.203830868,
0.0800836086,
-0.2553892732,
0.2971126139,
-0.3381646872,
1.1482989788,
0.0942293257,
0.1294937432,
-0.1617402732,
-0.2525260448,
0.664984405,
-0.1067687273,
0.1237173378,
-0.0469338596,
-0.1145090163,
-0.1157740727,
-0.1703817248,
-0.2114433497,
0.0383758545,
-0.2149141729,
0.5331194997,
0.0755661726,
-0.011315844,
0.1618145555,
0.4351915419,
-0.1023056209,
-0.0456139371,
-0.5257045627,
0.0877487957,
-0.0442998372,
0.251855433,
-0.0140865408,
-0.1485860944,
-0.2059696317,
0.0053139403,
-0.460152626,
0.195518598,
-0.6179571152,
0.2456761748,
-0.0609877743,
-0.3103727996,
-0.1943773627,
-0.0575053133,
0.1287862659,
-0.0164992549,
-0.3511246145,
0.0874211118,
0.2061313093,
-0.1411282271,
0.1736899614,
0.044138521,
0.164144069,
-0.1359500438,
-0.0259153843,
-0.0478294007,
0.0252407752,
-0.327275902,
0.0321034938,
-0.1285141706,
0.0396088324,
0.0648923367,
-0.1876661777,
-0.138992995,
-0.1790035963,
-0.2322731316,
0.0341175534,
-0.0430923775,
0.1417648941,
-0.0858425647,
0.3707483709,
-0.2654457688,
-0.1623957753,
0.514253974,
-0.3053885996,
0.0012645051,
0.339337945,
0.3020928502,
-0.3583813012,
-0.0925877392,
-0.1114679947,
-0.182228446,
-0.3933392763,
-0.0003346093,
-0.2642870843,
0.0666862428,
-0.0891283527,
0.0259226672,
0.1505149603,
0.1860915571,
-0.1181162968,
-0.6274638176,
-0.1790272892,
0.4015604258,
-0.0270391516,
0.0485014804,
-0.2964763045,
0.1652343869,
0.4207265377,
-0.038622357,
-0.2305992842,
0.1560417414,
-0.4415417314,
0.0104380976,
0.3247362375,
-0.3346687555,
0.0485882014,
-0.0424531065,
0.0563485101,
0.0833297819,
-0.1790659428,
-0.0608193502,
-0.1101867408,
0.1974409223,
0.0511797369,
-0.3207356334,
-0.0964924246,
-0.0979489833,
0.2066011727,
-0.0818445981,
-0.0759879872,
-0.0132813677,
0.0211342238,
-0.2951014936,
0.1266243756,
-0.0999268442,
-0.1372245848,
0.3212112784,
-0.0065162778,
0.2825661004,
0.31738621,
0.1064765155,
0.0883072764,
-0.0717145056,
-0.0931004435,
-0.1241107881,
-0.0829005092,
0.1949428916,
0.4736435711,
-0.0916404426,
0.0512026548,
0.056989029,
0.3979812264,
0.5058016181,
0.0556126162,
-0.1075793058,
0.1179757714,
0.0908085853,
0.0191554651,
0.2518128753,
0.2627198696,
0.1216955855,
0.0234896839,
0.0976518095,
0.0498040468,
-0.072479248,
0.0851857364,
0.0754873529,
0.658939302,
0.2015366554,
-0.0219140425,
-0.1459541917,
0.0476531833,
-0.052735962,
0.0410785601,
-0.0996242315,
0.0687502399,
-0.1266630888,
0.0534233898,
-0.0639401898,
-0.1313263625,
0.3249319494,
0.2482801676,
-0.2003667057,
-0.0873380154,
0.2849597335,
0.0983348638,
0.1369966418,
-0.2204144299,
0.3333669603,
-0.1611332744,
0.0659119487,
-0.1380517781,
0.4353381991,
0.1926470995,
-0.3216385543,
0.0582482442,
-0.3349764943,
-0.4865925312,
0.0540139154,
-0.1999975294,
-0.1715221256,
0.1560162902,
0.2826381624,
0.0135861393,
-0.7253263593,
-0.1404954493,
-0.191246748,
0.2329851389,
-0.3082933724,
0.4661636353,
-0.1038244218,
0.0777672082,
-0.0195024386,
0.1526184827,
0.3781438768,
0.3050532639,
-0.4279477298,
0.0964921266,
0.2019782364,
-0.0506127886,
-0.0096545927,
0.2302980721,
0.4857645333,
-0.1341397464,
0.49281317,
0.0891052559,
-0.1842340231,
0.268684566,
-0.0354274102,
0.1412736028,
-0.075104326,
0.2763310373,
-0.0875300691,
-0.0513292328,
-0.2288813144,
-0.1396887898,
-0.5556664467,
-0.1096579731,
-0.0335210338,
-0.597343564,
0.0323540643,
-0.0905735493,
0.0411371812,
0.0053060539,
0.6298922896,
0.569093585,
0.5683694482,
-0.3809540272,
-0.3364818692,
-0.3252818882,
0.0403119996,
-0.1251098514,
-0.0124701113,
-0.1701543033,
0.2908909619,
-0.1745342016,
0.2047265768,
-0.0137992352,
0.674814105,
-0.1523850262,
0.174304381,
-0.2713821828,
-0.0484786294,
-0.0313814692,
0.0573393479,
-0.1030780375,
-0.0273056477,
0.0516433083,
-0.5177119374,
-0.0516983978,
0.275902003,
-0.2825131118,
0.1339704096,
0.2366518974,
0.3119562268,
0.1032112837,
0.762863338,
-0.1227935404,
-0.1470052153,
-0.4071493149,
-0.6882560849,
-0.024067834,
-0.1192929298,
0.0928618908,
0.1546038091,
-0.0933203995,
0.202130571,
-0.1746086925,
0.190074563,
-0.0879269838,
0.4516708553,
0.1424891651,
-0.3678413332,
0.0650075525,
-0.0495071746,
0.0439471081,
-0.0063380934,
-0.2057439983,
0.1007040143,
-0.1159255207,
-0.3511935472,
0.5208381414,
-0.0540382303,
-0.0175135434,
-0.1145697683,
0.2055764198,
0.0688287169,
0.212995559,
-0.3332383037,
-0.2800841928,
0.3444099128,
0.0583391972,
-0.43530792,
0.2324450016,
-0.0068449657,
-0.0778454095,
-0.0766063109,
-0.0418283679,
0.2216901779,
-0.2184318751,
-0.192987904,
-0.067809172
] |
https://github.com/huggingface/datasets/issues/575 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading. | @jeswan `"cola"` is not a valid dataset identifier (you can check the up-to-date list on https://huggingface.co/datasets) but you can find cola inside glue. | Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, just instead of mrpc in the URL, it was replaced with cola and sst2.
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
| 23 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading.
Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, just instead of mrpc in the URL, it was replaced with cola and sst2.
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
@jeswan `"cola"` is not a valid dataset identifier (you can check the up-to-date list on https://huggingface.co/datasets) but you can find cola inside glue. | [
-0.1126361117,
-0.0148183554,
-0.051447019,
0.2499043196,
0.2150653601,
-0.1080485284,
-0.0023044972,
0.1171094924,
0.144177109,
-0.1348839998,
-0.4256687164,
-0.0689157769,
0.1447226703,
0.1289690882,
0.2742245495,
-0.0121177407,
-0.1593030542,
-0.0744766146,
-0.0916164368,
0.0177008733,
-0.2575626075,
0.3606528938,
-0.2036086917,
0.1764501333,
-0.1922913343,
-0.0935443714,
-0.1606315374,
0.1398070753,
-0.1229681,
-0.3245213628,
0.0947915614,
0.3048552871,
0.0481999665,
0.3857812881,
-0.0001219456,
-0.0564450845,
0.5723122358,
-0.0555413514,
-0.3910083175,
-0.4353300929,
-0.2779271305,
-0.0589924008,
0.3559851348,
-0.0667514801,
0.1307542026,
0.1634608656,
0.0878986865,
-0.2419316173,
0.2720349729,
0.2631810606,
0.1377919018,
0.0733034015,
0.135902822,
0.0714743882,
0.3174402416,
-0.2632397711,
0.0046694875,
0.3708162308,
0.3326880336,
-0.2914192677,
0.0472084209,
0.0577221401,
-0.2056468874,
0.0583761223,
0.1841482967,
0.1635259986,
-0.2578425407,
-0.5462061167,
0.0292285308,
0.3213653862,
0.2748624086,
0.0140512735,
-0.4840654135,
-0.3165414333,
-0.0286721066,
0.117361702,
0.2789926827,
0.1334595382,
-0.2538383007,
0.0309729986,
-0.5450455546,
-0.1321875602,
0.0033701584,
0.5532462001,
0.110110417,
-0.0098114721,
0.0639422238,
0.1689393818,
0.3128623068,
-0.039525412,
0.1148342639,
-0.0232362803,
0.2994756401,
0.1872626394,
-0.0450580791,
-0.0265388787,
0.0512052402,
0.423658222,
0.1454074085,
0.1372273564,
0.1992810965,
0.0405514576,
-0.1392972916,
0.1321616471,
0.154836446,
-0.0140558481,
-0.0127520002,
0.0798757747,
0.6293035746,
0.3603570759,
-0.1680347174,
0.0711007714,
-0.1136870608,
-0.0785522014,
-0.3989925683,
-0.0077087712,
-0.1199620143,
-0.2840939462,
-0.1622229815,
-0.0771392286,
-0.1368773729,
0.015072003,
0.0852004364,
0.3928339183,
-0.3108936548,
0.2066409141,
-0.087643221,
0.3793321848,
-0.1213533953,
0.03193371,
-0.093083322,
0.1856426001,
-0.1889153719,
0.2517305613,
0.4859229922,
0.0784360468,
0.3209572732,
-0.3183522522,
-0.1498349011,
-0.2291524708,
0.0177800283,
-0.3440530598,
0.0556666888,
0.0532135479,
0.3014577925,
0.3389766812,
-0.0610681996,
-0.1870908141,
-0.1006211489,
-0.210953325,
-0.2240745276,
-0.1662693322,
0.3674654663,
0.1121519357,
-0.4041665196,
-0.1559496522,
-0.1344942153,
-0.0048293001,
-0.0961962938,
-0.09488298,
-0.1139310896,
-0.0595752336,
-0.1584252119,
-0.2009519637,
0.3062125444,
0.6965600848,
-0.1322717071,
-0.2766488791,
-0.1275887787,
0.0412126295,
-0.0782897323,
0.5020376444,
-0.0070994124,
0.1006360203,
-0.4256359339,
0.2437321395,
0.5672341585,
-0.2894795835,
-0.3902561367,
0.5320960879,
-0.0154578015,
0.0364814922,
0.1849974543,
0.0225607976,
0.1312482357,
-0.0723037198,
0.4729861319,
0.5410935283,
-0.0449327826,
-0.097043097,
-0.2779183388,
-0.1605926156,
0.1289731562,
0.1257044673,
0.2235037386,
0.0212897565,
-0.0032731071,
-0.07970649,
0.2332623005,
0.2292819172,
0.0552492663,
0.2484000623,
-0.1013189852,
0.1788351536,
-0.1047973037,
-0.138165012,
-0.3142167926,
0.2160551846,
-0.0629846156,
0.1218498349,
-0.3327452242,
0.2367784679,
-0.1883799136,
-0.1930933297,
0.0038399529,
-0.0187658556,
0.0700897947,
0.2044758946,
0.1313028187,
0.0753204897,
-0.252273351,
0.5734855533,
0.0272470787,
0.17247729,
-0.3160077929,
0.6796264648,
-0.1542851031,
-0.1917633861,
0.1090582535,
0.3169533014,
0.2635818422,
-0.2635530233,
0.0134694465,
0.302254647,
-0.1289991438,
0.1917851418,
0.2702764273,
0.0180152878,
0.3637426794,
-0.2720224559,
0.112630181,
0.0491892844,
0.1340865195,
-0.1252142191,
0.1797027588,
0.2376715988,
-0.2186419964,
0.6319780946,
0.4755755067,
-0.0091404244,
0.1447476745,
-0.1831095517,
-0.2134259641,
-0.2173276842,
0.6007688642,
0.2402254045,
0.2317632884,
-0.1197274774,
-0.1488307267,
0.1870428473,
0.3490747213,
0.0156642795,
0.0556957573,
0.2178223431,
-0.1637913585,
-0.1998849511,
-0.1010186076,
0.3804473877,
0.2209592462,
0.1598925442,
0.1135151312,
0.0212801415,
-0.2521316707,
-0.3306075037,
0.2030387223,
0.1683369875,
-0.0889686048,
0.0819642395,
0.2019194812,
0.1494821161,
-0.0947705805,
-0.4382835031,
0.0053284764,
-0.0233715661,
-0.3714634776,
-0.0647936165,
-0.4457282126,
-0.6604163051,
0.0467653386,
-0.1721000075,
-0.2792550027,
-0.6464509964,
-0.1037923098,
0.2034946233,
0.0090117678,
0.0356712416,
-0.2063029557,
-0.0062602162,
-0.2111669779,
-0.0129493354,
-0.1801104844,
-0.3260892034,
-0.0629034564,
0.0395990461,
0.3534009755,
0.0574422888,
0.1485337168,
-0.2441106141,
-0.1611089408,
-0.1612909138,
-0.1101249978,
0.1928355992,
-0.1821730584,
0.3608712554,
0.210953027,
0.4903082252,
0.2029088438,
-0.2494545579,
0.3121626675,
-0.1433432698,
0.0774956197,
-0.0519494228,
0.00776859,
-0.0471087135,
0.1043554097,
-0.009999685,
-0.4713087976,
-0.4650454223,
0.2359145433,
-0.0598597676,
0.2176339924,
0.1203419566,
-0.0410525948,
0.2929947376,
-0.0266636126,
0.1280007064,
-0.0411785506,
-0.335452199,
0.5650700927,
-0.2066883445,
-0.200068891,
-0.0150569454,
0.0398595184,
-0.2016784847,
0.0807285681,
-0.706532836,
-0.2792834044,
-0.4299278259,
0.0273612291,
0.2272491008,
-0.0665317029,
-0.0501207896,
-0.2217651308,
-0.0760566369,
-0.0140880942,
-0.4060665965,
-0.002945669,
0.3619812429,
0.34752509,
0.1065039933,
0.5122762918,
-0.0753892884,
0.2725412846,
-0.0739140138,
0.074788481,
0.6189817786,
0.0821447149,
0.3144125044,
-0.1913373172,
-0.2690410614,
0.1089695394,
-0.3142777085,
0.0521756932,
0.135019213,
0.2109739184,
-0.315036118,
-0.5289623737,
0.1727685779,
-0.4613903463,
-0.1889806092,
0.0283964258,
-0.0316219777,
-0.2145556957,
0.2021682113,
0.0356191248,
-0.0524336472,
-0.2252570987,
-0.0457175002,
0.1494724751,
0.2521129847,
0.0396240354,
-0.1105960086,
-0.1379540563,
-0.4256442189,
0.2749572992,
0.1958888471,
0.3104602396,
-0.2696571052,
-0.0499456562,
0.0475214384,
-0.0248349123,
0.5041224957,
-0.413585037,
0.2516334951,
0.1580863446,
0.0192974806,
-0.6474224925,
0.0887409523,
-0.1247082651,
0.140357852,
0.6630887389,
0.2826696932,
-0.1967945695,
0.1195999384,
-0.0139654195,
0.1940147579,
0.1074986011,
0.074658528,
-0.0067793205,
-0.2686882615,
-0.258589834,
-0.2295415699,
-0.0385957509,
0.0707758367,
-0.0640728846,
-0.3239623904,
0.313452512,
-0.4003783166,
-0.0395803563,
0.0364647582,
0.1761944294,
-0.1073655561,
0.0827874094,
0.2307302058,
0.1615782678,
-0.1797794253,
0.4915142059,
-0.3460823894,
0.0986249447,
0.5306717753,
-0.0260188431,
0.1837768108,
0.6057763696,
0.1359436363,
0.1512894332,
0.0835238099,
0.1909215301,
0.2960534692,
0.3395085335,
0.2108196318,
-0.0150642507,
-0.2821374536,
-0.3250032663,
0.3341209888,
-0.0021029636,
-0.0231160447,
0.203830868,
0.0800836086,
-0.2553892732,
0.2971126139,
-0.3381646872,
1.1482989788,
0.0942293257,
0.1294937432,
-0.1617402732,
-0.2525260448,
0.664984405,
-0.1067687273,
0.1237173378,
-0.0469338596,
-0.1145090163,
-0.1157740727,
-0.1703817248,
-0.2114433497,
0.0383758545,
-0.2149141729,
0.5331194997,
0.0755661726,
-0.011315844,
0.1618145555,
0.4351915419,
-0.1023056209,
-0.0456139371,
-0.5257045627,
0.0877487957,
-0.0442998372,
0.251855433,
-0.0140865408,
-0.1485860944,
-0.2059696317,
0.0053139403,
-0.460152626,
0.195518598,
-0.6179571152,
0.2456761748,
-0.0609877743,
-0.3103727996,
-0.1943773627,
-0.0575053133,
0.1287862659,
-0.0164992549,
-0.3511246145,
0.0874211118,
0.2061313093,
-0.1411282271,
0.1736899614,
0.044138521,
0.164144069,
-0.1359500438,
-0.0259153843,
-0.0478294007,
0.0252407752,
-0.327275902,
0.0321034938,
-0.1285141706,
0.0396088324,
0.0648923367,
-0.1876661777,
-0.138992995,
-0.1790035963,
-0.2322731316,
0.0341175534,
-0.0430923775,
0.1417648941,
-0.0858425647,
0.3707483709,
-0.2654457688,
-0.1623957753,
0.514253974,
-0.3053885996,
0.0012645051,
0.339337945,
0.3020928502,
-0.3583813012,
-0.0925877392,
-0.1114679947,
-0.182228446,
-0.3933392763,
-0.0003346093,
-0.2642870843,
0.0666862428,
-0.0891283527,
0.0259226672,
0.1505149603,
0.1860915571,
-0.1181162968,
-0.6274638176,
-0.1790272892,
0.4015604258,
-0.0270391516,
0.0485014804,
-0.2964763045,
0.1652343869,
0.4207265377,
-0.038622357,
-0.2305992842,
0.1560417414,
-0.4415417314,
0.0104380976,
0.3247362375,
-0.3346687555,
0.0485882014,
-0.0424531065,
0.0563485101,
0.0833297819,
-0.1790659428,
-0.0608193502,
-0.1101867408,
0.1974409223,
0.0511797369,
-0.3207356334,
-0.0964924246,
-0.0979489833,
0.2066011727,
-0.0818445981,
-0.0759879872,
-0.0132813677,
0.0211342238,
-0.2951014936,
0.1266243756,
-0.0999268442,
-0.1372245848,
0.3212112784,
-0.0065162778,
0.2825661004,
0.31738621,
0.1064765155,
0.0883072764,
-0.0717145056,
-0.0931004435,
-0.1241107881,
-0.0829005092,
0.1949428916,
0.4736435711,
-0.0916404426,
0.0512026548,
0.056989029,
0.3979812264,
0.5058016181,
0.0556126162,
-0.1075793058,
0.1179757714,
0.0908085853,
0.0191554651,
0.2518128753,
0.2627198696,
0.1216955855,
0.0234896839,
0.0976518095,
0.0498040468,
-0.072479248,
0.0851857364,
0.0754873529,
0.658939302,
0.2015366554,
-0.0219140425,
-0.1459541917,
0.0476531833,
-0.052735962,
0.0410785601,
-0.0996242315,
0.0687502399,
-0.1266630888,
0.0534233898,
-0.0639401898,
-0.1313263625,
0.3249319494,
0.2482801676,
-0.2003667057,
-0.0873380154,
0.2849597335,
0.0983348638,
0.1369966418,
-0.2204144299,
0.3333669603,
-0.1611332744,
0.0659119487,
-0.1380517781,
0.4353381991,
0.1926470995,
-0.3216385543,
0.0582482442,
-0.3349764943,
-0.4865925312,
0.0540139154,
-0.1999975294,
-0.1715221256,
0.1560162902,
0.2826381624,
0.0135861393,
-0.7253263593,
-0.1404954493,
-0.191246748,
0.2329851389,
-0.3082933724,
0.4661636353,
-0.1038244218,
0.0777672082,
-0.0195024386,
0.1526184827,
0.3781438768,
0.3050532639,
-0.4279477298,
0.0964921266,
0.2019782364,
-0.0506127886,
-0.0096545927,
0.2302980721,
0.4857645333,
-0.1341397464,
0.49281317,
0.0891052559,
-0.1842340231,
0.268684566,
-0.0354274102,
0.1412736028,
-0.075104326,
0.2763310373,
-0.0875300691,
-0.0513292328,
-0.2288813144,
-0.1396887898,
-0.5556664467,
-0.1096579731,
-0.0335210338,
-0.597343564,
0.0323540643,
-0.0905735493,
0.0411371812,
0.0053060539,
0.6298922896,
0.569093585,
0.5683694482,
-0.3809540272,
-0.3364818692,
-0.3252818882,
0.0403119996,
-0.1251098514,
-0.0124701113,
-0.1701543033,
0.2908909619,
-0.1745342016,
0.2047265768,
-0.0137992352,
0.674814105,
-0.1523850262,
0.174304381,
-0.2713821828,
-0.0484786294,
-0.0313814692,
0.0573393479,
-0.1030780375,
-0.0273056477,
0.0516433083,
-0.5177119374,
-0.0516983978,
0.275902003,
-0.2825131118,
0.1339704096,
0.2366518974,
0.3119562268,
0.1032112837,
0.762863338,
-0.1227935404,
-0.1470052153,
-0.4071493149,
-0.6882560849,
-0.024067834,
-0.1192929298,
0.0928618908,
0.1546038091,
-0.0933203995,
0.202130571,
-0.1746086925,
0.190074563,
-0.0879269838,
0.4516708553,
0.1424891651,
-0.3678413332,
0.0650075525,
-0.0495071746,
0.0439471081,
-0.0063380934,
-0.2057439983,
0.1007040143,
-0.1159255207,
-0.3511935472,
0.5208381414,
-0.0540382303,
-0.0175135434,
-0.1145697683,
0.2055764198,
0.0688287169,
0.212995559,
-0.3332383037,
-0.2800841928,
0.3444099128,
0.0583391972,
-0.43530792,
0.2324450016,
-0.0068449657,
-0.0778454095,
-0.0766063109,
-0.0418283679,
0.2216901779,
-0.2184318751,
-0.192987904,
-0.067809172
] |
https://github.com/huggingface/datasets/issues/575 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading. | Hi. Closing this one since #626 updated the glue urls.
> 1. Why is it still blocking? Is it still downloading?
After downloading it generates the arrow file by iterating through the examples.
The number of examples processed per second is shown during processing (not sure why it was not the case for you).
> 2. I specified split as train, so why is the test folder being populated?
It downloads every split
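A minimal sketch of that difference, assuming the `nlp` API shown earlier in the thread: `split` only selects what is returned, while every split is downloaded and prepared either way.
```
from nlp import load_dataset

# Both calls download and prepare every split of imdb; `split` only changes the return value.
all_splits = load_dataset('imdb')                 # mapping of split name -> Dataset
train_only = load_dataset('imdb', split='train')  # a single Dataset for the train split

print(list(all_splits.keys()))
print(len(train_only))
```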
| Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, just instead of mrpc in the URL, it was replaced with cola and sst2.
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
| 74 | Couldn't reach certain URLs and for the ones that can be reached, code just blocks after downloading.
Hi,
I'm following the [quick tour](https://huggingface.co/nlp/quicktour.html) and tried to load the glue dataset:
```
>>> from nlp import load_dataset
>>> dataset = load_dataset('glue', 'mrpc', split='train')
```
However, this ran into a `ConnectionError` saying it could not reach the URL (just pasting the last few lines):
```
/net/vaosl01/opt/NFS/su0/miniconda3/envs/hf/lib/python3.7/site-packages/nlp/utils/file_utils.py in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only)
354 " to False."
355 )
--> 356 raise ConnectionError("Couldn't reach {}".format(url))
357
358 # From now on, connected is True.
ConnectionError: Couldn't reach https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc
```
I tried glue with cola and sst2. I got the same error, just instead of mrpc in the URL, it was replaced with cola and sst2.
Since this was not working, I thought I'd try another dataset. So I tried downloading the imdb dataset:
```
ds = load_dataset('imdb', split='train')
```
This downloads the data, but it just blocks after that:
```
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.56k/4.56k [00:00<00:00, 1.38MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.07k/2.07k [00:00<00:00, 1.15MB/s]
Downloading and preparing dataset imdb/plain_text (download: 80.23 MiB, generated: 127.06 MiB, post-processed: Unknown sizetotal: 207.28 MiB) to /net/vaosl01/opt/NFS/su0/huggingface/datasets/imdb/plain_text/1.0.0/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743...
Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 84.1M/84.1M [00:07<00:00, 11.1MB/s]
```
I checked the folder `$HF_HOME/datasets/downloads/extracted/<id>/aclImdb`. This folder is constantly growing in size. When I navigated to the train folder within, there was no file. However, the test folder seemed to be populating. The last time I checked it was 327M. I thought the Imdb dataset was smaller than that. My questions are:
1. Why is it still blocking? Is it still downloading?
2. I specified split as train, so why is the test folder being populated?
3. I read somewhere that after downloading, `nlp` converts the text files into some sort of `arrow` files, which will also take a while. Is this also happening here?
Thanks.
Hi. Closing this one since #626 updated the glue urls.
> 1. Why is it still blocking? Is it still downloading?
After downloading it generates the arrow file by iterating through the examples.
The number of examples processed per second is shown during processing (not sure why it was not the case for you).
> 2. I specified split as train, so why is the test folder being populated?
It downloads every split
| [
-0.1126361117,
-0.0148183554,
-0.051447019,
0.2499043196,
0.2150653601,
-0.1080485284,
-0.0023044972,
0.1171094924,
0.144177109,
-0.1348839998,
-0.4256687164,
-0.0689157769,
0.1447226703,
0.1289690882,
0.2742245495,
-0.0121177407,
-0.1593030542,
-0.0744766146,
-0.0916164368,
0.0177008733,
-0.2575626075,
0.3606528938,
-0.2036086917,
0.1764501333,
-0.1922913343,
-0.0935443714,
-0.1606315374,
0.1398070753,
-0.1229681,
-0.3245213628,
0.0947915614,
0.3048552871,
0.0481999665,
0.3857812881,
-0.0001219456,
-0.0564450845,
0.5723122358,
-0.0555413514,
-0.3910083175,
-0.4353300929,
-0.2779271305,
-0.0589924008,
0.3559851348,
-0.0667514801,
0.1307542026,
0.1634608656,
0.0878986865,
-0.2419316173,
0.2720349729,
0.2631810606,
0.1377919018,
0.0733034015,
0.135902822,
0.0714743882,
0.3174402416,
-0.2632397711,
0.0046694875,
0.3708162308,
0.3326880336,
-0.2914192677,
0.0472084209,
0.0577221401,
-0.2056468874,
0.0583761223,
0.1841482967,
0.1635259986,
-0.2578425407,
-0.5462061167,
0.0292285308,
0.3213653862,
0.2748624086,
0.0140512735,
-0.4840654135,
-0.3165414333,
-0.0286721066,
0.117361702,
0.2789926827,
0.1334595382,
-0.2538383007,
0.0309729986,
-0.5450455546,
-0.1321875602,
0.0033701584,
0.5532462001,
0.110110417,
-0.0098114721,
0.0639422238,
0.1689393818,
0.3128623068,
-0.039525412,
0.1148342639,
-0.0232362803,
0.2994756401,
0.1872626394,
-0.0450580791,
-0.0265388787,
0.0512052402,
0.423658222,
0.1454074085,
0.1372273564,
0.1992810965,
0.0405514576,
-0.1392972916,
0.1321616471,
0.154836446,
-0.0140558481,
-0.0127520002,
0.0798757747,
0.6293035746,
0.3603570759,
-0.1680347174,
0.0711007714,
-0.1136870608,
-0.0785522014,
-0.3989925683,
-0.0077087712,
-0.1199620143,
-0.2840939462,
-0.1622229815,
-0.0771392286,
-0.1368773729,
0.015072003,
0.0852004364,
0.3928339183,
-0.3108936548,
0.2066409141,
-0.087643221,
0.3793321848,
-0.1213533953,
0.03193371,
-0.093083322,
0.1856426001,
-0.1889153719,
0.2517305613,
0.4859229922,
0.0784360468,
0.3209572732,
-0.3183522522,
-0.1498349011,
-0.2291524708,
0.0177800283,
-0.3440530598,
0.0556666888,
0.0532135479,
0.3014577925,
0.3389766812,
-0.0610681996,
-0.1870908141,
-0.1006211489,
-0.210953325,
-0.2240745276,
-0.1662693322,
0.3674654663,
0.1121519357,
-0.4041665196,
-0.1559496522,
-0.1344942153,
-0.0048293001,
-0.0961962938,
-0.09488298,
-0.1139310896,
-0.0595752336,
-0.1584252119,
-0.2009519637,
0.3062125444,
0.6965600848,
-0.1322717071,
-0.2766488791,
-0.1275887787,
0.0412126295,
-0.0782897323,
0.5020376444,
-0.0070994124,
0.1006360203,
-0.4256359339,
0.2437321395,
0.5672341585,
-0.2894795835,
-0.3902561367,
0.5320960879,
-0.0154578015,
0.0364814922,
0.1849974543,
0.0225607976,
0.1312482357,
-0.0723037198,
0.4729861319,
0.5410935283,
-0.0449327826,
-0.097043097,
-0.2779183388,
-0.1605926156,
0.1289731562,
0.1257044673,
0.2235037386,
0.0212897565,
-0.0032731071,
-0.07970649,
0.2332623005,
0.2292819172,
0.0552492663,
0.2484000623,
-0.1013189852,
0.1788351536,
-0.1047973037,
-0.138165012,
-0.3142167926,
0.2160551846,
-0.0629846156,
0.1218498349,
-0.3327452242,
0.2367784679,
-0.1883799136,
-0.1930933297,
0.0038399529,
-0.0187658556,
0.0700897947,
0.2044758946,
0.1313028187,
0.0753204897,
-0.252273351,
0.5734855533,
0.0272470787,
0.17247729,
-0.3160077929,
0.6796264648,
-0.1542851031,
-0.1917633861,
0.1090582535,
0.3169533014,
0.2635818422,
-0.2635530233,
0.0134694465,
0.302254647,
-0.1289991438,
0.1917851418,
0.2702764273,
0.0180152878,
0.3637426794,
-0.2720224559,
0.112630181,
0.0491892844,
0.1340865195,
-0.1252142191,
0.1797027588,
0.2376715988,
-0.2186419964,
0.6319780946,
0.4755755067,
-0.0091404244,
0.1447476745,
-0.1831095517,
-0.2134259641,
-0.2173276842,
0.6007688642,
0.2402254045,
0.2317632884,
-0.1197274774,
-0.1488307267,
0.1870428473,
0.3490747213,
0.0156642795,
0.0556957573,
0.2178223431,
-0.1637913585,
-0.1998849511,
-0.1010186076,
0.3804473877,
0.2209592462,
0.1598925442,
0.1135151312,
0.0212801415,
-0.2521316707,
-0.3306075037,
0.2030387223,
0.1683369875,
-0.0889686048,
0.0819642395,
0.2019194812,
0.1494821161,
-0.0947705805,
-0.4382835031,
0.0053284764,
-0.0233715661,
-0.3714634776,
-0.0647936165,
-0.4457282126,
-0.6604163051,
0.0467653386,
-0.1721000075,
-0.2792550027,
-0.6464509964,
-0.1037923098,
0.2034946233,
0.0090117678,
0.0356712416,
-0.2063029557,
-0.0062602162,
-0.2111669779,
-0.0129493354,
-0.1801104844,
-0.3260892034,
-0.0629034564,
0.0395990461,
0.3534009755,
0.0574422888,
0.1485337168,
-0.2441106141,
-0.1611089408,
-0.1612909138,
-0.1101249978,
0.1928355992,
-0.1821730584,
0.3608712554,
0.210953027,
0.4903082252,
0.2029088438,
-0.2494545579,
0.3121626675,
-0.1433432698,
0.0774956197,
-0.0519494228,
0.00776859,
-0.0471087135,
0.1043554097,
-0.009999685,
-0.4713087976,
-0.4650454223,
0.2359145433,
-0.0598597676,
0.2176339924,
0.1203419566,
-0.0410525948,
0.2929947376,
-0.0266636126,
0.1280007064,
-0.0411785506,
-0.335452199,
0.5650700927,
-0.2066883445,
-0.200068891,
-0.0150569454,
0.0398595184,
-0.2016784847,
0.0807285681,
-0.706532836,
-0.2792834044,
-0.4299278259,
0.0273612291,
0.2272491008,
-0.0665317029,
-0.0501207896,
-0.2217651308,
-0.0760566369,
-0.0140880942,
-0.4060665965,
-0.002945669,
0.3619812429,
0.34752509,
0.1065039933,
0.5122762918,
-0.0753892884,
0.2725412846,
-0.0739140138,
0.074788481,
0.6189817786,
0.0821447149,
0.3144125044,
-0.1913373172,
-0.2690410614,
0.1089695394,
-0.3142777085,
0.0521756932,
0.135019213,
0.2109739184,
-0.315036118,
-0.5289623737,
0.1727685779,
-0.4613903463,
-0.1889806092,
0.0283964258,
-0.0316219777,
-0.2145556957,
0.2021682113,
0.0356191248,
-0.0524336472,
-0.2252570987,
-0.0457175002,
0.1494724751,
0.2521129847,
0.0396240354,
-0.1105960086,
-0.1379540563,
-0.4256442189,
0.2749572992,
0.1958888471,
0.3104602396,
-0.2696571052,
-0.0499456562,
0.0475214384,
-0.0248349123,
0.5041224957,
-0.413585037,
0.2516334951,
0.1580863446,
0.0192974806,
-0.6474224925,
0.0887409523,
-0.1247082651,
0.140357852,
0.6630887389,
0.2826696932,
-0.1967945695,
0.1195999384,
-0.0139654195,
0.1940147579,
0.1074986011,
0.074658528,
-0.0067793205,
-0.2686882615,
-0.258589834,
-0.2295415699,
-0.0385957509,
0.0707758367,
-0.0640728846,
-0.3239623904,
0.313452512,
-0.4003783166,
-0.0395803563,
0.0364647582,
0.1761944294,
-0.1073655561,
0.0827874094,
0.2307302058,
0.1615782678,
-0.1797794253,
0.4915142059,
-0.3460823894,
0.0986249447,
0.5306717753,
-0.0260188431,
0.1837768108,
0.6057763696,
0.1359436363,
0.1512894332,
0.0835238099,
0.1909215301,
0.2960534692,
0.3395085335,
0.2108196318,
-0.0150642507,
-0.2821374536,
-0.3250032663,
0.3341209888,
-0.0021029636,
-0.0231160447,
0.203830868,
0.0800836086,
-0.2553892732,
0.2971126139,
-0.3381646872,
1.1482989788,
0.0942293257,
0.1294937432,
-0.1617402732,
-0.2525260448,
0.664984405,
-0.1067687273,
0.1237173378,
-0.0469338596,
-0.1145090163,
-0.1157740727,
-0.1703817248,
-0.2114433497,
0.0383758545,
-0.2149141729,
0.5331194997,
0.0755661726,
-0.011315844,
0.1618145555,
0.4351915419,
-0.1023056209,
-0.0456139371,
-0.5257045627,
0.0877487957,
-0.0442998372,
0.251855433,
-0.0140865408,
-0.1485860944,
-0.2059696317,
0.0053139403,
-0.460152626,
0.195518598,
-0.6179571152,
0.2456761748,
-0.0609877743,
-0.3103727996,
-0.1943773627,
-0.0575053133,
0.1287862659,
-0.0164992549,
-0.3511246145,
0.0874211118,
0.2061313093,
-0.1411282271,
0.1736899614,
0.044138521,
0.164144069,
-0.1359500438,
-0.0259153843,
-0.0478294007,
0.0252407752,
-0.327275902,
0.0321034938,
-0.1285141706,
0.0396088324,
0.0648923367,
-0.1876661777,
-0.138992995,
-0.1790035963,
-0.2322731316,
0.0341175534,
-0.0430923775,
0.1417648941,
-0.0858425647,
0.3707483709,
-0.2654457688,
-0.1623957753,
0.514253974,
-0.3053885996,
0.0012645051,
0.339337945,
0.3020928502,
-0.3583813012,
-0.0925877392,
-0.1114679947,
-0.182228446,
-0.3933392763,
-0.0003346093,
-0.2642870843,
0.0666862428,
-0.0891283527,
0.0259226672,
0.1505149603,
0.1860915571,
-0.1181162968,
-0.6274638176,
-0.1790272892,
0.4015604258,
-0.0270391516,
0.0485014804,
-0.2964763045,
0.1652343869,
0.4207265377,
-0.038622357,
-0.2305992842,
0.1560417414,
-0.4415417314,
0.0104380976,
0.3247362375,
-0.3346687555,
0.0485882014,
-0.0424531065,
0.0563485101,
0.0833297819,
-0.1790659428,
-0.0608193502,
-0.1101867408,
0.1974409223,
0.0511797369,
-0.3207356334,
-0.0964924246,
-0.0979489833,
0.2066011727,
-0.0818445981,
-0.0759879872,
-0.0132813677,
0.0211342238,
-0.2951014936,
0.1266243756,
-0.0999268442,
-0.1372245848,
0.3212112784,
-0.0065162778,
0.2825661004,
0.31738621,
0.1064765155,
0.0883072764,
-0.0717145056,
-0.0931004435,
-0.1241107881,
-0.0829005092,
0.1949428916,
0.4736435711,
-0.0916404426,
0.0512026548,
0.056989029,
0.3979812264,
0.5058016181,
0.0556126162,
-0.1075793058,
0.1179757714,
0.0908085853,
0.0191554651,
0.2518128753,
0.2627198696,
0.1216955855,
0.0234896839,
0.0976518095,
0.0498040468,
-0.072479248,
0.0851857364,
0.0754873529,
0.658939302,
0.2015366554,
-0.0219140425,
-0.1459541917,
0.0476531833,
-0.052735962,
0.0410785601,
-0.0996242315,
0.0687502399,
-0.1266630888,
0.0534233898,
-0.0639401898,
-0.1313263625,
0.3249319494,
0.2482801676,
-0.2003667057,
-0.0873380154,
0.2849597335,
0.0983348638,
0.1369966418,
-0.2204144299,
0.3333669603,
-0.1611332744,
0.0659119487,
-0.1380517781,
0.4353381991,
0.1926470995,
-0.3216385543,
0.0582482442,
-0.3349764943,
-0.4865925312,
0.0540139154,
-0.1999975294,
-0.1715221256,
0.1560162902,
0.2826381624,
0.0135861393,
-0.7253263593,
-0.1404954493,
-0.191246748,
0.2329851389,
-0.3082933724,
0.4661636353,
-0.1038244218,
0.0777672082,
-0.0195024386,
0.1526184827,
0.3781438768,
0.3050532639,
-0.4279477298,
0.0964921266,
0.2019782364,
-0.0506127886,
-0.0096545927,
0.2302980721,
0.4857645333,
-0.1341397464,
0.49281317,
0.0891052559,
-0.1842340231,
0.268684566,
-0.0354274102,
0.1412736028,
-0.075104326,
0.2763310373,
-0.0875300691,
-0.0513292328,
-0.2288813144,
-0.1396887898,
-0.5556664467,
-0.1096579731,
-0.0335210338,
-0.597343564,
0.0323540643,
-0.0905735493,
0.0411371812,
0.0053060539,
0.6298922896,
0.569093585,
0.5683694482,
-0.3809540272,
-0.3364818692,
-0.3252818882,
0.0403119996,
-0.1251098514,
-0.0124701113,
-0.1701543033,
0.2908909619,
-0.1745342016,
0.2047265768,
-0.0137992352,
0.674814105,
-0.1523850262,
0.174304381,
-0.2713821828,
-0.0484786294,
-0.0313814692,
0.0573393479,
-0.1030780375,
-0.0273056477,
0.0516433083,
-0.5177119374,
-0.0516983978,
0.275902003,
-0.2825131118,
0.1339704096,
0.2366518974,
0.3119562268,
0.1032112837,
0.762863338,
-0.1227935404,
-0.1470052153,
-0.4071493149,
-0.6882560849,
-0.024067834,
-0.1192929298,
0.0928618908,
0.1546038091,
-0.0933203995,
0.202130571,
-0.1746086925,
0.190074563,
-0.0879269838,
0.4516708553,
0.1424891651,
-0.3678413332,
0.0650075525,
-0.0495071746,
0.0439471081,
-0.0063380934,
-0.2057439983,
0.1007040143,
-0.1159255207,
-0.3511935472,
0.5208381414,
-0.0540382303,
-0.0175135434,
-0.1145697683,
0.2055764198,
0.0688287169,
0.212995559,
-0.3332383037,
-0.2800841928,
0.3444099128,
0.0583391972,
-0.43530792,
0.2324450016,
-0.0068449657,
-0.0778454095,
-0.0766063109,
-0.0418283679,
0.2216901779,
-0.2184318751,
-0.192987904,
-0.067809172
] |
https://github.com/huggingface/datasets/issues/568 | `metric.compute` throws `ArrowInvalid` error | Could you try to update to `datasets>=1.0.0` (we changed the name of the library) and try again ?
If it was related to the distributed setup settings, it should be fixed.
If it was related to empty metric inputs it's going to be fixed in #654 | I get the following error with `rouge.compute`. It happens only with distributed training, and it occurs randomly, so I can't easily reproduce it. This is using `nlp==0.4.0`
```
File "/home/beltagy/trainer.py", line 92, in validation_step
rouge_scores = rouge.compute(predictions=generated_str, references=gold_str, rouge_types=['rouge2', 'rouge1', 'rougeL'])
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 224, in compute
self.finalize(timeout=timeout)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 213, in finalize
self.data = Dataset(**reader.read_files(node_files))
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 217, in read_files
dataset_kwargs = self._read_files(files=files, info=self._info, original_instructions=original_instructions)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 162, in _read_files
pa_table: pa.Table = self._get_dataset_from_filename(f_dict)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 276, in _get_dataset_from_filename
f = pa.ipc.open_stream(mmap)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 173, in open_stream
return RecordBatchStreamReader(source)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 64, in __init__
self._open(source)
File "pyarrow/ipc.pxi", line 469, in pyarrow.lib._RecordBatchStreamReader._open
File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: Tried reading schema message, was null or length 0
``` | 46 | `metric.compute` throws `ArrowInvalid` error
I get the following error with `rouge.compute`. It happens only with distributed training, and it occurs randomly, so I can't easily reproduce it. This is using `nlp==0.4.0`
```
File "/home/beltagy/trainer.py", line 92, in validation_step
rouge_scores = rouge.compute(predictions=generated_str, references=gold_str, rouge_types=['rouge2', 'rouge1', 'rougeL'])
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 224, in compute
self.finalize(timeout=timeout)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 213, in finalize
self.data = Dataset(**reader.read_files(node_files))
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 217, in read_files
dataset_kwargs = self._read_files(files=files, info=self._info, original_instructions=original_instructions)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 162, in _read_files
pa_table: pa.Table = self._get_dataset_from_filename(f_dict)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 276, in _get_dataset_from_filename
f = pa.ipc.open_stream(mmap)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 173, in open_stream
return RecordBatchStreamReader(source)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 64, in __init__
self._open(source)
File "pyarrow/ipc.pxi", line 469, in pyarrow.lib._RecordBatchStreamReader._open
File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: Tried reading schema message, was null or length 0
```
Could you try to update to `datasets>=1.0.0` (we changed the name of the library) and try again ?
If it was related to the distributed setup settings, it should be fixed.
If it was related to empty metric inputs it's going to be fixed in #654 | [
-0.4075995088,
-0.2417834401,
0.0564271882,
0.2862924635,
0.3149878681,
-0.1689800173,
-0.1591831297,
0.2923863828,
-0.1545248926,
0.404512465,
0.052560769,
0.5000802279,
-0.110934034,
-0.3641475439,
-0.094238624,
-0.1911459416,
-0.132249862,
0.0535938777,
0.062323276,
-0.27793926,
-0.3765875399,
0.0985769331,
-0.213160187,
0.3267943859,
-0.1932070553,
-0.352106452,
0.1751868874,
0.1816707104,
-0.3589943647,
-0.5263903737,
0.270960331,
-0.2937383056,
0.0037994012,
0.1183883101,
-0.0001191874,
0.1296565831,
0.3363070786,
-0.0975791886,
-0.0041522235,
-0.1153502315,
-0.0072664618,
-0.0410662368,
0.3097949624,
-0.1023336351,
0.1306816339,
-0.1691229492,
-0.2776664793,
-0.2377287,
0.2970283926,
0.3777066171,
0.0835813582,
0.2802878022,
-0.0036600381,
-0.2270234972,
-0.1957391202,
-0.2623838186,
-0.0339899026,
0.7621634007,
0.0858869031,
-0.2124260813,
0.0804969147,
0.2712042034,
0.3043485582,
0.1928762496,
0.4343168736,
-0.2398330122,
0.2804590166,
-0.0029740194,
-0.1896128356,
0.1256078333,
-0.0713662803,
-0.100053072,
-0.2646154165,
0.2437917441,
0.2630293965,
-0.5339516997,
-0.0313142352,
0.2237599045,
-0.0698004961,
0.0458760746,
-0.2400446981,
0.0889872909,
-0.1897265464,
0.0984044969,
0.1639557183,
0.1306429654,
0.0151548758,
0.2536820471,
0.0861733928,
0.0687299967,
-0.3007565737,
0.3731757998,
-0.2901466787,
-0.0234653316,
-0.4769314229,
-0.1539030373,
-0.0954834223,
0.0382488966,
0.0169571415,
0.1846893281,
0.6731566787,
-0.0968768746,
0.2253451943,
0.3182925284,
0.1315828562,
0.2608218789,
-0.2110043317,
0.5256140828,
0.085005872,
-0.0524349436,
-0.0642039701,
0.0298820063,
0.2078828812,
-0.6180190444,
0.1823917031,
0.3520443141,
0.0247816071,
0.1011308506,
-0.5585082769,
-0.1874741912,
-0.420809865,
0.1441124827,
0.1162039489,
0.0702416301,
-0.0098983496,
0.1733296812,
0.2674608827,
0.2471818775,
-0.1467235237,
-0.3565670848,
-0.2359511256,
0.1969589591,
-0.2147191912,
0.0490038693,
-0.0045696869,
-0.0573172793,
0.1841363609,
0.0902260095,
0.0255614072,
-0.1761726439,
0.5448207855,
-0.2204471081,
-0.0769295543,
0.0257569235,
-0.3781879544,
-0.2873994112,
0.2297773361,
0.1334010065,
0.0374503136,
-0.1065509319,
-0.2053183913,
-0.339736104,
0.245522052,
0.1057561487,
0.3015189171,
-0.0896548331,
0.2021172345,
0.1027188599,
0.1186711043,
-0.1679326296,
0.1719893068,
-0.2009639442,
-0.1101605296,
0.0065555163,
0.0725459382,
-0.0001085326,
-0.1014669612,
0.1107224673,
0.1415620744,
0.1545991153,
0.0349534899,
0.4689892232,
-0.2500037849,
0.2202892452,
0.0234923288,
-0.3029282093,
0.528673172,
-0.9485340118,
-0.2145376801,
0.0507870018,
-0.1609340161,
-0.5299546719,
-0.1245955154,
-0.1013834327,
-0.14016518,
0.1376925707,
0.3832188845,
0.0421504676,
-0.1681723297,
-0.2119558156,
-0.2906454802,
0.0716141984,
0.2468958348,
0.1549105942,
0.0768484175,
0.0673316866,
0.0960902274,
-0.2710838318,
0.0719325766,
0.031280987,
-0.0309002474,
0.21320647,
-0.1415379792,
-0.1438583881,
0.4165872037,
-0.0793787912,
0.2842076123,
0.111213401,
-0.3692342341,
-0.1871005148,
-0.2376632988,
0.010988988,
-0.321739912,
0.0674757808,
-0.0672569126,
-0.0084718885,
0.0498080254,
0.0204174649,
0.2208720297,
0.1332468837,
-0.314958781,
-0.221074909,
-0.4542838931,
-0.0306472704,
-0.1260975599,
0.0629274398,
0.0274377,
-0.2461946607,
0.0837176889,
0.1811004877,
0.1607606709,
-0.0457624197,
-0.2479195297,
0.4671177566,
0.0894629359,
0.0979054421,
0.4763184488,
-0.1720581353,
0.1261307001,
-0.2955785096,
-0.3338620663,
0.2377319932,
0.3096677661,
0.0850774422,
0.1167071983,
0.2165963054,
-0.3274511695,
-0.0149450377,
-0.0602281615,
0.2954147756,
0.2199822068,
0.1660232395,
-0.1664329022,
0.0484349728,
0.1911973953,
-0.1303873956,
0.3926314116,
-0.2581726015,
-0.3667015135,
-0.2043807507,
0.181483686,
0.1247726828,
0.0273572467,
-0.2630419731,
0.2203620225,
0.289526403,
0.0579391271,
0.1685936302,
0.4083439708,
-0.0838987455,
0.0149456188,
0.2547497153,
-0.0987791568,
-0.1508772373,
0.0551874526,
0.0896368921,
0.1410101652,
0.3735878766,
0.20658499,
0.066329509,
-0.3672612906,
-0.1021529809,
0.2359736562,
0.2093067616,
-0.0221333206,
0.4142640233,
-0.0017026067,
0.2776545286,
-0.1761570275,
-0.1489999443,
-0.1172154695,
-0.1813317239,
0.0745565295,
0.0275700893,
-0.0813107938,
0.2263070047,
0.2115995884,
0.3438274264,
0.3297129273,
0.0593649372,
0.0696416348,
-0.1464842409,
-0.1094476432,
-0.1209726632,
-0.1135561168,
0.0799370557,
0.1879223585,
-0.0791876689,
0.2124753743,
-0.1500808597,
-0.0069222972,
-0.0119672026,
0.0123387612,
-0.0033973921,
0.3124579489,
-0.4193602204,
-0.6186662316,
0.1612169147,
0.385740906,
-0.1651764214,
-0.1729446352,
0.1802292913,
0.0099462569,
-0.0189915374,
-0.1486455798,
-0.3686355352,
-0.4087260664,
-0.3623013496,
0.3295705616,
0.0106304288,
-0.151791513,
0.1586494744,
0.0680437461,
0.1036148816,
0.2316938043,
-0.072649166,
0.0208136328,
-0.0302375779,
0.2664110959,
-0.2160870731,
-0.2734994292,
-0.0737787634,
-0.1136966869,
0.3361842036,
0.4906623065,
-0.1465971172,
-0.2178464532,
0.1189223751,
0.1014849469,
-0.5226739049,
0.1076531932,
0.5523732305,
-0.1283312887,
-0.0396113843,
-0.167928502,
0.3205575347,
0.4238614142,
-0.0645173267,
0.0601097792,
-0.1110718101,
0.1943630278,
0.1050654352,
0.7501339912,
0.2554731965,
-0.0024978623,
0.1955066621,
0.0408721343,
0.1205498874,
0.1490538716,
-0.0575667098,
0.2131000161,
0.132183224,
-0.1164221466,
-0.0873296857,
-0.1998210698,
0.0960989073,
-0.1304622591,
-0.0857842937,
0.1583220661,
0.0810298473,
-0.1173954904,
-0.7013341188,
0.5111745,
0.252673775,
0.1756167859,
-0.1056384966,
0.0872493535,
0.2066688389,
-0.1311405599,
0.409368515,
0.0597587079,
0.074431017,
-0.349694401,
-0.0763832182,
0.3868115544,
0.1509345323,
0.5822172761,
-0.0802687705,
0.21143049,
0.1476574838,
0.0032643154,
0.1764951646,
-0.4874756038,
-0.0024383105,
0.0024490803,
-0.3182802498,
-0.7047183514,
0.0278707147,
-0.2246012539,
0.206982255,
-0.0390860066,
0.2247408479,
-0.0905075222,
-0.0705308169,
0.0914267749,
-0.0246232711,
-0.2360410839,
-0.0437541045,
-0.4384620786,
-0.1007980704,
-0.0079878271,
0.3548947275,
0.2158469856,
0.0242721401,
0.2047317922,
0.0260864347,
-0.4139422178,
0.2024677247,
-0.1164640635,
-0.1058038324,
0.2377330959,
-0.2303407043,
-0.108623758,
0.1344453394,
0.0973666832,
0.1317892671,
0.3259263635,
0.1949754059,
-0.7740001082,
-0.088198185,
-0.0107650757,
0.0912961513,
0.1395355463,
-0.1086556315,
-0.0830894262,
-0.1653795838,
-0.0371895023,
-0.1362712979,
-0.0624065995,
0.2667119503,
-0.0437853113,
0.2141999006,
-0.0918838233,
0.3918713033,
0.068367824,
-0.0616464168,
0.6337690353,
-0.0885594189,
-0.1122552603,
0.0026831776,
0.4313241839,
0.6851307154,
0.3991339803,
-0.228653118,
0.5048454404,
-0.2030972838,
0.2501058578,
-0.2921942472,
0.355974704,
-0.3496274948,
-0.2911680639,
0.0264539197,
-0.1216659844,
0.0053273784,
0.112544015,
0.0203567222,
0.293941468,
-0.0912705213,
-0.285271734,
-0.0292424466,
0.000849247,
-0.2626026869,
-0.0537060425,
0.0257885084,
-0.0062578768,
-0.154360339,
-0.0253193565,
-0.1340198219,
-0.2451973855,
-0.1659551561,
-0.1010503322,
0.1498210877,
-0.0198500641,
-0.2273241878,
0.4922192395,
0.3724727035,
-0.1266026348,
-0.1150008589,
0.2851443887,
0.4384121299,
-0.0976753235,
-0.0905349627,
0.31263116,
-0.1261665076,
-0.208791241,
-0.1333058774,
0.2157017887,
0.0356788933,
-0.1506470144,
-0.2951406538,
0.0754734874,
-0.3010198772,
-0.2734057307,
0.0300635993,
0.0591304302,
-0.2400633246,
0.2178402543,
0.1667967886,
-0.1111844108,
-0.3868942261,
-0.0963923931,
0.0177950989,
0.1052126214,
-0.3568502963,
0.0832291618,
-0.0021255678,
-0.3438081741,
0.304222554,
0.3032567501,
-0.1221484989,
0.0216173455,
0.3507730663,
-0.2816230655,
0.0514554605,
-0.2250771224,
-0.0476727039,
0.181617111,
-0.3708329201,
0.3871158063,
-0.1974066049,
0.2173273563,
0.1861217767,
0.5506521463,
0.1998645067,
0.0305284075,
0.0216815174,
-0.380900681,
-0.2811960876,
0.0749436393,
-0.1057024598,
-0.1554641277,
-0.0670694187,
0.0661703944,
-0.063046135,
0.1681352407,
-0.171198234,
0.0182612017,
-0.4026429355,
0.1384999752,
-0.2358006537,
-0.1427794248,
-0.2918180227,
0.1429324597,
-0.1053561717,
0.474832803,
-0.1834553927,
-0.0251475256,
0.0208592713,
0.1604124606,
0.1491888762,
0.2476717234,
0.0554539412,
-0.0948637724,
-0.4114232063,
-0.0237531289,
-0.1994517595,
-0.1094006598,
-0.1021044031,
0.1169446781,
-0.1483196467,
0.2587158084,
-0.08256156,
-0.1026576832,
0.2263112217,
0.0078644082,
-0.4149519503,
-0.0237012357,
0.1375002265,
-0.1295652092,
-0.4047167301,
-0.0717707872,
-0.1746264249,
0.0327313282,
0.2356805056,
-0.3694476187,
-0.0406393744,
-0.3534324765,
0.3480404019,
0.4453604519,
-0.4016061723,
-0.082194142,
0.0196638964,
0.0796483979,
-0.2390368283,
0.034882836,
-0.0927760005,
0.4075060189,
-0.0362942517,
0.0683817044,
0.2137535065,
-0.3845719099,
-0.0951184332,
0.1793112904,
-0.1105756089,
-0.5657695532,
0.1834153533,
-0.0904027224,
-0.0515370555,
0.0342613012,
0.4253362119,
0.3455854654,
0.2773361802,
0.3787548542,
-0.1718463004,
0.3203333616,
0.2994098663,
-0.0327492468,
-0.0350371301,
-0.0182346478,
-0.3241395354,
-0.0569084585,
0.1216446087,
-0.2019241303,
0.1269882172,
0.4522325099,
-0.5222978592,
-0.5407586694,
-0.1329666823,
-0.1093747318,
-0.0489516035,
0.2321895063,
-0.5116029978,
-0.1040365696,
-0.0905236602,
-0.0982390866,
0.2170719206,
0.5896323919,
0.5254977942,
0.2307589501,
0.0998488888,
-0.4661765099,
-0.0672629625,
0.5261719823,
0.3482173979,
-0.3474154472,
0.0702709258,
0.2197811007,
-0.0478889421,
-0.183631435,
0.4682468176,
0.1106717438,
0.4221268296,
-0.3368284106,
-0.0574066825,
-0.0913393348,
0.0558165908,
0.1883187592,
0.4640161395,
0.4181195498,
-0.129423514,
0.4254340529,
-0.080789417,
-0.0253634658,
0.341093719,
-0.1127058044,
0.5322978497,
-0.1863977462,
0.2962701321,
-0.1552475691,
0.1272640824,
-0.123293668,
0.0020005666,
-0.5153282285,
0.3860243559,
-0.2916573882,
0.0725814104,
0.0253605954,
-0.2714993656,
0.0217010379,
-0.2019304931,
0.411830157,
0.2544834316,
-0.0213221777,
-0.1645656526,
-0.3641456664,
-0.3480191529,
0.1979252845,
-0.034725897,
-0.0323350094,
0.0377679542,
0.2458747029,
-0.0899565667,
0.2194444686,
0.013613224,
0.3184031844,
0.1633765996,
0.41916278,
-0.4301504195,
-0.0815249607,
-0.3555105329,
-0.1731370986,
0.2258822173,
-0.1960834861,
0.0569190457,
-0.0774388835,
-0.0041140169,
0.2366613895,
-0.4406209588,
0.0182623416,
0.4891802371,
0.4408986866,
0.1522385776,
0.2984648645,
0.0991085842,
0.0148842633,
-0.1326737553,
-0.3605557084,
0.0081424378,
0.3292992711,
0.0611586869,
0.3952177763,
-0.1376784742,
0.1447614878,
-0.2830951214,
0.3723944426,
-0.1469096392,
-0.0667034388,
-0.0228484198,
0.0460147187,
-0.0089648142,
0.1270910949,
-0.0131383464,
0.1258745193,
0.1562906057,
0.0732178167,
-0.2892946005,
-0.2071134746,
0.5095154643,
-0.1520722061,
-0.4998913705,
0.1289651096,
0.3653585017,
0.8499706388,
-0.1110302135,
-0.4012587965,
0.0702341348,
0.1562006772,
-0.04534363,
-0.0882447213,
-0.0701677203,
0.0586662814,
0.1032952517,
0.1094178706,
0.0976529792,
-0.0599054694,
0.0193276852,
0.2159416974,
-0.1393846869
] |
https://github.com/huggingface/datasets/issues/568 | `metric.compute` throws `ArrowInvalid` error | Closing this one as it was fixed in #654
Feel free to re-open if you have other questions | I get the following error with `rouge.compute`. It happens only with distributed training, and it occurs randomly, so I can't easily reproduce it. This is using `nlp==0.4.0`
```
File "/home/beltagy/trainer.py", line 92, in validation_step
rouge_scores = rouge.compute(predictions=generated_str, references=gold_str, rouge_types=['rouge2', 'rouge1', 'rougeL'])
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 224, in compute
self.finalize(timeout=timeout)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 213, in finalize
self.data = Dataset(**reader.read_files(node_files))
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 217, in read_files
dataset_kwargs = self._read_files(files=files, info=self._info, original_instructions=original_instructions)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 162, in _read_files
pa_table: pa.Table = self._get_dataset_from_filename(f_dict)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 276, in _get_dataset_from_filename
f = pa.ipc.open_stream(mmap)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 173, in open_stream
return RecordBatchStreamReader(source)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 64, in __init__
self._open(source)
File "pyarrow/ipc.pxi", line 469, in pyarrow.lib._RecordBatchStreamReader._open
File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: Tried reading schema message, was null or length 0
``` | 18 | `metric.compute` throws `ArrowInvalid` error
I get the following error with `rouge.compute`. It happens only with distributed training, and it occurs randomly, so I can't easily reproduce it. This is using `nlp==0.4.0`
```
File "/home/beltagy/trainer.py", line 92, in validation_step
rouge_scores = rouge.compute(predictions=generated_str, references=gold_str, rouge_types=['rouge2', 'rouge1', 'rougeL'])
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 224, in compute
self.finalize(timeout=timeout)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/metric.py", line 213, in finalize
self.data = Dataset(**reader.read_files(node_files))
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 217, in read_files
dataset_kwargs = self._read_files(files=files, info=self._info, original_instructions=original_instructions)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 162, in _read_files
pa_table: pa.Table = self._get_dataset_from_filename(f_dict)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/nlp/arrow_reader.py", line 276, in _get_dataset_from_filename
f = pa.ipc.open_stream(mmap)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 173, in open_stream
return RecordBatchStreamReader(source)
File "/home/beltagy/miniconda3/envs/allennlp/lib/python3.7/site-packages/pyarrow/ipc.py", line 64, in __init__
self._open(source)
File "pyarrow/ipc.pxi", line 469, in pyarrow.lib._RecordBatchStreamReader._open
File "pyarrow/error.pxi", line 122, in pyarrow.lib.pyarrow_internal_check_status
File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: Tried reading schema message, was null or length 0
```
Closing this one as it was fixed in #654
Feel free to re-open if you have other questions | [
-0.4075995088,
-0.2417834401,
0.0564271882,
0.2862924635,
0.3149878681,
-0.1689800173,
-0.1591831297,
0.2923863828,
-0.1545248926,
0.404512465,
0.052560769,
0.5000802279,
-0.110934034,
-0.3641475439,
-0.094238624,
-0.1911459416,
-0.132249862,
0.0535938777,
0.062323276,
-0.27793926,
-0.3765875399,
0.0985769331,
-0.213160187,
0.3267943859,
-0.1932070553,
-0.352106452,
0.1751868874,
0.1816707104,
-0.3589943647,
-0.5263903737,
0.270960331,
-0.2937383056,
0.0037994012,
0.1183883101,
-0.0001191874,
0.1296565831,
0.3363070786,
-0.0975791886,
-0.0041522235,
-0.1153502315,
-0.0072664618,
-0.0410662368,
0.3097949624,
-0.1023336351,
0.1306816339,
-0.1691229492,
-0.2776664793,
-0.2377287,
0.2970283926,
0.3777066171,
0.0835813582,
0.2802878022,
-0.0036600381,
-0.2270234972,
-0.1957391202,
-0.2623838186,
-0.0339899026,
0.7621634007,
0.0858869031,
-0.2124260813,
0.0804969147,
0.2712042034,
0.3043485582,
0.1928762496,
0.4343168736,
-0.2398330122,
0.2804590166,
-0.0029740194,
-0.1896128356,
0.1256078333,
-0.0713662803,
-0.100053072,
-0.2646154165,
0.2437917441,
0.2630293965,
-0.5339516997,
-0.0313142352,
0.2237599045,
-0.0698004961,
0.0458760746,
-0.2400446981,
0.0889872909,
-0.1897265464,
0.0984044969,
0.1639557183,
0.1306429654,
0.0151548758,
0.2536820471,
0.0861733928,
0.0687299967,
-0.3007565737,
0.3731757998,
-0.2901466787,
-0.0234653316,
-0.4769314229,
-0.1539030373,
-0.0954834223,
0.0382488966,
0.0169571415,
0.1846893281,
0.6731566787,
-0.0968768746,
0.2253451943,
0.3182925284,
0.1315828562,
0.2608218789,
-0.2110043317,
0.5256140828,
0.085005872,
-0.0524349436,
-0.0642039701,
0.0298820063,
0.2078828812,
-0.6180190444,
0.1823917031,
0.3520443141,
0.0247816071,
0.1011308506,
-0.5585082769,
-0.1874741912,
-0.420809865,
0.1441124827,
0.1162039489,
0.0702416301,
-0.0098983496,
0.1733296812,
0.2674608827,
0.2471818775,
-0.1467235237,
-0.3565670848,
-0.2359511256,
0.1969589591,
-0.2147191912,
0.0490038693,
-0.0045696869,
-0.0573172793,
0.1841363609,
0.0902260095,
0.0255614072,
-0.1761726439,
0.5448207855,
-0.2204471081,
-0.0769295543,
0.0257569235,
-0.3781879544,
-0.2873994112,
0.2297773361,
0.1334010065,
0.0374503136,
-0.1065509319,
-0.2053183913,
-0.339736104,
0.245522052,
0.1057561487,
0.3015189171,
-0.0896548331,
0.2021172345,
0.1027188599,
0.1186711043,
-0.1679326296,
0.1719893068,
-0.2009639442,
-0.1101605296,
0.0065555163,
0.0725459382,
-0.0001085326,
-0.1014669612,
0.1107224673,
0.1415620744,
0.1545991153,
0.0349534899,
0.4689892232,
-0.2500037849,
0.2202892452,
0.0234923288,
-0.3029282093,
0.528673172,
-0.9485340118,
-0.2145376801,
0.0507870018,
-0.1609340161,
-0.5299546719,
-0.1245955154,
-0.1013834327,
-0.14016518,
0.1376925707,
0.3832188845,
0.0421504676,
-0.1681723297,
-0.2119558156,
-0.2906454802,
0.0716141984,
0.2468958348,
0.1549105942,
0.0768484175,
0.0673316866,
0.0960902274,
-0.2710838318,
0.0719325766,
0.031280987,
-0.0309002474,
0.21320647,
-0.1415379792,
-0.1438583881,
0.4165872037,
-0.0793787912,
0.2842076123,
0.111213401,
-0.3692342341,
-0.1871005148,
-0.2376632988,
0.010988988,
-0.321739912,
0.0674757808,
-0.0672569126,
-0.0084718885,
0.0498080254,
0.0204174649,
0.2208720297,
0.1332468837,
-0.314958781,
-0.221074909,
-0.4542838931,
-0.0306472704,
-0.1260975599,
0.0629274398,
0.0274377,
-0.2461946607,
0.0837176889,
0.1811004877,
0.1607606709,
-0.0457624197,
-0.2479195297,
0.4671177566,
0.0894629359,
0.0979054421,
0.4763184488,
-0.1720581353,
0.1261307001,
-0.2955785096,
-0.3338620663,
0.2377319932,
0.3096677661,
0.0850774422,
0.1167071983,
0.2165963054,
-0.3274511695,
-0.0149450377,
-0.0602281615,
0.2954147756,
0.2199822068,
0.1660232395,
-0.1664329022,
0.0484349728,
0.1911973953,
-0.1303873956,
0.3926314116,
-0.2581726015,
-0.3667015135,
-0.2043807507,
0.181483686,
0.1247726828,
0.0273572467,
-0.2630419731,
0.2203620225,
0.289526403,
0.0579391271,
0.1685936302,
0.4083439708,
-0.0838987455,
0.0149456188,
0.2547497153,
-0.0987791568,
-0.1508772373,
0.0551874526,
0.0896368921,
0.1410101652,
0.3735878766,
0.20658499,
0.066329509,
-0.3672612906,
-0.1021529809,
0.2359736562,
0.2093067616,
-0.0221333206,
0.4142640233,
-0.0017026067,
0.2776545286,
-0.1761570275,
-0.1489999443,
-0.1172154695,
-0.1813317239,
0.0745565295,
0.0275700893,
-0.0813107938,
0.2263070047,
0.2115995884,
0.3438274264,
0.3297129273,
0.0593649372,
0.0696416348,
-0.1464842409,
-0.1094476432,
-0.1209726632,
-0.1135561168,
0.0799370557,
0.1879223585,
-0.0791876689,
0.2124753743,
-0.1500808597,
-0.0069222972,
-0.0119672026,
0.0123387612,
-0.0033973921,
0.3124579489,
-0.4193602204,
-0.6186662316,
0.1612169147,
0.385740906,
-0.1651764214,
-0.1729446352,
0.1802292913,
0.0099462569,
-0.0189915374,
-0.1486455798,
-0.3686355352,
-0.4087260664,
-0.3623013496,
0.3295705616,
0.0106304288,
-0.151791513,
0.1586494744,
0.0680437461,
0.1036148816,
0.2316938043,
-0.072649166,
0.0208136328,
-0.0302375779,
0.2664110959,
-0.2160870731,
-0.2734994292,
-0.0737787634,
-0.1136966869,
0.3361842036,
0.4906623065,
-0.1465971172,
-0.2178464532,
0.1189223751,
0.1014849469,
-0.5226739049,
0.1076531932,
0.5523732305,
-0.1283312887,
-0.0396113843,
-0.167928502,
0.3205575347,
0.4238614142,
-0.0645173267,
0.0601097792,
-0.1110718101,
0.1943630278,
0.1050654352,
0.7501339912,
0.2554731965,
-0.0024978623,
0.1955066621,
0.0408721343,
0.1205498874,
0.1490538716,
-0.0575667098,
0.2131000161,
0.132183224,
-0.1164221466,
-0.0873296857,
-0.1998210698,
0.0960989073,
-0.1304622591,
-0.0857842937,
0.1583220661,
0.0810298473,
-0.1173954904,
-0.7013341188,
0.5111745,
0.252673775,
0.1756167859,
-0.1056384966,
0.0872493535,
0.2066688389,
-0.1311405599,
0.409368515,
0.0597587079,
0.074431017,
-0.349694401,
-0.0763832182,
0.3868115544,
0.1509345323,
0.5822172761,
-0.0802687705,
0.21143049,
0.1476574838,
0.0032643154,
0.1764951646,
-0.4874756038,
-0.0024383105,
0.0024490803,
-0.3182802498,
-0.7047183514,
0.0278707147,
-0.2246012539,
0.206982255,
-0.0390860066,
0.2247408479,
-0.0905075222,
-0.0705308169,
0.0914267749,
-0.0246232711,
-0.2360410839,
-0.0437541045,
-0.4384620786,
-0.1007980704,
-0.0079878271,
0.3548947275,
0.2158469856,
0.0242721401,
0.2047317922,
0.0260864347,
-0.4139422178,
0.2024677247,
-0.1164640635,
-0.1058038324,
0.2377330959,
-0.2303407043,
-0.108623758,
0.1344453394,
0.0973666832,
0.1317892671,
0.3259263635,
0.1949754059,
-0.7740001082,
-0.088198185,
-0.0107650757,
0.0912961513,
0.1395355463,
-0.1086556315,
-0.0830894262,
-0.1653795838,
-0.0371895023,
-0.1362712979,
-0.0624065995,
0.2667119503,
-0.0437853113,
0.2141999006,
-0.0918838233,
0.3918713033,
0.068367824,
-0.0616464168,
0.6337690353,
-0.0885594189,
-0.1122552603,
0.0026831776,
0.4313241839,
0.6851307154,
0.3991339803,
-0.228653118,
0.5048454404,
-0.2030972838,
0.2501058578,
-0.2921942472,
0.355974704,
-0.3496274948,
-0.2911680639,
0.0264539197,
-0.1216659844,
0.0053273784,
0.112544015,
0.0203567222,
0.293941468,
-0.0912705213,
-0.285271734,
-0.0292424466,
0.000849247,
-0.2626026869,
-0.0537060425,
0.0257885084,
-0.0062578768,
-0.154360339,
-0.0253193565,
-0.1340198219,
-0.2451973855,
-0.1659551561,
-0.1010503322,
0.1498210877,
-0.0198500641,
-0.2273241878,
0.4922192395,
0.3724727035,
-0.1266026348,
-0.1150008589,
0.2851443887,
0.4384121299,
-0.0976753235,
-0.0905349627,
0.31263116,
-0.1261665076,
-0.208791241,
-0.1333058774,
0.2157017887,
0.0356788933,
-0.1506470144,
-0.2951406538,
0.0754734874,
-0.3010198772,
-0.2734057307,
0.0300635993,
0.0591304302,
-0.2400633246,
0.2178402543,
0.1667967886,
-0.1111844108,
-0.3868942261,
-0.0963923931,
0.0177950989,
0.1052126214,
-0.3568502963,
0.0832291618,
-0.0021255678,
-0.3438081741,
0.304222554,
0.3032567501,
-0.1221484989,
0.0216173455,
0.3507730663,
-0.2816230655,
0.0514554605,
-0.2250771224,
-0.0476727039,
0.181617111,
-0.3708329201,
0.3871158063,
-0.1974066049,
0.2173273563,
0.1861217767,
0.5506521463,
0.1998645067,
0.0305284075,
0.0216815174,
-0.380900681,
-0.2811960876,
0.0749436393,
-0.1057024598,
-0.1554641277,
-0.0670694187,
0.0661703944,
-0.063046135,
0.1681352407,
-0.171198234,
0.0182612017,
-0.4026429355,
0.1384999752,
-0.2358006537,
-0.1427794248,
-0.2918180227,
0.1429324597,
-0.1053561717,
0.474832803,
-0.1834553927,
-0.0251475256,
0.0208592713,
0.1604124606,
0.1491888762,
0.2476717234,
0.0554539412,
-0.0948637724,
-0.4114232063,
-0.0237531289,
-0.1994517595,
-0.1094006598,
-0.1021044031,
0.1169446781,
-0.1483196467,
0.2587158084,
-0.08256156,
-0.1026576832,
0.2263112217,
0.0078644082,
-0.4149519503,
-0.0237012357,
0.1375002265,
-0.1295652092,
-0.4047167301,
-0.0717707872,
-0.1746264249,
0.0327313282,
0.2356805056,
-0.3694476187,
-0.0406393744,
-0.3534324765,
0.3480404019,
0.4453604519,
-0.4016061723,
-0.082194142,
0.0196638964,
0.0796483979,
-0.2390368283,
0.034882836,
-0.0927760005,
0.4075060189,
-0.0362942517,
0.0683817044,
0.2137535065,
-0.3845719099,
-0.0951184332,
0.1793112904,
-0.1105756089,
-0.5657695532,
0.1834153533,
-0.0904027224,
-0.0515370555,
0.0342613012,
0.4253362119,
0.3455854654,
0.2773361802,
0.3787548542,
-0.1718463004,
0.3203333616,
0.2994098663,
-0.0327492468,
-0.0350371301,
-0.0182346478,
-0.3241395354,
-0.0569084585,
0.1216446087,
-0.2019241303,
0.1269882172,
0.4522325099,
-0.5222978592,
-0.5407586694,
-0.1329666823,
-0.1093747318,
-0.0489516035,
0.2321895063,
-0.5116029978,
-0.1040365696,
-0.0905236602,
-0.0982390866,
0.2170719206,
0.5896323919,
0.5254977942,
0.2307589501,
0.0998488888,
-0.4661765099,
-0.0672629625,
0.5261719823,
0.3482173979,
-0.3474154472,
0.0702709258,
0.2197811007,
-0.0478889421,
-0.183631435,
0.4682468176,
0.1106717438,
0.4221268296,
-0.3368284106,
-0.0574066825,
-0.0913393348,
0.0558165908,
0.1883187592,
0.4640161395,
0.4181195498,
-0.129423514,
0.4254340529,
-0.080789417,
-0.0253634658,
0.341093719,
-0.1127058044,
0.5322978497,
-0.1863977462,
0.2962701321,
-0.1552475691,
0.1272640824,
-0.123293668,
0.0020005666,
-0.5153282285,
0.3860243559,
-0.2916573882,
0.0725814104,
0.0253605954,
-0.2714993656,
0.0217010379,
-0.2019304931,
0.411830157,
0.2544834316,
-0.0213221777,
-0.1645656526,
-0.3641456664,
-0.3480191529,
0.1979252845,
-0.034725897,
-0.0323350094,
0.0377679542,
0.2458747029,
-0.0899565667,
0.2194444686,
0.013613224,
0.3184031844,
0.1633765996,
0.41916278,
-0.4301504195,
-0.0815249607,
-0.3555105329,
-0.1731370986,
0.2258822173,
-0.1960834861,
0.0569190457,
-0.0774388835,
-0.0041140169,
0.2366613895,
-0.4406209588,
0.0182623416,
0.4891802371,
0.4408986866,
0.1522385776,
0.2984648645,
0.0991085842,
0.0148842633,
-0.1326737553,
-0.3605557084,
0.0081424378,
0.3292992711,
0.0611586869,
0.3952177763,
-0.1376784742,
0.1447614878,
-0.2830951214,
0.3723944426,
-0.1469096392,
-0.0667034388,
-0.0228484198,
0.0460147187,
-0.0089648142,
0.1270910949,
-0.0131383464,
0.1258745193,
0.1562906057,
0.0732178167,
-0.2892946005,
-0.2071134746,
0.5095154643,
-0.1520722061,
-0.4998913705,
0.1289651096,
0.3653585017,
0.8499706388,
-0.1110302135,
-0.4012587965,
0.0702341348,
0.1562006772,
-0.04534363,
-0.0882447213,
-0.0701677203,
0.0586662814,
0.1032952517,
0.1094178706,
0.0976529792,
-0.0599054694,
0.0193276852,
0.2159416974,
-0.1393846869
] |
https://github.com/huggingface/datasets/issues/565 | No module named 'nlp.logging' | Thanks for reporting.
Apparently this is a versioning issue: the lib downloaded the `bleurt` script from the master branch, where we made this change recently. We'll fix that in a new release this week or early next week. Cc @thomwolf
Until then, I'd suggest that you download the right bleurt folder from GitHub ([this one](https://github.com/huggingface/nlp/tree/0.4.0/metrics/bleurt)) and do
```python
from nlp import load_metric
bleurt = load_metric("path/to/bleurt/folder")
```
To download it you can either clone the repo or download the `bleurt.py` file and place it in a folder named `bleurt` | Hi, I am using nlp version 0.4.0. I am trying to use bleurt as an eval metric; however, the bleurt script imports nlp.logging, which creates the following error. What am I missing?
```
>>> import nlp
2020-09-02 13:47:09.210310: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> bleurt = nlp.load_metric("bleurt")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 443, in load_metric
metric_cls = import_main_class(module_path, dataset=False)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 61, in import_main_class
module = importlib.import_module(module_path)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/metrics/bleurt/43448cf2959ea81d3ae0e71c5c8ee31dc15eed9932f197f5f50673cbcecff2b5/bleurt.py", line 20, in <module>
from nlp.logging import get_logger
ModuleNotFoundError: No module named 'nlp.logging'
```
Just to show once again that I can't import the logging module:
```
>>> import nlp
2020-09-02 13:48:38.190621: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> nlp.__version__
'0.4.0'
>>> from nlp.logging import get_logger
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'nlp.logging'
``` | 88 | No module named 'nlp.logging'
Hi, I am using nlp version 0.4.0. I am trying to use bleurt as an eval metric; however, the bleurt script imports nlp.logging, which creates the following error. What am I missing?
```
>>> import nlp
2020-09-02 13:47:09.210310: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> bleurt = nlp.load_metric("bleurt")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 443, in load_metric
metric_cls = import_main_class(module_path, dataset=False)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 61, in import_main_class
module = importlib.import_module(module_path)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/metrics/bleurt/43448cf2959ea81d3ae0e71c5c8ee31dc15eed9932f197f5f50673cbcecff2b5/bleurt.py", line 20, in <module>
from nlp.logging import get_logger
ModuleNotFoundError: No module named 'nlp.logging'
```
Just to show once again that I can't import the logging module:
```
>>> import nlp
2020-09-02 13:48:38.190621: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> nlp.__version__
'0.4.0'
>>> from nlp.logging import get_logger
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'nlp.logging'
```
Thanks for reporting.
Apparently this is a versioning issue: the lib downloaded the `bleurt` script from the master branch, where we made this change recently. We'll fix that in a new release this week or early next week. Cc @thomwolf
Until then, I'd suggest that you download the right bleurt folder from GitHub ([this one](https://github.com/huggingface/nlp/tree/0.4.0/metrics/bleurt)) and do
```python
from nlp import load_metric
bleurt = load_metric("path/to/bleurt/folder")
```
To download it you can either clone the repo or download the `bleurt.py` file and place it in a folder named `bleurt` | [
0.0177447423,
-0.3961700797,
-0.020808693,
-0.2511642575,
0.2170931995,
-0.0493597016,
0.2021474391,
0.3259569407,
0.0938875079,
-0.0865904689,
0.1572887897,
0.3769111633,
-0.6784248352,
0.1394218504,
0.266335994,
-0.12211252,
-0.1415551603,
0.1301273257,
0.4967285991,
-0.0840893388,
-0.1551875025,
0.2894214094,
-0.1251254231,
0.1515606046,
-0.2375950068,
0.1572917253,
0.1200009286,
-0.0619186759,
-0.0776275992,
-0.3836638331,
0.0709403008,
-0.3669740856,
0.3694148958,
0.0429326966,
-0.0001146313,
-0.2010704577,
0.3691684604,
-0.0789470449,
-0.3076004088,
-0.4952629209,
0.0295941681,
-0.5176200271,
0.2748708725,
-0.1576721817,
0.1493027359,
-0.2864572406,
0.236662358,
-0.3097524941,
0.1938051879,
0.4161175191,
0.2068959922,
-0.0221976228,
-0.0768315569,
0.0798193589,
0.0021626409,
-0.0918753371,
-0.0400300622,
0.3646021485,
0.168252632,
-0.2020519376,
-0.2094081938,
0.0075307786,
0.0123418346,
-0.0307615437,
0.4454036355,
0.0410988405,
0.4342163801,
0.074441053,
0.1132745743,
0.0331304595,
0.3260389566,
-0.2414315194,
0.212452963,
0.2793805003,
0.2856969833,
-0.6526635289,
0.0322942175,
0.3493676782,
-0.1642587781,
-0.197570473,
-0.3211483359,
-0.1605102122,
-0.0767731518,
0.2774336934,
-0.0267948285,
0.2369682938,
-0.1171351373,
-0.0183081776,
0.0479584374,
0.0457222573,
0.0512731485,
0.3074578941,
0.057945542,
0.2783441842,
-0.0295916498,
0.0939965397,
0.3766543567,
0.0809350461,
-0.3641400039,
0.0373017378,
-0.079293482,
-0.2028974295,
0.0826360881,
0.1005701423,
0.1280951351,
0.482413888,
0.580224812,
0.0861539841,
0.1211155951,
0.1061830968,
0.0224337727,
0.1806165278,
0.0178395286,
-0.2114957571,
-0.0291193798,
-0.0329338312,
-0.0712679997,
-0.0087653175,
-0.225730598,
-0.0556233525,
-0.226583153,
-0.0307782758,
0.1380365938,
0.3203542531,
-0.2769456506,
-0.3906171024,
0.5350921154,
0.1231341511,
-0.2232646346,
-0.0321768299,
-0.1262368709,
0.5046452284,
-0.4369258285,
-0.1370982528,
0.1434805095,
-0.079778634,
0.4327976704,
-0.1558859944,
-0.1006637812,
-0.2113870382,
-0.0947680995,
0.2198672444,
-0.0771774203,
-0.0099428501,
-0.0664816722,
-0.3026093841,
0.0548938662,
-0.166654557,
-0.1033994406,
-0.0218892321,
-0.1572465152,
-0.314260453,
-0.2244265676,
0.1509291828,
-0.339389205,
-0.2686005831,
0.2011346817,
0.0782152712,
-0.0370585546,
-0.382678926,
-0.1565063298,
-0.1386647373,
0.0589940399,
-0.0779407248,
0.1836319864,
-0.0987103879,
0.0726181418,
-0.3945175707,
-0.3643251956,
0.2603408694,
0.3422808647,
-0.1949789524,
-0.2013522238,
0.2731038928,
-0.102845259,
-0.1197199449,
0.625600636,
-0.2656615078,
0.2711015642,
-0.2094728947,
-0.0848326683,
-0.312979579,
-0.0258243419,
-0.0342768952,
-0.1933759153,
-0.0780977458,
0.3543559611,
0.2127126455,
-0.0401688404,
-0.2943751812,
-0.1962094754,
-0.1480364501,
0.127735272,
0.1170277148,
0.3017060161,
0.0429290421,
0.1312471479,
0.5985457897,
-0.0901136175,
-0.0685132295,
-0.0750641674,
0.1411293149,
0.1807709634,
-0.2709914446,
0.1386938095,
-0.1707384884,
0.11958538,
-0.1768054366,
-0.3409103751,
0.0959559605,
0.0197771769,
0.2479204535,
-0.1413605511,
-0.123960048,
0.1175826788,
-0.3142854571,
0.103029348,
0.0294076856,
-0.1288950741,
0.3554204106,
-0.3825531006,
-0.0465956666,
-0.2218794525,
-0.1060576364,
-0.1710864007,
-0.1487459242,
-0.0094743166,
-0.225369066,
0.1783795953,
0.6324030161,
-0.0850066245,
0.23469688,
-0.0037055556,
-0.1035135984,
-0.0135539304,
0.267334491,
0.0565779246,
-0.0577415079,
-0.04824505,
-0.0602157377,
0.1752787381,
0.0162846372,
-0.0962974057,
-0.0791668072,
0.4494023621,
0.0644366145,
0.2025587261,
0.0629724562,
0.1511247605,
0.2264125645,
-0.0125720873,
-0.251065582,
0.0872601941,
-0.1092126593,
0.8314300776,
-0.1710356772,
0.2515508533,
-0.2746801078,
0.0517181605,
-0.2004824579,
0.6626396179,
0.2042044252,
0.0820649564,
0.1802143455,
-0.2810434103,
-0.0692387447,
0.1406126469,
0.0014659464,
0.3870842457,
0.2310895771,
-0.0509548783,
0.2067660242,
-0.1034538448,
-0.1705688089,
0.1831773669,
-0.0443812646,
-0.2436571866,
0.0247677565,
0.1182229668,
-0.1303318143,
-0.148140654,
-0.1530949473,
-0.1233130097,
0.1996000409,
-0.0350606404,
0.1603681445,
-0.3734544814,
-0.1339824796,
-0.2300897092,
-0.1811048836,
-0.2294768691,
-0.2983359098,
0.0284396522,
0.0945010856,
0.0437804013,
0.0856080055,
0.4398252964,
0.2297271639,
-0.2515345216,
-0.164912641,
-0.0883318782,
-0.4656352103,
-0.468177855,
0.0709280074,
0.1150502115,
0.231340304,
0.2546291053,
-0.1552509218,
0.1433706135,
-0.0437560007,
-0.5664675236,
-0.0181738716,
0.0860511139,
0.0709255114,
0.3076510131,
-0.0665008277,
-0.1981247813,
-0.30417189,
0.1178149879,
-0.2248548418,
0.0074068233,
0.2838935554,
-0.1443242431,
-0.0391860604,
-0.3543542922,
-0.378294915,
-0.226476714,
-0.5012185574,
0.0852806643,
0.0689317361,
-0.0958224535,
0.4386115968,
0.0011936165,
0.546356082,
-0.0049779676,
0.2645167112,
0.0071173199,
0.3665677309,
0.1443116814,
-0.2059244961,
-0.3020400107,
0.1247389689,
-0.123569876,
0.3829788268,
0.0842033327,
-0.3999667168,
-0.3124958575,
0.3001258075,
-0.4379544258,
0.0871433392,
0.1368220896,
0.223835066,
0.0173717588,
-0.0351180173,
-0.1492093056,
-0.1678653061,
0.1437754333,
0.1370408535,
0.7387798429,
-0.0642650574,
0.2167179286,
0.0948873982,
0.3520078361,
0.1043618768,
-0.3068550229,
0.073493652,
0.044195503,
0.2859509289,
0.1100363731,
-0.2359178811,
0.1601349562,
0.0701999888,
-0.1433646679,
0.3142092824,
-0.0822947621,
-0.2094070315,
-0.3659451008,
-0.0670050532,
0.1722669005,
-0.0579463691,
0.022408111,
-0.0850384161,
0.2522851825,
-0.0257229023,
-0.1219103634,
-0.3413434029,
-0.13929528,
0.0958548933,
0.2691056132,
-0.0436147973,
0.1610606313,
0.2585426867,
-0.1735364795,
-0.4186550379,
0.3763267398,
-0.0102597885,
0.075320363,
-0.1960622221,
-0.0525870547,
0.092378974,
0.1480779946,
-0.44502455,
0.0221822802,
-0.0322812051,
0.1799475551,
0.2203017622,
-0.420676738,
0.0985986739,
-0.0531908534,
0.1326412559,
0.6327111125,
-0.0288598686,
-0.1421376765,
0.0016262159,
0.3982035816,
0.0126885809,
0.0350449979,
-0.2267850041,
-0.4677222371,
-0.3101117015,
0.0240110382,
-0.4039106667,
-0.1081058159,
0.1898397356,
0.4054774046,
0.2305119336,
0.1628720611,
-0.2111326903,
0.2652072012,
-0.1094020531,
-0.0186428092,
-0.2338204235,
-0.1241569817,
-0.1765096486,
0.0746416599,
-0.2352045774,
-0.0406515226,
-0.1481790841,
-0.4993693531,
0.0345392264,
0.0694437623,
0.3608239889,
-0.0557454377,
-0.1037611514,
0.116895318,
-0.2295796573,
0.0434440225,
0.1421993971,
0.3446508944,
0.2187674642,
0.2476874888,
0.1366507411,
-0.0933893844,
0.3914398849,
-0.0074758343,
0.2628163099,
0.6665164232,
-0.2765218914,
-0.2091831565,
0.0268201232,
0.0259914808,
0.6081100106,
0.398050487,
-0.1217547059,
0.5636967421,
-0.0332953557,
0.3709041178,
-0.2529838383,
0.3598001003,
-0.2083512396,
0.2281369716,
0.1237296239,
-0.2150112987,
0.1339335889,
-0.3755696416,
-0.0285156742,
0.118047297,
-0.1512494236,
-0.1804248691,
0.0978514031,
0.2777609229,
-0.0858836025,
0.0670809075,
-0.3402469456,
0.1049265414,
-0.5141493678,
0.3371419609,
-0.1234820187,
-0.1297786832,
0.0850419328,
-0.0571585074,
-0.0586515144,
0.2024146914,
0.1764391065,
0.2453874797,
0.1791049242,
0.0015669912,
0.1329323202,
0.3271037936,
0.4160194397,
-0.1411470473,
-0.1413771659,
0.1801224351,
-0.2844994664,
-0.095366925,
0.2117823362,
0.2493563294,
0.3500953913,
-0.3208933473,
0.0167742968,
0.0921133757,
0.0947501361,
-0.3090744615,
0.2534044683,
0.3062737286,
-0.0042510061,
0.3142987192,
0.0549890064,
0.4683717489,
0.0037167668,
0.2038834691,
0.1448802203,
0.1391283274,
-0.188737154,
0.1346671879,
-0.0655602217,
-0.241187647,
0.0385502391,
0.1038154066,
0.0686358809,
-0.0115709677,
0.3635195494,
-0.1714936197,
-0.1635910571,
-0.0536272079,
-0.516215682,
-0.0766496137,
-0.2516143024,
0.2087754309,
0.2589107156,
-0.1144478023,
0.3502961397,
0.3396167159,
0.0072799213,
-0.0035638027,
-0.1966382563,
-0.1979462206,
-0.3698655963,
0.3572336435,
-0.4561418593,
-0.082159698,
0.0321729481,
0.5907013416,
0.0789091438,
0.0997670144,
-0.2613566518,
-0.1251947582,
-0.1099204049,
0.1504794508,
0.1698432714,
-0.3847261071,
0.1303612143,
0.2244988084,
0.0167800188,
0.039007917,
-0.3336630166,
-0.1411159784,
-0.2641183436,
0.1118496358,
-0.0036450233,
0.2566158175,
-0.0333224833,
-0.2517703474,
-0.2034108639,
-0.018271707,
-0.1530154049,
0.1000023782,
-0.1967068464,
0.1052413732,
-0.3052289486,
0.1879134625,
0.2350841761,
-0.2528903484,
0.0142704733,
0.1697072387,
0.0677078068,
-0.0516889878,
0.1439800113,
0.1308485568,
-0.2021969855,
-0.140556097,
-0.051432088,
0.0384289138,
0.272610724,
0.0369228795,
0.0304566324,
-0.5750378966,
-0.2175091058,
0.2021582723,
-0.0762686729,
-0.0241155475,
0.1872988641,
0.0907592252,
-0.2569817603,
0.2025824487,
-0.0290614069,
0.3365929425,
-0.2204918563,
-0.0248665288,
0.1196048781,
0.0173411071,
0.3703122437,
0.0830853581,
-0.1299898326,
-0.6242675185,
0.1826904714,
-0.3209092617,
-0.2156855762,
0.0443556458,
0.1597317606,
0.2159295529,
0.1625346839,
-0.1061164141,
-0.1998497993,
0.2159614265,
-0.1669278294,
-0.0119242184,
0.0806013793,
-0.0823677033,
-0.1484624743,
0.2061640173,
0.0952179134,
0.166023016,
-0.1503379196,
0.5230818391,
-0.2694709599,
0.1774446219,
-0.1627532244,
0.2485533506,
-0.0211531706,
0.3340613842,
0.3533409834,
0.1310349703,
-0.0641064793,
-0.0746059939,
0.3371084929,
0.1211458147,
-0.2316766679,
0.2225828171,
-0.2420721054,
0.0484225601,
-0.4349016547,
0.2174108922,
0.1711486578,
-0.0655465424,
-0.0372692198,
-0.1865080297,
0.0532959253,
-0.0720684081,
0.2679017186,
0.1563830674,
0.4642437994,
0.0486471541,
-0.2285406888,
0.2022953629,
0.0982780159,
0.0707497299,
0.3525558412,
0.6207302213,
0.3232076168,
-0.0066830702,
0.076551713,
-0.1554657966,
0.1863849759,
-0.010710666,
-0.1345582306,
-0.1445503831,
0.5972313285,
0.1582024395,
0.0774130821,
-0.5391673446,
0.0336950421,
-0.5927144289,
-0.0046076663,
0.1251920164,
0.3890294731,
-0.1128291786,
-0.1896719038,
0.058973074,
-0.1500435472,
-0.0723700151,
0.2690996528,
0.2779954672,
-0.009762153,
-0.2528842092,
-0.2057821155,
0.2209813446,
0.0981389284,
-0.1787584275,
-0.1262217164,
-0.0857612044,
-0.1896122843,
-0.1297743022,
-0.1223466843,
0.1015100479,
0.0939776078,
-0.5398162603,
-0.3409733474,
-0.1714630574,
0.0252823792,
-0.0521861874,
-0.0143680498,
-0.0892977268,
-0.2821857631,
-0.4951389432,
0.0742340088,
-0.098318845,
-0.0900789201,
0.2250671536,
0.5810576677,
0.2559540272,
-0.1868476719,
0.4078856111,
0.1008468419,
-0.122165069,
-0.2026498169,
0.074939236,
-0.0977012217,
-0.1044103131,
0.2023272216,
0.2365568429,
-0.3165029585,
0.1566905826,
-0.3306830823,
0.8244404197,
0.0351233482,
-0.2451609075,
-0.1146776378,
0.0505556166,
-0.0988109335,
0.3092764318,
0.2174610943,
0.4579276145,
-0.1223987117,
0.0678030476,
-0.3379890621,
-0.0043135285,
0.4632380307,
-0.6978576779,
-0.2602536678,
0.3898676932,
0.4401411414,
0.3855003715,
-0.1703183949,
-0.6865503788,
-0.2105175108,
0.0655455887,
0.0370597355,
0.1663675457,
0.3353217542,
-0.0993319675,
0.1353846043,
0.0923242867,
-0.2173089683,
0.2460509539,
-0.0228159688,
0.0020169988,
-0.2244127989
] |
https://github.com/huggingface/datasets/issues/565 | No module named 'nlp.logging' | Actually, we can fix this on our side; this script didn't have to be updated. I'll do it in a few minutes | Hi, I am using nlp version 0.4.0. I am trying to use bleurt as an eval metric; however, the bleurt script imports nlp.logging, which creates the following error. What am I missing?
```
>>> import nlp
2020-09-02 13:47:09.210310: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> bleurt = nlp.load_metric("bleurt")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 443, in load_metric
metric_cls = import_main_class(module_path, dataset=False)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 61, in import_main_class
module = importlib.import_module(module_path)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/metrics/bleurt/43448cf2959ea81d3ae0e71c5c8ee31dc15eed9932f197f5f50673cbcecff2b5/bleurt.py", line 20, in <module>
from nlp.logging import get_logger
ModuleNotFoundError: No module named 'nlp.logging'
```
Just to show once again that I can't import the logging module:
```
>>> import nlp
2020-09-02 13:48:38.190621: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> nlp.__version__
'0.4.0'
>>> from nlp.logging import get_logger
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'nlp.logging'
``` | 22 | No module named 'nlp.logging'
Hi, I am using nlp version 0.4.0. I am trying to use bleurt as an eval metric; however, the bleurt script imports nlp.logging, which creates the following error. What am I missing?
```
>>> import nlp
2020-09-02 13:47:09.210310: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> bleurt = nlp.load_metric("bleurt")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 443, in load_metric
metric_cls = import_main_class(module_path, dataset=False)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/load.py", line 61, in import_main_class
module = importlib.import_module(module_path)
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/melody/anaconda3/envs/transformers/lib/python3.6/site-packages/nlp/metrics/bleurt/43448cf2959ea81d3ae0e71c5c8ee31dc15eed9932f197f5f50673cbcecff2b5/bleurt.py", line 20, in <module>
from nlp.logging import get_logger
ModuleNotFoundError: No module named 'nlp.logging'
```
Just to show once again that I can't import the logging module:
```
>>> import nlp
2020-09-02 13:48:38.190621: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1
>>> nlp.__version__
'0.4.0'
>>> from nlp.logging import get_logger
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'nlp.logging'
```
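A quick way to confirm what the traceback shows: the sketch below is only an illustrative check, assuming an `nlp` 0.4.0 install like the one reported here, and verifies that this version ships no `logging` submodule, which is why the bleurt script's `from nlp.logging import get_logger` fails.
```python
import importlib.util

import nlp

print(nlp.__version__)  # '0.4.0' in the report above
# find_spec returns None when the submodule does not exist in the installed package,
# which matches the ModuleNotFoundError raised by the bleurt metric script.
print(importlib.util.find_spec("nlp.logging") is None)
```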
Actually we can fix this on our side; this script didn't have to be updated. I'll do it in a few minutes | [
0.0177447423,
-0.3961700797,
-0.020808693,
-0.2511642575,
0.2170931995,
-0.0493597016,
0.2021474391,
0.3259569407,
0.0938875079,
-0.0865904689,
0.1572887897,
0.3769111633,
-0.6784248352,
0.1394218504,
0.266335994,
-0.12211252,
-0.1415551603,
0.1301273257,
0.4967285991,
-0.0840893388,
-0.1551875025,
0.2894214094,
-0.1251254231,
0.1515606046,
-0.2375950068,
0.1572917253,
0.1200009286,
-0.0619186759,
-0.0776275992,
-0.3836638331,
0.0709403008,
-0.3669740856,
0.3694148958,
0.0429326966,
-0.0001146313,
-0.2010704577,
0.3691684604,
-0.0789470449,
-0.3076004088,
-0.4952629209,
0.0295941681,
-0.5176200271,
0.2748708725,
-0.1576721817,
0.1493027359,
-0.2864572406,
0.236662358,
-0.3097524941,
0.1938051879,
0.4161175191,
0.2068959922,
-0.0221976228,
-0.0768315569,
0.0798193589,
0.0021626409,
-0.0918753371,
-0.0400300622,
0.3646021485,
0.168252632,
-0.2020519376,
-0.2094081938,
0.0075307786,
0.0123418346,
-0.0307615437,
0.4454036355,
0.0410988405,
0.4342163801,
0.074441053,
0.1132745743,
0.0331304595,
0.3260389566,
-0.2414315194,
0.212452963,
0.2793805003,
0.2856969833,
-0.6526635289,
0.0322942175,
0.3493676782,
-0.1642587781,
-0.197570473,
-0.3211483359,
-0.1605102122,
-0.0767731518,
0.2774336934,
-0.0267948285,
0.2369682938,
-0.1171351373,
-0.0183081776,
0.0479584374,
0.0457222573,
0.0512731485,
0.3074578941,
0.057945542,
0.2783441842,
-0.0295916498,
0.0939965397,
0.3766543567,
0.0809350461,
-0.3641400039,
0.0373017378,
-0.079293482,
-0.2028974295,
0.0826360881,
0.1005701423,
0.1280951351,
0.482413888,
0.580224812,
0.0861539841,
0.1211155951,
0.1061830968,
0.0224337727,
0.1806165278,
0.0178395286,
-0.2114957571,
-0.0291193798,
-0.0329338312,
-0.0712679997,
-0.0087653175,
-0.225730598,
-0.0556233525,
-0.226583153,
-0.0307782758,
0.1380365938,
0.3203542531,
-0.2769456506,
-0.3906171024,
0.5350921154,
0.1231341511,
-0.2232646346,
-0.0321768299,
-0.1262368709,
0.5046452284,
-0.4369258285,
-0.1370982528,
0.1434805095,
-0.079778634,
0.4327976704,
-0.1558859944,
-0.1006637812,
-0.2113870382,
-0.0947680995,
0.2198672444,
-0.0771774203,
-0.0099428501,
-0.0664816722,
-0.3026093841,
0.0548938662,
-0.166654557,
-0.1033994406,
-0.0218892321,
-0.1572465152,
-0.314260453,
-0.2244265676,
0.1509291828,
-0.339389205,
-0.2686005831,
0.2011346817,
0.0782152712,
-0.0370585546,
-0.382678926,
-0.1565063298,
-0.1386647373,
0.0589940399,
-0.0779407248,
0.1836319864,
-0.0987103879,
0.0726181418,
-0.3945175707,
-0.3643251956,
0.2603408694,
0.3422808647,
-0.1949789524,
-0.2013522238,
0.2731038928,
-0.102845259,
-0.1197199449,
0.625600636,
-0.2656615078,
0.2711015642,
-0.2094728947,
-0.0848326683,
-0.312979579,
-0.0258243419,
-0.0342768952,
-0.1933759153,
-0.0780977458,
0.3543559611,
0.2127126455,
-0.0401688404,
-0.2943751812,
-0.1962094754,
-0.1480364501,
0.127735272,
0.1170277148,
0.3017060161,
0.0429290421,
0.1312471479,
0.5985457897,
-0.0901136175,
-0.0685132295,
-0.0750641674,
0.1411293149,
0.1807709634,
-0.2709914446,
0.1386938095,
-0.1707384884,
0.11958538,
-0.1768054366,
-0.3409103751,
0.0959559605,
0.0197771769,
0.2479204535,
-0.1413605511,
-0.123960048,
0.1175826788,
-0.3142854571,
0.103029348,
0.0294076856,
-0.1288950741,
0.3554204106,
-0.3825531006,
-0.0465956666,
-0.2218794525,
-0.1060576364,
-0.1710864007,
-0.1487459242,
-0.0094743166,
-0.225369066,
0.1783795953,
0.6324030161,
-0.0850066245,
0.23469688,
-0.0037055556,
-0.1035135984,
-0.0135539304,
0.267334491,
0.0565779246,
-0.0577415079,
-0.04824505,
-0.0602157377,
0.1752787381,
0.0162846372,
-0.0962974057,
-0.0791668072,
0.4494023621,
0.0644366145,
0.2025587261,
0.0629724562,
0.1511247605,
0.2264125645,
-0.0125720873,
-0.251065582,
0.0872601941,
-0.1092126593,
0.8314300776,
-0.1710356772,
0.2515508533,
-0.2746801078,
0.0517181605,
-0.2004824579,
0.6626396179,
0.2042044252,
0.0820649564,
0.1802143455,
-0.2810434103,
-0.0692387447,
0.1406126469,
0.0014659464,
0.3870842457,
0.2310895771,
-0.0509548783,
0.2067660242,
-0.1034538448,
-0.1705688089,
0.1831773669,
-0.0443812646,
-0.2436571866,
0.0247677565,
0.1182229668,
-0.1303318143,
-0.148140654,
-0.1530949473,
-0.1233130097,
0.1996000409,
-0.0350606404,
0.1603681445,
-0.3734544814,
-0.1339824796,
-0.2300897092,
-0.1811048836,
-0.2294768691,
-0.2983359098,
0.0284396522,
0.0945010856,
0.0437804013,
0.0856080055,
0.4398252964,
0.2297271639,
-0.2515345216,
-0.164912641,
-0.0883318782,
-0.4656352103,
-0.468177855,
0.0709280074,
0.1150502115,
0.231340304,
0.2546291053,
-0.1552509218,
0.1433706135,
-0.0437560007,
-0.5664675236,
-0.0181738716,
0.0860511139,
0.0709255114,
0.3076510131,
-0.0665008277,
-0.1981247813,
-0.30417189,
0.1178149879,
-0.2248548418,
0.0074068233,
0.2838935554,
-0.1443242431,
-0.0391860604,
-0.3543542922,
-0.378294915,
-0.226476714,
-0.5012185574,
0.0852806643,
0.0689317361,
-0.0958224535,
0.4386115968,
0.0011936165,
0.546356082,
-0.0049779676,
0.2645167112,
0.0071173199,
0.3665677309,
0.1443116814,
-0.2059244961,
-0.3020400107,
0.1247389689,
-0.123569876,
0.3829788268,
0.0842033327,
-0.3999667168,
-0.3124958575,
0.3001258075,
-0.4379544258,
0.0871433392,
0.1368220896,
0.223835066,
0.0173717588,
-0.0351180173,
-0.1492093056,
-0.1678653061,
0.1437754333,
0.1370408535,
0.7387798429,
-0.0642650574,
0.2167179286,
0.0948873982,
0.3520078361,
0.1043618768,
-0.3068550229,
0.073493652,
0.044195503,
0.2859509289,
0.1100363731,
-0.2359178811,
0.1601349562,
0.0701999888,
-0.1433646679,
0.3142092824,
-0.0822947621,
-0.2094070315,
-0.3659451008,
-0.0670050532,
0.1722669005,
-0.0579463691,
0.022408111,
-0.0850384161,
0.2522851825,
-0.0257229023,
-0.1219103634,
-0.3413434029,
-0.13929528,
0.0958548933,
0.2691056132,
-0.0436147973,
0.1610606313,
0.2585426867,
-0.1735364795,
-0.4186550379,
0.3763267398,
-0.0102597885,
0.075320363,
-0.1960622221,
-0.0525870547,
0.092378974,
0.1480779946,
-0.44502455,
0.0221822802,
-0.0322812051,
0.1799475551,
0.2203017622,
-0.420676738,
0.0985986739,
-0.0531908534,
0.1326412559,
0.6327111125,
-0.0288598686,
-0.1421376765,
0.0016262159,
0.3982035816,
0.0126885809,
0.0350449979,
-0.2267850041,
-0.4677222371,
-0.3101117015,
0.0240110382,
-0.4039106667,
-0.1081058159,
0.1898397356,
0.4054774046,
0.2305119336,
0.1628720611,
-0.2111326903,
0.2652072012,
-0.1094020531,
-0.0186428092,
-0.2338204235,
-0.1241569817,
-0.1765096486,
0.0746416599,
-0.2352045774,
-0.0406515226,
-0.1481790841,
-0.4993693531,
0.0345392264,
0.0694437623,
0.3608239889,
-0.0557454377,
-0.1037611514,
0.116895318,
-0.2295796573,
0.0434440225,
0.1421993971,
0.3446508944,
0.2187674642,
0.2476874888,
0.1366507411,
-0.0933893844,
0.3914398849,
-0.0074758343,
0.2628163099,
0.6665164232,
-0.2765218914,
-0.2091831565,
0.0268201232,
0.0259914808,
0.6081100106,
0.398050487,
-0.1217547059,
0.5636967421,
-0.0332953557,
0.3709041178,
-0.2529838383,
0.3598001003,
-0.2083512396,
0.2281369716,
0.1237296239,
-0.2150112987,
0.1339335889,
-0.3755696416,
-0.0285156742,
0.118047297,
-0.1512494236,
-0.1804248691,
0.0978514031,
0.2777609229,
-0.0858836025,
0.0670809075,
-0.3402469456,
0.1049265414,
-0.5141493678,
0.3371419609,
-0.1234820187,
-0.1297786832,
0.0850419328,
-0.0571585074,
-0.0586515144,
0.2024146914,
0.1764391065,
0.2453874797,
0.1791049242,
0.0015669912,
0.1329323202,
0.3271037936,
0.4160194397,
-0.1411470473,
-0.1413771659,
0.1801224351,
-0.2844994664,
-0.095366925,
0.2117823362,
0.2493563294,
0.3500953913,
-0.3208933473,
0.0167742968,
0.0921133757,
0.0947501361,
-0.3090744615,
0.2534044683,
0.3062737286,
-0.0042510061,
0.3142987192,
0.0549890064,
0.4683717489,
0.0037167668,
0.2038834691,
0.1448802203,
0.1391283274,
-0.188737154,
0.1346671879,
-0.0655602217,
-0.241187647,
0.0385502391,
0.1038154066,
0.0686358809,
-0.0115709677,
0.3635195494,
-0.1714936197,
-0.1635910571,
-0.0536272079,
-0.516215682,
-0.0766496137,
-0.2516143024,
0.2087754309,
0.2589107156,
-0.1144478023,
0.3502961397,
0.3396167159,
0.0072799213,
-0.0035638027,
-0.1966382563,
-0.1979462206,
-0.3698655963,
0.3572336435,
-0.4561418593,
-0.082159698,
0.0321729481,
0.5907013416,
0.0789091438,
0.0997670144,
-0.2613566518,
-0.1251947582,
-0.1099204049,
0.1504794508,
0.1698432714,
-0.3847261071,
0.1303612143,
0.2244988084,
0.0167800188,
0.039007917,
-0.3336630166,
-0.1411159784,
-0.2641183436,
0.1118496358,
-0.0036450233,
0.2566158175,
-0.0333224833,
-0.2517703474,
-0.2034108639,
-0.018271707,
-0.1530154049,
0.1000023782,
-0.1967068464,
0.1052413732,
-0.3052289486,
0.1879134625,
0.2350841761,
-0.2528903484,
0.0142704733,
0.1697072387,
0.0677078068,
-0.0516889878,
0.1439800113,
0.1308485568,
-0.2021969855,
-0.140556097,
-0.051432088,
0.0384289138,
0.272610724,
0.0369228795,
0.0304566324,
-0.5750378966,
-0.2175091058,
0.2021582723,
-0.0762686729,
-0.0241155475,
0.1872988641,
0.0907592252,
-0.2569817603,
0.2025824487,
-0.0290614069,
0.3365929425,
-0.2204918563,
-0.0248665288,
0.1196048781,
0.0173411071,
0.3703122437,
0.0830853581,
-0.1299898326,
-0.6242675185,
0.1826904714,
-0.3209092617,
-0.2156855762,
0.0443556458,
0.1597317606,
0.2159295529,
0.1625346839,
-0.1061164141,
-0.1998497993,
0.2159614265,
-0.1669278294,
-0.0119242184,
0.0806013793,
-0.0823677033,
-0.1484624743,
0.2061640173,
0.0952179134,
0.166023016,
-0.1503379196,
0.5230818391,
-0.2694709599,
0.1774446219,
-0.1627532244,
0.2485533506,
-0.0211531706,
0.3340613842,
0.3533409834,
0.1310349703,
-0.0641064793,
-0.0746059939,
0.3371084929,
0.1211458147,
-0.2316766679,
0.2225828171,
-0.2420721054,
0.0484225601,
-0.4349016547,
0.2174108922,
0.1711486578,
-0.0655465424,
-0.0372692198,
-0.1865080297,
0.0532959253,
-0.0720684081,
0.2679017186,
0.1563830674,
0.4642437994,
0.0486471541,
-0.2285406888,
0.2022953629,
0.0982780159,
0.0707497299,
0.3525558412,
0.6207302213,
0.3232076168,
-0.0066830702,
0.076551713,
-0.1554657966,
0.1863849759,
-0.010710666,
-0.1345582306,
-0.1445503831,
0.5972313285,
0.1582024395,
0.0774130821,
-0.5391673446,
0.0336950421,
-0.5927144289,
-0.0046076663,
0.1251920164,
0.3890294731,
-0.1128291786,
-0.1896719038,
0.058973074,
-0.1500435472,
-0.0723700151,
0.2690996528,
0.2779954672,
-0.009762153,
-0.2528842092,
-0.2057821155,
0.2209813446,
0.0981389284,
-0.1787584275,
-0.1262217164,
-0.0857612044,
-0.1896122843,
-0.1297743022,
-0.1223466843,
0.1015100479,
0.0939776078,
-0.5398162603,
-0.3409733474,
-0.1714630574,
0.0252823792,
-0.0521861874,
-0.0143680498,
-0.0892977268,
-0.2821857631,
-0.4951389432,
0.0742340088,
-0.098318845,
-0.0900789201,
0.2250671536,
0.5810576677,
0.2559540272,
-0.1868476719,
0.4078856111,
0.1008468419,
-0.122165069,
-0.2026498169,
0.074939236,
-0.0977012217,
-0.1044103131,
0.2023272216,
0.2365568429,
-0.3165029585,
0.1566905826,
-0.3306830823,
0.8244404197,
0.0351233482,
-0.2451609075,
-0.1146776378,
0.0505556166,
-0.0988109335,
0.3092764318,
0.2174610943,
0.4579276145,
-0.1223987117,
0.0678030476,
-0.3379890621,
-0.0043135285,
0.4632380307,
-0.6978576779,
-0.2602536678,
0.3898676932,
0.4401411414,
0.3855003715,
-0.1703183949,
-0.6865503788,
-0.2105175108,
0.0655455887,
0.0370597355,
0.1663675457,
0.3353217542,
-0.0993319675,
0.1353846043,
0.0923242867,
-0.2173089683,
0.2460509539,
-0.0228159688,
0.0020169988,
-0.2244127989
] |
https://github.com/huggingface/datasets/issues/560 | Using custom DownloadConfig results in an error | From my limited understanding, part of the issue seems related to the `prepare_module` and `download_and_prepare` functions each handling the case where no config is passed. For example, `prepare_module` does mutate the object passed and forces the flags `extract_compressed_file` and `force_extract` to `True`.
See:
* https://github.com/huggingface/nlp/blob/5fb61e1012bda724a9b6b847307d90a1380abfa5/src/nlp/load.py#L227
* https://github.com/huggingface/nlp/blob/5fb61e1012bda724a9b6b847307d90a1380abfa5/src/nlp/builder.py#L388
Maybe a cleaner solution would be to always instantiate a default `DownloadConfig` object at the top-level, have it as non-optional for the lower-level functions and treat it as immutable. | ## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
| 76 | Using custom DownloadConfig results in an error
## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
From my limited understanding, part of the issue seems related to the `prepare_module` and `download_and_prepare` functions each handling the case where no config is passed. For example, `prepare_module` does mutate the object passed and forces the flags `extract_compressed_file` and `force_extract` to `True`.
See:
* https://github.com/huggingface/nlp/blob/5fb61e1012bda724a9b6b847307d90a1380abfa5/src/nlp/load.py#L227
* https://github.com/huggingface/nlp/blob/5fb61e1012bda724a9b6b847307d90a1380abfa5/src/nlp/builder.py#L388
Maybe a cleaner solution would be to always instantiate a default `DownloadConfig` object at the top-level, have it as non-optional for the lower-level functions and treat it as immutable. | [
-0.1150191873,
0.1608140916,
0.1123034507,
0.0204853825,
-0.076284565,
-0.0592143647,
0.287856251,
0.038644366,
0.3867914379,
-0.199264437,
-0.1571437418,
0.3636112809,
-0.2807904482,
-0.0396444052,
0.1373905391,
0.1345225573,
-0.3357851803,
0.1916719377,
0.2066786587,
-0.0992470607,
-0.4401439428,
0.4904035032,
-0.1363260299,
-0.1055810302,
-0.110463053,
0.296576947,
-0.0853593349,
0.6002389789,
-0.0052090101,
-0.3369758725,
0.5871986747,
0.0767925307,
0.2937195003,
0.0780295134,
-0.0001159093,
0.0705930069,
0.233681947,
-0.3101005554,
-0.3799628317,
-0.2622922659,
-0.0511954203,
-0.1243967116,
0.2814411819,
-0.1946799755,
-0.0140371546,
-0.0242790468,
0.0815582126,
0.0334612802,
0.0941411331,
0.2972892523,
0.1419871897,
-0.0229713544,
-0.2864385843,
-0.1697260588,
0.2262741178,
0.2600771785,
0.0480876788,
0.529278934,
0.3167857528,
-0.2207207382,
0.0189586282,
-0.3049542308,
-0.1514440179,
0.0622547455,
0.5781334639,
0.0206803083,
0.2644709647,
-0.0439109541,
-0.1765139103,
0.3859302402,
0.2035036087,
-0.2281823903,
-0.0819196925,
-0.3655665517,
0.1446021795,
-0.4327117801,
0.1780244559,
0.1168845296,
-0.2737576962,
0.0803427175,
-0.2586565316,
-0.1680172086,
-0.0763952136,
0.508751452,
0.2171330601,
0.3228864372,
-0.0183474869,
0.1714262664,
-0.0767273456,
-0.0576793477,
0.2524202466,
-0.3967368603,
0.0253676437,
0.1348519772,
-0.0958462358,
-0.0770933032,
0.1277973503,
0.0256881565,
-0.032763958,
0.1246376783,
0.0682526007,
-0.1068674028,
0.0644042641,
0.0538507961,
0.0521951392,
0.2863238752,
-0.076601021,
0.1167677566,
-0.2036598027,
0.166965574,
-0.0508588366,
-0.0176646411,
0.0861195251,
-0.0753239244,
-0.3973385692,
-0.1976340413,
0.1687505245,
0.0014936477,
-0.062957719,
-0.3396822214,
-0.0813493952,
0.1825858951,
0.3383554816,
0.2477038801,
-0.0689880326,
-0.2663638592,
0.1952616125,
0.2603036165,
-0.0893421248,
0.1586497575,
-0.0882690027,
-0.1036935896,
-0.1506858468,
-0.0623485371,
0.4825169444,
0.2577641904,
0.6017349958,
0.0311782584,
0.0021427795,
0.028184697,
-0.0155199431,
0.121614255,
0.0396912508,
0.1165007129,
0.0043118298,
0.0528847948,
0.0625880063,
-0.2535326779,
-0.2557505369,
-0.0347327106,
-0.1973153502,
-0.524072051,
-0.3210694194,
0.1228111237,
-0.3273596764,
-0.3346838951,
0.2478318512,
-0.0070183943,
0.0605553053,
-0.3129454851,
-0.0337527953,
0.0295482948,
-0.4143958092,
-0.2400684059,
-0.089732632,
0.3797774613,
0.0012104437,
-0.2380744517,
-0.4831040204,
-0.1488146484,
0.2068390548,
-0.0621500015,
-0.4448851645,
0.246032089,
-0.3705917299,
0.1259260625,
0.7066496015,
-0.593824327,
-0.4758626521,
0.7060874701,
-0.1530993283,
0.1541839838,
0.1148286909,
0.304777205,
0.3644038439,
-0.1938651949,
0.2743984461,
0.4499005675,
0.164540112,
-0.0376655795,
-0.1342466772,
-0.23661834,
-0.045294717,
0.069085665,
-0.2126619518,
0.3807031512,
0.0322404616,
0.2286941707,
0.2805711925,
0.0719812512,
-0.0035695881,
0.0699648708,
-0.0190698858,
0.2078529298,
-0.2160391062,
0.2683117986,
-0.5918144584,
0.3081949055,
-0.2088378668,
0.0111982739,
0.1865446866,
0.0156149939,
-0.3768416047,
-0.2749230266,
-0.1442953795,
-0.3913903832,
0.0948156938,
0.3689242899,
0.1382951587,
-0.1243662462,
-0.128129378,
0.3340530396,
-0.2191487551,
-0.0176076498,
-0.160417527,
-0.0221683644,
0.1522675604,
-0.1637908518,
0.0569931045,
0.1050849408,
0.2022018284,
0.171318993,
-0.0114864428,
0.4094794393,
-0.1540580094,
0.0669988245,
-0.0887555629,
-0.2120266557,
0.0864796489,
-0.0025835261,
0.2240314484,
0.0819460154,
0.1684666276,
-0.2278700173,
-0.0677172393,
0.1494402289,
0.0992673784,
0.3279950619,
0.0210087597,
0.1423207521,
0.1164701283,
-0.2840329409,
-0.0270047709,
-0.3489286304,
0.0761513859,
0.1613445729,
0.3522972465,
0.0295900796,
-0.2389680296,
-0.0452458039,
0.2232686281,
0.1981344521,
0.020454634,
0.1191776842,
-0.0028204955,
-0.0736614913,
0.029122645,
0.646499157,
0.8169702291,
0.1107555553,
-0.0627676994,
0.4516937435,
-0.271200031,
-0.2540046573,
0.2792920768,
-0.1479597688,
0.128902331,
0.1667046845,
0.0079042353,
0.0080788657,
-0.1439011097,
-0.4434363246,
-0.0547378883,
-0.0084888581,
-0.4380913377,
-0.0696293861,
-0.2960467339,
-0.4512907267,
-0.1279859245,
-0.2876826227,
-0.0485401712,
-0.2141085267,
-0.1592726707,
0.1131824553,
0.1047846079,
-0.0083433427,
-0.2631918788,
0.0197489858,
-0.2949642539,
-0.8628901839,
0.0819825381,
-0.0314405374,
-0.3450303674,
-0.0835405886,
0.0386167727,
0.1652457565,
0.0731320232,
-0.1557794511,
-0.2440003604,
0.1602343172,
-0.102919668,
0.1122969165,
0.1613500267,
0.3075021207,
0.3883804083,
-0.151153475,
0.2565557063,
-0.4360590875,
0.1637898684,
0.216210559,
-0.0127828941,
0.042474553,
0.0326361172,
-0.0875904411,
-0.3689766526,
-0.2115420103,
-0.164963752,
-0.4502527118,
-0.0885405242,
0.2849121392,
0.1961458921,
0.4581587315,
0.1651854366,
0.3243180513,
-0.164577499,
0.0822790712,
-0.0674817339,
-0.3117093146,
0.2791599035,
-0.47333318,
-0.078999877,
0.0859311223,
-0.2227194756,
0.2154388279,
0.1335568875,
-0.3884004951,
0.1996544152,
-0.1469651908,
0.0002487451,
-0.0470807627,
-0.0934200063,
0.2494484931,
-0.0822577029,
0.0479588732,
-0.2609183192,
-0.3588173389,
0.3365956545,
0.3105378151,
0.5375540853,
0.2759686112,
0.298624903,
-0.1452534199,
0.484687835,
0.1508263648,
-0.1560513377,
0.6103883386,
-0.1947127432,
0.460178256,
-0.0382341072,
-0.2387868762,
-0.0032228082,
0.3633524179,
-0.0172827691,
0.2315364033,
-0.061175216,
-0.3007302284,
-0.1902963668,
-0.1062507853,
-0.1545365453,
-0.2300316691,
-0.0597751439,
0.0295014493,
0.1523667872,
0.0646301508,
0.1584213674,
-0.0831473321,
-0.1170615405,
-0.1872001141,
0.4351291656,
-0.0597942546,
0.2172850668,
-0.0276044309,
0.2997870445,
-0.3493390977,
0.1422001272,
-0.2189184874,
0.3044459522,
-0.1182057634,
0.0613892823,
-0.0159720406,
0.1169418395,
0.2156412154,
-0.2203438282,
0.2306118459,
0.2625309825,
0.1650993079,
-0.6942800283,
-0.0943868831,
0.0507083386,
0.3705738485,
0.6355692148,
0.1778833866,
-0.1156035215,
-0.236060679,
0.1376714706,
-0.175421983,
-0.0925002024,
-0.4230397344,
-0.0664593875,
-0.3104560971,
-0.2801862955,
-0.3828446269,
-0.1292265207,
0.2305581719,
0.2301345766,
0.1135467812,
0.0101911668,
0.037024416,
0.0223108456,
-0.0750331581,
0.0257842913,
0.165501669,
0.2535585463,
-0.3498474061,
0.0041104993,
0.0184273291,
0.4556972086,
-0.1090065539,
-0.1824811697,
-0.0907647535,
0.0760888904,
0.1698763967,
0.2375927866,
-0.0469809063,
-0.278847754,
-0.0843370631,
-0.243196398,
0.0611785315,
0.1471871138,
0.2561113238,
-0.5553205013,
0.1812880933,
-0.3989014328,
0.5427383184,
-0.1028673053,
-0.0532118082,
0.1602560729,
-0.0507054217,
-0.3970302045,
0.3244518936,
-0.0401612632,
0.5586261153,
0.2404640019,
0.0567302071,
0.4123043418,
-0.4333569705,
0.340760529,
0.3820565343,
0.0827065483,
-0.2056448609,
0.2356446385,
0.1081555635,
-0.243430689,
0.1182639301,
0.4598893523,
-0.1859721243,
0.3259103298,
-0.155995518,
-0.4174779058,
0.0909638479,
0.437913388,
-0.2495400012,
-0.3017905354,
-0.317876339,
0.1256024539,
-0.0598925799,
0.3444619179,
-0.0450953618,
-0.313447684,
-0.3213608265,
-0.1474079192,
-0.2323344052,
0.1951861382,
-0.0521365814,
-0.1184979528,
0.1656087786,
0.0278663039,
0.1757667661,
0.3797241151,
-0.2905165851,
-0.0837100595,
-0.238989234,
0.083014667,
-0.1039217412,
-0.0695939809,
0.1519479156,
0.0730861127,
0.2014411688,
-0.1141102165,
-0.3429859281,
-0.1494406462,
-0.0445593409,
-0.201519087,
0.4617882073,
-0.0481094234,
0.0759339631,
-0.1242551059,
-0.492182821,
0.0628538653,
-0.0180092305,
0.1029132158,
0.0785280764,
0.0157989599,
-0.0287092943,
0.1853823364,
-0.1957310438,
-0.337682724,
-0.0281161536,
0.3746061921,
-0.3746852577,
0.290784657,
0.4278506935,
-0.1933621764,
0.1264045238,
-0.0584544875,
0.0083093792,
-0.5848217607,
-0.21616292,
0.103173539,
0.2297324985,
0.1647561044,
0.1329748482,
0.2493012697,
-0.0034954199,
0.180294469,
0.2565452754,
-0.3308794498,
-0.1663015783,
-0.0038664863,
-0.3690318465,
0.1466335207,
-0.1980223954,
0.2377352417,
0.3796905279,
-0.2016597688,
-0.2258140743,
0.1875278652,
-0.1729786396,
0.0496795438,
0.1979012191,
0.1081394702,
0.0217498988,
0.1827964783,
0.0240135714,
-0.0625099391,
-0.2874265909,
-0.133997336,
-0.3909823596,
0.1758334637,
0.226489827,
-0.1660331786,
-0.0547828153,
-0.2572703958,
-0.2181169391,
-0.0314531699,
-0.0373602659,
0.1865928918,
-0.0764184892,
0.3890279531,
-0.1978254914,
0.243920207,
-0.0793267787,
0.1648310423,
-0.1698616743,
0.3258366883,
0.0379705243,
0.1343825459,
-0.1742456257,
0.0088646784,
-0.1068821251,
-0.23138991,
-0.2788546383,
0.0202060938,
0.2709775269,
-0.0176595822,
0.2543971539,
0.2019493133,
-0.041505184,
0.3494022489,
-0.2206251472,
-0.126997605,
0.2625514865,
0.1083604246,
-0.1582948565,
0.222232461,
0.3854727745,
0.3658418059,
0.0349598154,
0.2504578829,
0.160232693,
-0.3160719275,
-0.0662196726,
-0.0450420454,
0.2565683424,
0.0481730364,
-0.007577572,
0.3368712068,
-0.2633073926,
0.200919345,
0.0543518141,
0.281752646,
0.2158437073,
0.4725826383,
-0.0022781901,
-0.1878002435,
-0.2764520943,
0.0665214807,
0.1926875412,
-0.5811085701,
0.1973800957,
-0.0308280066,
0.1413286328,
0.2872925699,
-0.1325432956,
0.0486671887,
-0.1231481582,
0.4628401101,
-0.0270577706,
0.2395099849,
0.0963632166,
-0.1682440042,
-0.1834776998,
0.0434651226,
-0.0890514627,
0.0064130276,
-0.169731617,
-0.2001912296,
-0.0572065897,
0.0989689454,
-0.2617081702,
-0.2644015551,
0.2126556039,
-0.0461101234,
-0.0920266286,
-0.2886495292,
0.1431360692,
-0.0127506964,
0.1704822779,
0.3351089358,
0.260812819,
0.2379632592,
0.5772153139,
-0.0407348014,
-0.1141021252,
0.1889009178,
0.0600244105,
-0.2130744606,
0.0773512274,
0.3676769137,
0.350633204,
0.3967398107,
0.0209642015,
-0.088220939,
-0.0021248413,
0.2406387329,
0.2035894692,
-0.1885024309,
0.7938402295,
0.0524094552,
0.1809537411,
-0.1163357049,
0.3631627262,
-0.4275178909,
-0.2053982019,
0.2878558934,
0.2680915594,
0.3093083799,
-0.0219094101,
0.019961793,
-0.186156258,
0.5015253425,
0.0297628418,
-0.1064066738,
-0.1711031944,
-0.0851539075,
-0.5885355473,
-0.1091987193,
-0.0785848945,
-0.3384309113,
0.0276817009,
0.444884479,
-0.2722951472,
0.0116580054,
-0.166905269,
0.1294877976,
0.0158836916,
0.1184921414,
0.091453433,
-0.1762313098,
-0.2772182822,
0.0589550808,
0.2688052058,
-0.3439936042,
-0.259808749,
-0.4583738744,
0.0284469128,
-0.0672116876,
-0.1894517392,
0.2401182503,
0.2790065706,
0.4852347672,
-0.043664895,
0.3151144981,
-0.0392435603,
-0.2778839767,
-0.3964206874,
0.1328743845,
-0.1942774653,
-0.0794956833,
-0.0010468708,
0.1259424835,
-0.32741189,
0.3908375204,
-0.290415585,
0.1902526021,
0.054690063,
-0.1452249736,
-0.0034683384,
0.1812205762,
-0.3123650849,
0.1539277732,
0.2389641851,
0.3021644056,
-0.1769242585,
0.2898985744,
-0.2135693431,
-0.2442595512,
0.5382839441,
-0.1240978539,
-0.0283330306,
0.1594702005,
0.2759436071,
-0.1947039664,
-0.1216938496,
-0.2669954896,
0.0924192965,
0.4286953509,
-0.2330364436,
0.1069558412,
-0.0202894509,
-0.2478489131,
0.0288677812,
0.0702999532,
0.0690405443,
0.1191773713,
-0.1778611839,
-0.2479634583,
-0.2421287
] |
https://github.com/huggingface/datasets/issues/560 | Using custom DownloadConfig results in an error | Thanks for the report, I'll take a look.
What is your specific use-case for providing a DownloadConfig object?
| ## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
| 18 | Using custom DownloadConfig results in an error
## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
Thanks for the report, I'll take a look.
What is your specific use-case for providing a DownloadConfig object?
| [
-0.1150191873,
0.1608140916,
0.1123034507,
0.0204853825,
-0.076284565,
-0.0592143647,
0.287856251,
0.038644366,
0.3867914379,
-0.199264437,
-0.1571437418,
0.3636112809,
-0.2807904482,
-0.0396444052,
0.1373905391,
0.1345225573,
-0.3357851803,
0.1916719377,
0.2066786587,
-0.0992470607,
-0.4401439428,
0.4904035032,
-0.1363260299,
-0.1055810302,
-0.110463053,
0.296576947,
-0.0853593349,
0.6002389789,
-0.0052090101,
-0.3369758725,
0.5871986747,
0.0767925307,
0.2937195003,
0.0780295134,
-0.0001159093,
0.0705930069,
0.233681947,
-0.3101005554,
-0.3799628317,
-0.2622922659,
-0.0511954203,
-0.1243967116,
0.2814411819,
-0.1946799755,
-0.0140371546,
-0.0242790468,
0.0815582126,
0.0334612802,
0.0941411331,
0.2972892523,
0.1419871897,
-0.0229713544,
-0.2864385843,
-0.1697260588,
0.2262741178,
0.2600771785,
0.0480876788,
0.529278934,
0.3167857528,
-0.2207207382,
0.0189586282,
-0.3049542308,
-0.1514440179,
0.0622547455,
0.5781334639,
0.0206803083,
0.2644709647,
-0.0439109541,
-0.1765139103,
0.3859302402,
0.2035036087,
-0.2281823903,
-0.0819196925,
-0.3655665517,
0.1446021795,
-0.4327117801,
0.1780244559,
0.1168845296,
-0.2737576962,
0.0803427175,
-0.2586565316,
-0.1680172086,
-0.0763952136,
0.508751452,
0.2171330601,
0.3228864372,
-0.0183474869,
0.1714262664,
-0.0767273456,
-0.0576793477,
0.2524202466,
-0.3967368603,
0.0253676437,
0.1348519772,
-0.0958462358,
-0.0770933032,
0.1277973503,
0.0256881565,
-0.032763958,
0.1246376783,
0.0682526007,
-0.1068674028,
0.0644042641,
0.0538507961,
0.0521951392,
0.2863238752,
-0.076601021,
0.1167677566,
-0.2036598027,
0.166965574,
-0.0508588366,
-0.0176646411,
0.0861195251,
-0.0753239244,
-0.3973385692,
-0.1976340413,
0.1687505245,
0.0014936477,
-0.062957719,
-0.3396822214,
-0.0813493952,
0.1825858951,
0.3383554816,
0.2477038801,
-0.0689880326,
-0.2663638592,
0.1952616125,
0.2603036165,
-0.0893421248,
0.1586497575,
-0.0882690027,
-0.1036935896,
-0.1506858468,
-0.0623485371,
0.4825169444,
0.2577641904,
0.6017349958,
0.0311782584,
0.0021427795,
0.028184697,
-0.0155199431,
0.121614255,
0.0396912508,
0.1165007129,
0.0043118298,
0.0528847948,
0.0625880063,
-0.2535326779,
-0.2557505369,
-0.0347327106,
-0.1973153502,
-0.524072051,
-0.3210694194,
0.1228111237,
-0.3273596764,
-0.3346838951,
0.2478318512,
-0.0070183943,
0.0605553053,
-0.3129454851,
-0.0337527953,
0.0295482948,
-0.4143958092,
-0.2400684059,
-0.089732632,
0.3797774613,
0.0012104437,
-0.2380744517,
-0.4831040204,
-0.1488146484,
0.2068390548,
-0.0621500015,
-0.4448851645,
0.246032089,
-0.3705917299,
0.1259260625,
0.7066496015,
-0.593824327,
-0.4758626521,
0.7060874701,
-0.1530993283,
0.1541839838,
0.1148286909,
0.304777205,
0.3644038439,
-0.1938651949,
0.2743984461,
0.4499005675,
0.164540112,
-0.0376655795,
-0.1342466772,
-0.23661834,
-0.045294717,
0.069085665,
-0.2126619518,
0.3807031512,
0.0322404616,
0.2286941707,
0.2805711925,
0.0719812512,
-0.0035695881,
0.0699648708,
-0.0190698858,
0.2078529298,
-0.2160391062,
0.2683117986,
-0.5918144584,
0.3081949055,
-0.2088378668,
0.0111982739,
0.1865446866,
0.0156149939,
-0.3768416047,
-0.2749230266,
-0.1442953795,
-0.3913903832,
0.0948156938,
0.3689242899,
0.1382951587,
-0.1243662462,
-0.128129378,
0.3340530396,
-0.2191487551,
-0.0176076498,
-0.160417527,
-0.0221683644,
0.1522675604,
-0.1637908518,
0.0569931045,
0.1050849408,
0.2022018284,
0.171318993,
-0.0114864428,
0.4094794393,
-0.1540580094,
0.0669988245,
-0.0887555629,
-0.2120266557,
0.0864796489,
-0.0025835261,
0.2240314484,
0.0819460154,
0.1684666276,
-0.2278700173,
-0.0677172393,
0.1494402289,
0.0992673784,
0.3279950619,
0.0210087597,
0.1423207521,
0.1164701283,
-0.2840329409,
-0.0270047709,
-0.3489286304,
0.0761513859,
0.1613445729,
0.3522972465,
0.0295900796,
-0.2389680296,
-0.0452458039,
0.2232686281,
0.1981344521,
0.020454634,
0.1191776842,
-0.0028204955,
-0.0736614913,
0.029122645,
0.646499157,
0.8169702291,
0.1107555553,
-0.0627676994,
0.4516937435,
-0.271200031,
-0.2540046573,
0.2792920768,
-0.1479597688,
0.128902331,
0.1667046845,
0.0079042353,
0.0080788657,
-0.1439011097,
-0.4434363246,
-0.0547378883,
-0.0084888581,
-0.4380913377,
-0.0696293861,
-0.2960467339,
-0.4512907267,
-0.1279859245,
-0.2876826227,
-0.0485401712,
-0.2141085267,
-0.1592726707,
0.1131824553,
0.1047846079,
-0.0083433427,
-0.2631918788,
0.0197489858,
-0.2949642539,
-0.8628901839,
0.0819825381,
-0.0314405374,
-0.3450303674,
-0.0835405886,
0.0386167727,
0.1652457565,
0.0731320232,
-0.1557794511,
-0.2440003604,
0.1602343172,
-0.102919668,
0.1122969165,
0.1613500267,
0.3075021207,
0.3883804083,
-0.151153475,
0.2565557063,
-0.4360590875,
0.1637898684,
0.216210559,
-0.0127828941,
0.042474553,
0.0326361172,
-0.0875904411,
-0.3689766526,
-0.2115420103,
-0.164963752,
-0.4502527118,
-0.0885405242,
0.2849121392,
0.1961458921,
0.4581587315,
0.1651854366,
0.3243180513,
-0.164577499,
0.0822790712,
-0.0674817339,
-0.3117093146,
0.2791599035,
-0.47333318,
-0.078999877,
0.0859311223,
-0.2227194756,
0.2154388279,
0.1335568875,
-0.3884004951,
0.1996544152,
-0.1469651908,
0.0002487451,
-0.0470807627,
-0.0934200063,
0.2494484931,
-0.0822577029,
0.0479588732,
-0.2609183192,
-0.3588173389,
0.3365956545,
0.3105378151,
0.5375540853,
0.2759686112,
0.298624903,
-0.1452534199,
0.484687835,
0.1508263648,
-0.1560513377,
0.6103883386,
-0.1947127432,
0.460178256,
-0.0382341072,
-0.2387868762,
-0.0032228082,
0.3633524179,
-0.0172827691,
0.2315364033,
-0.061175216,
-0.3007302284,
-0.1902963668,
-0.1062507853,
-0.1545365453,
-0.2300316691,
-0.0597751439,
0.0295014493,
0.1523667872,
0.0646301508,
0.1584213674,
-0.0831473321,
-0.1170615405,
-0.1872001141,
0.4351291656,
-0.0597942546,
0.2172850668,
-0.0276044309,
0.2997870445,
-0.3493390977,
0.1422001272,
-0.2189184874,
0.3044459522,
-0.1182057634,
0.0613892823,
-0.0159720406,
0.1169418395,
0.2156412154,
-0.2203438282,
0.2306118459,
0.2625309825,
0.1650993079,
-0.6942800283,
-0.0943868831,
0.0507083386,
0.3705738485,
0.6355692148,
0.1778833866,
-0.1156035215,
-0.236060679,
0.1376714706,
-0.175421983,
-0.0925002024,
-0.4230397344,
-0.0664593875,
-0.3104560971,
-0.2801862955,
-0.3828446269,
-0.1292265207,
0.2305581719,
0.2301345766,
0.1135467812,
0.0101911668,
0.037024416,
0.0223108456,
-0.0750331581,
0.0257842913,
0.165501669,
0.2535585463,
-0.3498474061,
0.0041104993,
0.0184273291,
0.4556972086,
-0.1090065539,
-0.1824811697,
-0.0907647535,
0.0760888904,
0.1698763967,
0.2375927866,
-0.0469809063,
-0.278847754,
-0.0843370631,
-0.243196398,
0.0611785315,
0.1471871138,
0.2561113238,
-0.5553205013,
0.1812880933,
-0.3989014328,
0.5427383184,
-0.1028673053,
-0.0532118082,
0.1602560729,
-0.0507054217,
-0.3970302045,
0.3244518936,
-0.0401612632,
0.5586261153,
0.2404640019,
0.0567302071,
0.4123043418,
-0.4333569705,
0.340760529,
0.3820565343,
0.0827065483,
-0.2056448609,
0.2356446385,
0.1081555635,
-0.243430689,
0.1182639301,
0.4598893523,
-0.1859721243,
0.3259103298,
-0.155995518,
-0.4174779058,
0.0909638479,
0.437913388,
-0.2495400012,
-0.3017905354,
-0.317876339,
0.1256024539,
-0.0598925799,
0.3444619179,
-0.0450953618,
-0.313447684,
-0.3213608265,
-0.1474079192,
-0.2323344052,
0.1951861382,
-0.0521365814,
-0.1184979528,
0.1656087786,
0.0278663039,
0.1757667661,
0.3797241151,
-0.2905165851,
-0.0837100595,
-0.238989234,
0.083014667,
-0.1039217412,
-0.0695939809,
0.1519479156,
0.0730861127,
0.2014411688,
-0.1141102165,
-0.3429859281,
-0.1494406462,
-0.0445593409,
-0.201519087,
0.4617882073,
-0.0481094234,
0.0759339631,
-0.1242551059,
-0.492182821,
0.0628538653,
-0.0180092305,
0.1029132158,
0.0785280764,
0.0157989599,
-0.0287092943,
0.1853823364,
-0.1957310438,
-0.337682724,
-0.0281161536,
0.3746061921,
-0.3746852577,
0.290784657,
0.4278506935,
-0.1933621764,
0.1264045238,
-0.0584544875,
0.0083093792,
-0.5848217607,
-0.21616292,
0.103173539,
0.2297324985,
0.1647561044,
0.1329748482,
0.2493012697,
-0.0034954199,
0.180294469,
0.2565452754,
-0.3308794498,
-0.1663015783,
-0.0038664863,
-0.3690318465,
0.1466335207,
-0.1980223954,
0.2377352417,
0.3796905279,
-0.2016597688,
-0.2258140743,
0.1875278652,
-0.1729786396,
0.0496795438,
0.1979012191,
0.1081394702,
0.0217498988,
0.1827964783,
0.0240135714,
-0.0625099391,
-0.2874265909,
-0.133997336,
-0.3909823596,
0.1758334637,
0.226489827,
-0.1660331786,
-0.0547828153,
-0.2572703958,
-0.2181169391,
-0.0314531699,
-0.0373602659,
0.1865928918,
-0.0764184892,
0.3890279531,
-0.1978254914,
0.243920207,
-0.0793267787,
0.1648310423,
-0.1698616743,
0.3258366883,
0.0379705243,
0.1343825459,
-0.1742456257,
0.0088646784,
-0.1068821251,
-0.23138991,
-0.2788546383,
0.0202060938,
0.2709775269,
-0.0176595822,
0.2543971539,
0.2019493133,
-0.041505184,
0.3494022489,
-0.2206251472,
-0.126997605,
0.2625514865,
0.1083604246,
-0.1582948565,
0.222232461,
0.3854727745,
0.3658418059,
0.0349598154,
0.2504578829,
0.160232693,
-0.3160719275,
-0.0662196726,
-0.0450420454,
0.2565683424,
0.0481730364,
-0.007577572,
0.3368712068,
-0.2633073926,
0.200919345,
0.0543518141,
0.281752646,
0.2158437073,
0.4725826383,
-0.0022781901,
-0.1878002435,
-0.2764520943,
0.0665214807,
0.1926875412,
-0.5811085701,
0.1973800957,
-0.0308280066,
0.1413286328,
0.2872925699,
-0.1325432956,
0.0486671887,
-0.1231481582,
0.4628401101,
-0.0270577706,
0.2395099849,
0.0963632166,
-0.1682440042,
-0.1834776998,
0.0434651226,
-0.0890514627,
0.0064130276,
-0.169731617,
-0.2001912296,
-0.0572065897,
0.0989689454,
-0.2617081702,
-0.2644015551,
0.2126556039,
-0.0461101234,
-0.0920266286,
-0.2886495292,
0.1431360692,
-0.0127506964,
0.1704822779,
0.3351089358,
0.260812819,
0.2379632592,
0.5772153139,
-0.0407348014,
-0.1141021252,
0.1889009178,
0.0600244105,
-0.2130744606,
0.0773512274,
0.3676769137,
0.350633204,
0.3967398107,
0.0209642015,
-0.088220939,
-0.0021248413,
0.2406387329,
0.2035894692,
-0.1885024309,
0.7938402295,
0.0524094552,
0.1809537411,
-0.1163357049,
0.3631627262,
-0.4275178909,
-0.2053982019,
0.2878558934,
0.2680915594,
0.3093083799,
-0.0219094101,
0.019961793,
-0.186156258,
0.5015253425,
0.0297628418,
-0.1064066738,
-0.1711031944,
-0.0851539075,
-0.5885355473,
-0.1091987193,
-0.0785848945,
-0.3384309113,
0.0276817009,
0.444884479,
-0.2722951472,
0.0116580054,
-0.166905269,
0.1294877976,
0.0158836916,
0.1184921414,
0.091453433,
-0.1762313098,
-0.2772182822,
0.0589550808,
0.2688052058,
-0.3439936042,
-0.259808749,
-0.4583738744,
0.0284469128,
-0.0672116876,
-0.1894517392,
0.2401182503,
0.2790065706,
0.4852347672,
-0.043664895,
0.3151144981,
-0.0392435603,
-0.2778839767,
-0.3964206874,
0.1328743845,
-0.1942774653,
-0.0794956833,
-0.0010468708,
0.1259424835,
-0.32741189,
0.3908375204,
-0.290415585,
0.1902526021,
0.054690063,
-0.1452249736,
-0.0034683384,
0.1812205762,
-0.3123650849,
0.1539277732,
0.2389641851,
0.3021644056,
-0.1769242585,
0.2898985744,
-0.2135693431,
-0.2442595512,
0.5382839441,
-0.1240978539,
-0.0283330306,
0.1594702005,
0.2759436071,
-0.1947039664,
-0.1216938496,
-0.2669954896,
0.0924192965,
0.4286953509,
-0.2330364436,
0.1069558412,
-0.0202894509,
-0.2478489131,
0.0288677812,
0.0702999532,
0.0690405443,
0.1191773713,
-0.1778611839,
-0.2479634583,
-0.2421287
] |
https://github.com/huggingface/datasets/issues/560 | Using custom DownloadConfig results in an error | Thanks. Our use case involves running a training job behind a corporate firewall with no access to any external resources (S3, GCP or other web resources).
I was thinking about a 2-step process:
1) Download the resources / artifacts using some secure corporate channel, i.e. run `nlp.load_dataset()` without a specific `DownloadConfig`. After that, collect the files from the `$HF_HOME` folder.
2) Copy the `$HF_HOME` folder into the firewalled environment. Run `nlp.load_dataset()` with a custom config `DownloadConfig(local_files_only=True)`.
However, this ends up a bit clunky in practice, even when solving the `DownloadConfig` issue above. For example, the `filename` hash computed in `get_from_cache()` differs in the `local_files_only=False` vs `local_files_only=True` case (the local case defaults `etag` to `None`, which results in a different hash). So effectively step 2) above doesn't work, because the computed hash differs from the hash in the cache folder. Some hacks / workarounds are possible, but this solution becomes very convoluted.
https://github.com/huggingface/nlp/blob/c214aa5a4430c1df1bcd0619fd94d6abdf9d2da7/src/nlp/utils/file_utils.py#L417
Would you recommend a different path?
| ## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
| 157 | Using custom DownloadConfig results in an error
## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
Thanks. Our use case involves running a training job behind a corporate firewall with no access to any external resources (S3, GCP or other web resources).
I was thinking about a 2-step process:
1) Download the resources / artifacts using some secure corporate channel, i.e. run `nlp.load_dataset()` without a specific `DownloadConfig`. After that, collect the files from the `$HF_HOME` folder.
2) Copy the `$HF_HOME` folder into the firewalled environment. Run `nlp.load_dataset()` with a custom config `DownloadConfig(local_files_only=True)`.
However, this ends up a bit clunky in practice, even when solving the `DownloadConfig` issue above. For example, the `filename` hash computed in `get_from_cache()` differs in the `local_files_only=False` vs `local_files_only=True` case (the local case defaults `etag` to `None`, which results in a different hash). So effectively step 2) above doesn't work, because the computed hash differs from the hash in the cache folder. Some hacks / workarounds are possible, but this solution becomes very convoluted.
https://github.com/huggingface/nlp/blob/c214aa5a4430c1df1bcd0619fd94d6abdf9d2da7/src/nlp/utils/file_utils.py#L417
Would you recommend a different path?
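For concreteness, the snippet below is a minimal sketch of the two-step workflow described above, assembled from the examples in this issue; the `/data/hf-cache/` path is hypothetical, and, as explained in the comment, step 2 does not currently work because of the mutated `DownloadConfig` and the `etag`-dependent cache hash.
```python
import os

# Step 1 -- run on a machine with internet access to populate the cache.
os.environ["HF_HOME"] = "/data/hf-cache/"  # hypothetical shared cache folder
import nlp

nlp.load_dataset(path="imdb")  # downloads and prepares the dataset under $HF_HOME

# Step 2 -- run inside the firewalled environment after copying /data/hf-cache/ over.
# This is the intended offline call; as noted above, it still fails today because the
# locally computed filename hash (etag=None) differs from the one in the copied cache.
from nlp.utils import DownloadConfig

offline_config = DownloadConfig(local_files_only=True)
imdb = nlp.load_dataset(path="imdb", download_config=offline_config)
```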
| [
-0.1150191873,
0.1608140916,
0.1123034507,
0.0204853825,
-0.076284565,
-0.0592143647,
0.287856251,
0.038644366,
0.3867914379,
-0.199264437,
-0.1571437418,
0.3636112809,
-0.2807904482,
-0.0396444052,
0.1373905391,
0.1345225573,
-0.3357851803,
0.1916719377,
0.2066786587,
-0.0992470607,
-0.4401439428,
0.4904035032,
-0.1363260299,
-0.1055810302,
-0.110463053,
0.296576947,
-0.0853593349,
0.6002389789,
-0.0052090101,
-0.3369758725,
0.5871986747,
0.0767925307,
0.2937195003,
0.0780295134,
-0.0001159093,
0.0705930069,
0.233681947,
-0.3101005554,
-0.3799628317,
-0.2622922659,
-0.0511954203,
-0.1243967116,
0.2814411819,
-0.1946799755,
-0.0140371546,
-0.0242790468,
0.0815582126,
0.0334612802,
0.0941411331,
0.2972892523,
0.1419871897,
-0.0229713544,
-0.2864385843,
-0.1697260588,
0.2262741178,
0.2600771785,
0.0480876788,
0.529278934,
0.3167857528,
-0.2207207382,
0.0189586282,
-0.3049542308,
-0.1514440179,
0.0622547455,
0.5781334639,
0.0206803083,
0.2644709647,
-0.0439109541,
-0.1765139103,
0.3859302402,
0.2035036087,
-0.2281823903,
-0.0819196925,
-0.3655665517,
0.1446021795,
-0.4327117801,
0.1780244559,
0.1168845296,
-0.2737576962,
0.0803427175,
-0.2586565316,
-0.1680172086,
-0.0763952136,
0.508751452,
0.2171330601,
0.3228864372,
-0.0183474869,
0.1714262664,
-0.0767273456,
-0.0576793477,
0.2524202466,
-0.3967368603,
0.0253676437,
0.1348519772,
-0.0958462358,
-0.0770933032,
0.1277973503,
0.0256881565,
-0.032763958,
0.1246376783,
0.0682526007,
-0.1068674028,
0.0644042641,
0.0538507961,
0.0521951392,
0.2863238752,
-0.076601021,
0.1167677566,
-0.2036598027,
0.166965574,
-0.0508588366,
-0.0176646411,
0.0861195251,
-0.0753239244,
-0.3973385692,
-0.1976340413,
0.1687505245,
0.0014936477,
-0.062957719,
-0.3396822214,
-0.0813493952,
0.1825858951,
0.3383554816,
0.2477038801,
-0.0689880326,
-0.2663638592,
0.1952616125,
0.2603036165,
-0.0893421248,
0.1586497575,
-0.0882690027,
-0.1036935896,
-0.1506858468,
-0.0623485371,
0.4825169444,
0.2577641904,
0.6017349958,
0.0311782584,
0.0021427795,
0.028184697,
-0.0155199431,
0.121614255,
0.0396912508,
0.1165007129,
0.0043118298,
0.0528847948,
0.0625880063,
-0.2535326779,
-0.2557505369,
-0.0347327106,
-0.1973153502,
-0.524072051,
-0.3210694194,
0.1228111237,
-0.3273596764,
-0.3346838951,
0.2478318512,
-0.0070183943,
0.0605553053,
-0.3129454851,
-0.0337527953,
0.0295482948,
-0.4143958092,
-0.2400684059,
-0.089732632,
0.3797774613,
0.0012104437,
-0.2380744517,
-0.4831040204,
-0.1488146484,
0.2068390548,
-0.0621500015,
-0.4448851645,
0.246032089,
-0.3705917299,
0.1259260625,
0.7066496015,
-0.593824327,
-0.4758626521,
0.7060874701,
-0.1530993283,
0.1541839838,
0.1148286909,
0.304777205,
0.3644038439,
-0.1938651949,
0.2743984461,
0.4499005675,
0.164540112,
-0.0376655795,
-0.1342466772,
-0.23661834,
-0.045294717,
0.069085665,
-0.2126619518,
0.3807031512,
0.0322404616,
0.2286941707,
0.2805711925,
0.0719812512,
-0.0035695881,
0.0699648708,
-0.0190698858,
0.2078529298,
-0.2160391062,
0.2683117986,
-0.5918144584,
0.3081949055,
-0.2088378668,
0.0111982739,
0.1865446866,
0.0156149939,
-0.3768416047,
-0.2749230266,
-0.1442953795,
-0.3913903832,
0.0948156938,
0.3689242899,
0.1382951587,
-0.1243662462,
-0.128129378,
0.3340530396,
-0.2191487551,
-0.0176076498,
-0.160417527,
-0.0221683644,
0.1522675604,
-0.1637908518,
0.0569931045,
0.1050849408,
0.2022018284,
0.171318993,
-0.0114864428,
0.4094794393,
-0.1540580094,
0.0669988245,
-0.0887555629,
-0.2120266557,
0.0864796489,
-0.0025835261,
0.2240314484,
0.0819460154,
0.1684666276,
-0.2278700173,
-0.0677172393,
0.1494402289,
0.0992673784,
0.3279950619,
0.0210087597,
0.1423207521,
0.1164701283,
-0.2840329409,
-0.0270047709,
-0.3489286304,
0.0761513859,
0.1613445729,
0.3522972465,
0.0295900796,
-0.2389680296,
-0.0452458039,
0.2232686281,
0.1981344521,
0.020454634,
0.1191776842,
-0.0028204955,
-0.0736614913,
0.029122645,
0.646499157,
0.8169702291,
0.1107555553,
-0.0627676994,
0.4516937435,
-0.271200031,
-0.2540046573,
0.2792920768,
-0.1479597688,
0.128902331,
0.1667046845,
0.0079042353,
0.0080788657,
-0.1439011097,
-0.4434363246,
-0.0547378883,
-0.0084888581,
-0.4380913377,
-0.0696293861,
-0.2960467339,
-0.4512907267,
-0.1279859245,
-0.2876826227,
-0.0485401712,
-0.2141085267,
-0.1592726707,
0.1131824553,
0.1047846079,
-0.0083433427,
-0.2631918788,
0.0197489858,
-0.2949642539,
-0.8628901839,
0.0819825381,
-0.0314405374,
-0.3450303674,
-0.0835405886,
0.0386167727,
0.1652457565,
0.0731320232,
-0.1557794511,
-0.2440003604,
0.1602343172,
-0.102919668,
0.1122969165,
0.1613500267,
0.3075021207,
0.3883804083,
-0.151153475,
0.2565557063,
-0.4360590875,
0.1637898684,
0.216210559,
-0.0127828941,
0.042474553,
0.0326361172,
-0.0875904411,
-0.3689766526,
-0.2115420103,
-0.164963752,
-0.4502527118,
-0.0885405242,
0.2849121392,
0.1961458921,
0.4581587315,
0.1651854366,
0.3243180513,
-0.164577499,
0.0822790712,
-0.0674817339,
-0.3117093146,
0.2791599035,
-0.47333318,
-0.078999877,
0.0859311223,
-0.2227194756,
0.2154388279,
0.1335568875,
-0.3884004951,
0.1996544152,
-0.1469651908,
0.0002487451,
-0.0470807627,
-0.0934200063,
0.2494484931,
-0.0822577029,
0.0479588732,
-0.2609183192,
-0.3588173389,
0.3365956545,
0.3105378151,
0.5375540853,
0.2759686112,
0.298624903,
-0.1452534199,
0.484687835,
0.1508263648,
-0.1560513377,
0.6103883386,
-0.1947127432,
0.460178256,
-0.0382341072,
-0.2387868762,
-0.0032228082,
0.3633524179,
-0.0172827691,
0.2315364033,
-0.061175216,
-0.3007302284,
-0.1902963668,
-0.1062507853,
-0.1545365453,
-0.2300316691,
-0.0597751439,
0.0295014493,
0.1523667872,
0.0646301508,
0.1584213674,
-0.0831473321,
-0.1170615405,
-0.1872001141,
0.4351291656,
-0.0597942546,
0.2172850668,
-0.0276044309,
0.2997870445,
-0.3493390977,
0.1422001272,
-0.2189184874,
0.3044459522,
-0.1182057634,
0.0613892823,
-0.0159720406,
0.1169418395,
0.2156412154,
-0.2203438282,
0.2306118459,
0.2625309825,
0.1650993079,
-0.6942800283,
-0.0943868831,
0.0507083386,
0.3705738485,
0.6355692148,
0.1778833866,
-0.1156035215,
-0.236060679,
0.1376714706,
-0.175421983,
-0.0925002024,
-0.4230397344,
-0.0664593875,
-0.3104560971,
-0.2801862955,
-0.3828446269,
-0.1292265207,
0.2305581719,
0.2301345766,
0.1135467812,
0.0101911668,
0.037024416,
0.0223108456,
-0.0750331581,
0.0257842913,
0.165501669,
0.2535585463,
-0.3498474061,
0.0041104993,
0.0184273291,
0.4556972086,
-0.1090065539,
-0.1824811697,
-0.0907647535,
0.0760888904,
0.1698763967,
0.2375927866,
-0.0469809063,
-0.278847754,
-0.0843370631,
-0.243196398,
0.0611785315,
0.1471871138,
0.2561113238,
-0.5553205013,
0.1812880933,
-0.3989014328,
0.5427383184,
-0.1028673053,
-0.0532118082,
0.1602560729,
-0.0507054217,
-0.3970302045,
0.3244518936,
-0.0401612632,
0.5586261153,
0.2404640019,
0.0567302071,
0.4123043418,
-0.4333569705,
0.340760529,
0.3820565343,
0.0827065483,
-0.2056448609,
0.2356446385,
0.1081555635,
-0.243430689,
0.1182639301,
0.4598893523,
-0.1859721243,
0.3259103298,
-0.155995518,
-0.4174779058,
0.0909638479,
0.437913388,
-0.2495400012,
-0.3017905354,
-0.317876339,
0.1256024539,
-0.0598925799,
0.3444619179,
-0.0450953618,
-0.313447684,
-0.3213608265,
-0.1474079192,
-0.2323344052,
0.1951861382,
-0.0521365814,
-0.1184979528,
0.1656087786,
0.0278663039,
0.1757667661,
0.3797241151,
-0.2905165851,
-0.0837100595,
-0.238989234,
0.083014667,
-0.1039217412,
-0.0695939809,
0.1519479156,
0.0730861127,
0.2014411688,
-0.1141102165,
-0.3429859281,
-0.1494406462,
-0.0445593409,
-0.201519087,
0.4617882073,
-0.0481094234,
0.0759339631,
-0.1242551059,
-0.492182821,
0.0628538653,
-0.0180092305,
0.1029132158,
0.0785280764,
0.0157989599,
-0.0287092943,
0.1853823364,
-0.1957310438,
-0.337682724,
-0.0281161536,
0.3746061921,
-0.3746852577,
0.290784657,
0.4278506935,
-0.1933621764,
0.1264045238,
-0.0584544875,
0.0083093792,
-0.5848217607,
-0.21616292,
0.103173539,
0.2297324985,
0.1647561044,
0.1329748482,
0.2493012697,
-0.0034954199,
0.180294469,
0.2565452754,
-0.3308794498,
-0.1663015783,
-0.0038664863,
-0.3690318465,
0.1466335207,
-0.1980223954,
0.2377352417,
0.3796905279,
-0.2016597688,
-0.2258140743,
0.1875278652,
-0.1729786396,
0.0496795438,
0.1979012191,
0.1081394702,
0.0217498988,
0.1827964783,
0.0240135714,
-0.0625099391,
-0.2874265909,
-0.133997336,
-0.3909823596,
0.1758334637,
0.226489827,
-0.1660331786,
-0.0547828153,
-0.2572703958,
-0.2181169391,
-0.0314531699,
-0.0373602659,
0.1865928918,
-0.0764184892,
0.3890279531,
-0.1978254914,
0.243920207,
-0.0793267787,
0.1648310423,
-0.1698616743,
0.3258366883,
0.0379705243,
0.1343825459,
-0.1742456257,
0.0088646784,
-0.1068821251,
-0.23138991,
-0.2788546383,
0.0202060938,
0.2709775269,
-0.0176595822,
0.2543971539,
0.2019493133,
-0.041505184,
0.3494022489,
-0.2206251472,
-0.126997605,
0.2625514865,
0.1083604246,
-0.1582948565,
0.222232461,
0.3854727745,
0.3658418059,
0.0349598154,
0.2504578829,
0.160232693,
-0.3160719275,
-0.0662196726,
-0.0450420454,
0.2565683424,
0.0481730364,
-0.007577572,
0.3368712068,
-0.2633073926,
0.200919345,
0.0543518141,
0.281752646,
0.2158437073,
0.4725826383,
-0.0022781901,
-0.1878002435,
-0.2764520943,
0.0665214807,
0.1926875412,
-0.5811085701,
0.1973800957,
-0.0308280066,
0.1413286328,
0.2872925699,
-0.1325432956,
0.0486671887,
-0.1231481582,
0.4628401101,
-0.0270577706,
0.2395099849,
0.0963632166,
-0.1682440042,
-0.1834776998,
0.0434651226,
-0.0890514627,
0.0064130276,
-0.169731617,
-0.2001912296,
-0.0572065897,
0.0989689454,
-0.2617081702,
-0.2644015551,
0.2126556039,
-0.0461101234,
-0.0920266286,
-0.2886495292,
0.1431360692,
-0.0127506964,
0.1704822779,
0.3351089358,
0.260812819,
0.2379632592,
0.5772153139,
-0.0407348014,
-0.1141021252,
0.1889009178,
0.0600244105,
-0.2130744606,
0.0773512274,
0.3676769137,
0.350633204,
0.3967398107,
0.0209642015,
-0.088220939,
-0.0021248413,
0.2406387329,
0.2035894692,
-0.1885024309,
0.7938402295,
0.0524094552,
0.1809537411,
-0.1163357049,
0.3631627262,
-0.4275178909,
-0.2053982019,
0.2878558934,
0.2680915594,
0.3093083799,
-0.0219094101,
0.019961793,
-0.186156258,
0.5015253425,
0.0297628418,
-0.1064066738,
-0.1711031944,
-0.0851539075,
-0.5885355473,
-0.1091987193,
-0.0785848945,
-0.3384309113,
0.0276817009,
0.444884479,
-0.2722951472,
0.0116580054,
-0.166905269,
0.1294877976,
0.0158836916,
0.1184921414,
0.091453433,
-0.1762313098,
-0.2772182822,
0.0589550808,
0.2688052058,
-0.3439936042,
-0.259808749,
-0.4583738744,
0.0284469128,
-0.0672116876,
-0.1894517392,
0.2401182503,
0.2790065706,
0.4852347672,
-0.043664895,
0.3151144981,
-0.0392435603,
-0.2778839767,
-0.3964206874,
0.1328743845,
-0.1942774653,
-0.0794956833,
-0.0010468708,
0.1259424835,
-0.32741189,
0.3908375204,
-0.290415585,
0.1902526021,
0.054690063,
-0.1452249736,
-0.0034683384,
0.1812205762,
-0.3123650849,
0.1539277732,
0.2389641851,
0.3021644056,
-0.1769242585,
0.2898985744,
-0.2135693431,
-0.2442595512,
0.5382839441,
-0.1240978539,
-0.0283330306,
0.1594702005,
0.2759436071,
-0.1947039664,
-0.1216938496,
-0.2669954896,
0.0924192965,
0.4286953509,
-0.2330364436,
0.1069558412,
-0.0202894509,
-0.2478489131,
0.0288677812,
0.0702999532,
0.0690405443,
0.1191773713,
-0.1778611839,
-0.2479634583,
-0.2421287
] |
https://github.com/huggingface/datasets/issues/560 | Using custom DownloadConfig results in an error | I see.
Probably the easiest way for you would be for us to add simple serialization/deserialization methods to the Dataset and DatasetDict objects once the data files have been downloaded and the whole dataset has been processed.
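As a rough illustration, such an API could look like this (the method names below are hypothetical sketches, not existing nlp 0.4.0 API):
```python
import nlp

# On a machine with network access: download and process the dataset once.
dataset_dict = nlp.load_dataset("imdb")

# Hypothetical: dump the fully processed dataset to a plain directory ...
dataset_dict.save_to_disk("/some/shared/path/imdb")

# ... and later reload it without any network access or dataset script.
reloaded = nlp.DatasetDict.load_from_disk("/some/shared/path/imdb")
```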
What do you think @lhoestq ? | ## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
| 41 | Using custom DownloadConfig results in an error
## Version / Environment
Ubuntu 18.04
Python 3.6.8
nlp 0.4.0
## Description
Loading the `imdb` dataset works fine when I don't specify any `download_config` argument. When I create a custom `DownloadConfig` object and pass it to the `nlp.load_dataset` function, this results in an error.
## How to reproduce
### Example without DownloadConfig --> works
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-without-dl-config-01/"
import logging
import nlp
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
imdb = nlp.load_dataset(path="imdb")
```
### Example with DownloadConfig --> doesn't work
```python
import os
os.environ["HF_HOME"] = "/data/hf-test-with-dl-config-01/"
import logging
import nlp
from nlp.utils import DownloadConfig
logging.basicConfig(level=logging.INFO)
if __name__ == "__main__":
download_config = DownloadConfig()
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
```
Error traceback:
```
Traceback (most recent call last):
File "/.../example_with_dl_config.py", line 13, in <module>
imdb = nlp.load_dataset(path="imdb", download_config=download_config)
File "/.../python3.6/python3.6/site-packages/nlp/load.py", line 549, in load_dataset
download_config=download_config, download_mode=download_mode, ignore_verifications=ignore_verifications,
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 463, in download_and_prepare
dl_manager=dl_manager, verify_infos=verify_infos, **download_and_prepare_kwargs
File "/.../python3.6/python3.6/site-packages/nlp/builder.py", line 518, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/.../python3.6/python3.6/site-packages/nlp/datasets/imdb/76cdbd7249ea3548c928bbf304258dab44d09cd3638d9da8d42480d1d1be3743/imdb.py", line 86, in _split_generators
arch_path = dl_manager.download_and_extract(_DOWNLOAD_URL)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 220, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 158, in download
self._record_sizes_checksums(url_or_urls, downloaded_path_or_paths)
File "/.../python3.6/python3.6/site-packages/nlp/utils/download_manager.py", line 108, in _record_sizes_checksums
self._recorded_sizes_checksums[url] = get_size_checksum_dict(path)
File "/.../python3.6/python3.6/site-packages/nlp/utils/info_utils.py", line 79, in get_size_checksum_dict
with open(path, "rb") as f:
IsADirectoryError: [Errno 21] Is a directory: '/data/hf-test-with-dl-config-01/datasets/extracted/b6802c5b61824b2c1f7dbf7cda6696b5f2e22214e18d171ce1ed3be90c931ce5'
```
I see.
Probably the easiest way for you would be for us to add simple serialization/deserialization methods to the Dataset and DatasetDict objects once the data files have been downloaded and the whole dataset has been processed.
What do you think @lhoestq ? | [
-0.1150191873,
0.1608140916,
0.1123034507,
0.0204853825,
-0.076284565,
-0.0592143647,
0.287856251,
0.038644366,
0.3867914379,
-0.199264437,
-0.1571437418,
0.3636112809,
-0.2807904482,
-0.0396444052,
0.1373905391,
0.1345225573,
-0.3357851803,
0.1916719377,
0.2066786587,
-0.0992470607,
-0.4401439428,
0.4904035032,
-0.1363260299,
-0.1055810302,
-0.110463053,
0.296576947,
-0.0853593349,
0.6002389789,
-0.0052090101,
-0.3369758725,
0.5871986747,
0.0767925307,
0.2937195003,
0.0780295134,
-0.0001159093,
0.0705930069,
0.233681947,
-0.3101005554,
-0.3799628317,
-0.2622922659,
-0.0511954203,
-0.1243967116,
0.2814411819,
-0.1946799755,
-0.0140371546,
-0.0242790468,
0.0815582126,
0.0334612802,
0.0941411331,
0.2972892523,
0.1419871897,
-0.0229713544,
-0.2864385843,
-0.1697260588,
0.2262741178,
0.2600771785,
0.0480876788,
0.529278934,
0.3167857528,
-0.2207207382,
0.0189586282,
-0.3049542308,
-0.1514440179,
0.0622547455,
0.5781334639,
0.0206803083,
0.2644709647,
-0.0439109541,
-0.1765139103,
0.3859302402,
0.2035036087,
-0.2281823903,
-0.0819196925,
-0.3655665517,
0.1446021795,
-0.4327117801,
0.1780244559,
0.1168845296,
-0.2737576962,
0.0803427175,
-0.2586565316,
-0.1680172086,
-0.0763952136,
0.508751452,
0.2171330601,
0.3228864372,
-0.0183474869,
0.1714262664,
-0.0767273456,
-0.0576793477,
0.2524202466,
-0.3967368603,
0.0253676437,
0.1348519772,
-0.0958462358,
-0.0770933032,
0.1277973503,
0.0256881565,
-0.032763958,
0.1246376783,
0.0682526007,
-0.1068674028,
0.0644042641,
0.0538507961,
0.0521951392,
0.2863238752,
-0.076601021,
0.1167677566,
-0.2036598027,
0.166965574,
-0.0508588366,
-0.0176646411,
0.0861195251,
-0.0753239244,
-0.3973385692,
-0.1976340413,
0.1687505245,
0.0014936477,
-0.062957719,
-0.3396822214,
-0.0813493952,
0.1825858951,
0.3383554816,
0.2477038801,
-0.0689880326,
-0.2663638592,
0.1952616125,
0.2603036165,
-0.0893421248,
0.1586497575,
-0.0882690027,
-0.1036935896,
-0.1506858468,
-0.0623485371,
0.4825169444,
0.2577641904,
0.6017349958,
0.0311782584,
0.0021427795,
0.028184697,
-0.0155199431,
0.121614255,
0.0396912508,
0.1165007129,
0.0043118298,
0.0528847948,
0.0625880063,
-0.2535326779,
-0.2557505369,
-0.0347327106,
-0.1973153502,
-0.524072051,
-0.3210694194,
0.1228111237,
-0.3273596764,
-0.3346838951,
0.2478318512,
-0.0070183943,
0.0605553053,
-0.3129454851,
-0.0337527953,
0.0295482948,
-0.4143958092,
-0.2400684059,
-0.089732632,
0.3797774613,
0.0012104437,
-0.2380744517,
-0.4831040204,
-0.1488146484,
0.2068390548,
-0.0621500015,
-0.4448851645,
0.246032089,
-0.3705917299,
0.1259260625,
0.7066496015,
-0.593824327,
-0.4758626521,
0.7060874701,
-0.1530993283,
0.1541839838,
0.1148286909,
0.304777205,
0.3644038439,
-0.1938651949,
0.2743984461,
0.4499005675,
0.164540112,
-0.0376655795,
-0.1342466772,
-0.23661834,
-0.045294717,
0.069085665,
-0.2126619518,
0.3807031512,
0.0322404616,
0.2286941707,
0.2805711925,
0.0719812512,
-0.0035695881,
0.0699648708,
-0.0190698858,
0.2078529298,
-0.2160391062,
0.2683117986,
-0.5918144584,
0.3081949055,
-0.2088378668,
0.0111982739,
0.1865446866,
0.0156149939,
-0.3768416047,
-0.2749230266,
-0.1442953795,
-0.3913903832,
0.0948156938,
0.3689242899,
0.1382951587,
-0.1243662462,
-0.128129378,
0.3340530396,
-0.2191487551,
-0.0176076498,
-0.160417527,
-0.0221683644,
0.1522675604,
-0.1637908518,
0.0569931045,
0.1050849408,
0.2022018284,
0.171318993,
-0.0114864428,
0.4094794393,
-0.1540580094,
0.0669988245,
-0.0887555629,
-0.2120266557,
0.0864796489,
-0.0025835261,
0.2240314484,
0.0819460154,
0.1684666276,
-0.2278700173,
-0.0677172393,
0.1494402289,
0.0992673784,
0.3279950619,
0.0210087597,
0.1423207521,
0.1164701283,
-0.2840329409,
-0.0270047709,
-0.3489286304,
0.0761513859,
0.1613445729,
0.3522972465,
0.0295900796,
-0.2389680296,
-0.0452458039,
0.2232686281,
0.1981344521,
0.020454634,
0.1191776842,
-0.0028204955,
-0.0736614913,
0.029122645,
0.646499157,
0.8169702291,
0.1107555553,
-0.0627676994,
0.4516937435,
-0.271200031,
-0.2540046573,
0.2792920768,
-0.1479597688,
0.128902331,
0.1667046845,
0.0079042353,
0.0080788657,
-0.1439011097,
-0.4434363246,
-0.0547378883,
-0.0084888581,
-0.4380913377,
-0.0696293861,
-0.2960467339,
-0.4512907267,
-0.1279859245,
-0.2876826227,
-0.0485401712,
-0.2141085267,
-0.1592726707,
0.1131824553,
0.1047846079,
-0.0083433427,
-0.2631918788,
0.0197489858,
-0.2949642539,
-0.8628901839,
0.0819825381,
-0.0314405374,
-0.3450303674,
-0.0835405886,
0.0386167727,
0.1652457565,
0.0731320232,
-0.1557794511,
-0.2440003604,
0.1602343172,
-0.102919668,
0.1122969165,
0.1613500267,
0.3075021207,
0.3883804083,
-0.151153475,
0.2565557063,
-0.4360590875,
0.1637898684,
0.216210559,
-0.0127828941,
0.042474553,
0.0326361172,
-0.0875904411,
-0.3689766526,
-0.2115420103,
-0.164963752,
-0.4502527118,
-0.0885405242,
0.2849121392,
0.1961458921,
0.4581587315,
0.1651854366,
0.3243180513,
-0.164577499,
0.0822790712,
-0.0674817339,
-0.3117093146,
0.2791599035,
-0.47333318,
-0.078999877,
0.0859311223,
-0.2227194756,
0.2154388279,
0.1335568875,
-0.3884004951,
0.1996544152,
-0.1469651908,
0.0002487451,
-0.0470807627,
-0.0934200063,
0.2494484931,
-0.0822577029,
0.0479588732,
-0.2609183192,
-0.3588173389,
0.3365956545,
0.3105378151,
0.5375540853,
0.2759686112,
0.298624903,
-0.1452534199,
0.484687835,
0.1508263648,
-0.1560513377,
0.6103883386,
-0.1947127432,
0.460178256,
-0.0382341072,
-0.2387868762,
-0.0032228082,
0.3633524179,
-0.0172827691,
0.2315364033,
-0.061175216,
-0.3007302284,
-0.1902963668,
-0.1062507853,
-0.1545365453,
-0.2300316691,
-0.0597751439,
0.0295014493,
0.1523667872,
0.0646301508,
0.1584213674,
-0.0831473321,
-0.1170615405,
-0.1872001141,
0.4351291656,
-0.0597942546,
0.2172850668,
-0.0276044309,
0.2997870445,
-0.3493390977,
0.1422001272,
-0.2189184874,
0.3044459522,
-0.1182057634,
0.0613892823,
-0.0159720406,
0.1169418395,
0.2156412154,
-0.2203438282,
0.2306118459,
0.2625309825,
0.1650993079,
-0.6942800283,
-0.0943868831,
0.0507083386,
0.3705738485,
0.6355692148,
0.1778833866,
-0.1156035215,
-0.236060679,
0.1376714706,
-0.175421983,
-0.0925002024,
-0.4230397344,
-0.0664593875,
-0.3104560971,
-0.2801862955,
-0.3828446269,
-0.1292265207,
0.2305581719,
0.2301345766,
0.1135467812,
0.0101911668,
0.037024416,
0.0223108456,
-0.0750331581,
0.0257842913,
0.165501669,
0.2535585463,
-0.3498474061,
0.0041104993,
0.0184273291,
0.4556972086,
-0.1090065539,
-0.1824811697,
-0.0907647535,
0.0760888904,
0.1698763967,
0.2375927866,
-0.0469809063,
-0.278847754,
-0.0843370631,
-0.243196398,
0.0611785315,
0.1471871138,
0.2561113238,
-0.5553205013,
0.1812880933,
-0.3989014328,
0.5427383184,
-0.1028673053,
-0.0532118082,
0.1602560729,
-0.0507054217,
-0.3970302045,
0.3244518936,
-0.0401612632,
0.5586261153,
0.2404640019,
0.0567302071,
0.4123043418,
-0.4333569705,
0.340760529,
0.3820565343,
0.0827065483,
-0.2056448609,
0.2356446385,
0.1081555635,
-0.243430689,
0.1182639301,
0.4598893523,
-0.1859721243,
0.3259103298,
-0.155995518,
-0.4174779058,
0.0909638479,
0.437913388,
-0.2495400012,
-0.3017905354,
-0.317876339,
0.1256024539,
-0.0598925799,
0.3444619179,
-0.0450953618,
-0.313447684,
-0.3213608265,
-0.1474079192,
-0.2323344052,
0.1951861382,
-0.0521365814,
-0.1184979528,
0.1656087786,
0.0278663039,
0.1757667661,
0.3797241151,
-0.2905165851,
-0.0837100595,
-0.238989234,
0.083014667,
-0.1039217412,
-0.0695939809,
0.1519479156,
0.0730861127,
0.2014411688,
-0.1141102165,
-0.3429859281,
-0.1494406462,
-0.0445593409,
-0.201519087,
0.4617882073,
-0.0481094234,
0.0759339631,
-0.1242551059,
-0.492182821,
0.0628538653,
-0.0180092305,
0.1029132158,
0.0785280764,
0.0157989599,
-0.0287092943,
0.1853823364,
-0.1957310438,
-0.337682724,
-0.0281161536,
0.3746061921,
-0.3746852577,
0.290784657,
0.4278506935,
-0.1933621764,
0.1264045238,
-0.0584544875,
0.0083093792,
-0.5848217607,
-0.21616292,
0.103173539,
0.2297324985,
0.1647561044,
0.1329748482,
0.2493012697,
-0.0034954199,
0.180294469,
0.2565452754,
-0.3308794498,
-0.1663015783,
-0.0038664863,
-0.3690318465,
0.1466335207,
-0.1980223954,
0.2377352417,
0.3796905279,
-0.2016597688,
-0.2258140743,
0.1875278652,
-0.1729786396,
0.0496795438,
0.1979012191,
0.1081394702,
0.0217498988,
0.1827964783,
0.0240135714,
-0.0625099391,
-0.2874265909,
-0.133997336,
-0.3909823596,
0.1758334637,
0.226489827,
-0.1660331786,
-0.0547828153,
-0.2572703958,
-0.2181169391,
-0.0314531699,
-0.0373602659,
0.1865928918,
-0.0764184892,
0.3890279531,
-0.1978254914,
0.243920207,
-0.0793267787,
0.1648310423,
-0.1698616743,
0.3258366883,
0.0379705243,
0.1343825459,
-0.1742456257,
0.0088646784,
-0.1068821251,
-0.23138991,
-0.2788546383,
0.0202060938,
0.2709775269,
-0.0176595822,
0.2543971539,
0.2019493133,
-0.041505184,
0.3494022489,
-0.2206251472,
-0.126997605,
0.2625514865,
0.1083604246,
-0.1582948565,
0.222232461,
0.3854727745,
0.3658418059,
0.0349598154,
0.2504578829,
0.160232693,
-0.3160719275,
-0.0662196726,
-0.0450420454,
0.2565683424,
0.0481730364,
-0.007577572,
0.3368712068,
-0.2633073926,
0.200919345,
0.0543518141,
0.281752646,
0.2158437073,
0.4725826383,
-0.0022781901,
-0.1878002435,
-0.2764520943,
0.0665214807,
0.1926875412,
-0.5811085701,
0.1973800957,
-0.0308280066,
0.1413286328,
0.2872925699,
-0.1325432956,
0.0486671887,
-0.1231481582,
0.4628401101,
-0.0270577706,
0.2395099849,
0.0963632166,
-0.1682440042,
-0.1834776998,
0.0434651226,
-0.0890514627,
0.0064130276,
-0.169731617,
-0.2001912296,
-0.0572065897,
0.0989689454,
-0.2617081702,
-0.2644015551,
0.2126556039,
-0.0461101234,
-0.0920266286,
-0.2886495292,
0.1431360692,
-0.0127506964,
0.1704822779,
0.3351089358,
0.260812819,
0.2379632592,
0.5772153139,
-0.0407348014,
-0.1141021252,
0.1889009178,
0.0600244105,
-0.2130744606,
0.0773512274,
0.3676769137,
0.350633204,
0.3967398107,
0.0209642015,
-0.088220939,
-0.0021248413,
0.2406387329,
0.2035894692,
-0.1885024309,
0.7938402295,
0.0524094552,
0.1809537411,
-0.1163357049,
0.3631627262,
-0.4275178909,
-0.2053982019,
0.2878558934,
0.2680915594,
0.3093083799,
-0.0219094101,
0.019961793,
-0.186156258,
0.5015253425,
0.0297628418,
-0.1064066738,
-0.1711031944,
-0.0851539075,
-0.5885355473,
-0.1091987193,
-0.0785848945,
-0.3384309113,
0.0276817009,
0.444884479,
-0.2722951472,
0.0116580054,
-0.166905269,
0.1294877976,
0.0158836916,
0.1184921414,
0.091453433,
-0.1762313098,
-0.2772182822,
0.0589550808,
0.2688052058,
-0.3439936042,
-0.259808749,
-0.4583738744,
0.0284469128,
-0.0672116876,
-0.1894517392,
0.2401182503,
0.2790065706,
0.4852347672,
-0.043664895,
0.3151144981,
-0.0392435603,
-0.2778839767,
-0.3964206874,
0.1328743845,
-0.1942774653,
-0.0794956833,
-0.0010468708,
0.1259424835,
-0.32741189,
0.3908375204,
-0.290415585,
0.1902526021,
0.054690063,
-0.1452249736,
-0.0034683384,
0.1812205762,
-0.3123650849,
0.1539277732,
0.2389641851,
0.3021644056,
-0.1769242585,
0.2898985744,
-0.2135693431,
-0.2442595512,
0.5382839441,
-0.1240978539,
-0.0283330306,
0.1594702005,
0.2759436071,
-0.1947039664,
-0.1216938496,
-0.2669954896,
0.0924192965,
0.4286953509,
-0.2330364436,
0.1069558412,
-0.0202894509,
-0.2478489131,
0.0288677812,
0.0702999532,
0.0690405443,
0.1191773713,
-0.1778611839,
-0.2479634583,
-0.2421287
] |
https://github.com/huggingface/datasets/issues/554 | nlp downloads to its module path | Indeed this is a known issue arising from the fact that we try to be compatible with cloudpickle.
Does this also happen if you are installing in a virtual environment? | I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`? | 30 | nlp downloads to its module path
I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`?
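For example, something along these lines (a hypothetical call, assuming `prepare_module` would honor a writable `force_local_path`; untested):
```python
from nlp.load import prepare_module

# Hypothetical workaround: materialize the dataset script in a writable
# location instead of the read-only site-packages module path.
module_path, module_hash = prepare_module(
    "squad",
    force_local_path="/tmp/nlp_modules/squad",  # placeholder writable path
    dataset=True,
)
```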
Indeed this is a known issue arising from the fact that we try to be compatible with cloudpickle.
Does this also happen if you are installing in a virtual environment? | [
0.032684654,
0.2539582551,
0.1499527097,
0.034595944,
0.2147578746,
-0.1813212037,
-0.0204591844,
-0.0045513771,
0.1024630144,
-0.0508644581,
0.1380292475,
0.8880224228,
-0.4422509074,
0.0815137699,
0.3362325132,
0.3320015669,
-0.1796583235,
-0.0002896823,
-0.4403931201,
-0.0244459957,
-0.2120733708,
0.3455498219,
0.0295308828,
0.1748971939,
0.063062191,
-0.2714597583,
0.1764772087,
0.2207587659,
-0.2017824054,
-0.2983637452,
0.226238519,
0.0677954108,
0.0233546644,
0.2312597185,
-0.0001246572,
-0.1266600192,
0.1466836482,
-0.2211492062,
-0.5942363143,
-0.4229250848,
0.0836748034,
-0.150478214,
0.3097316623,
-0.2473911941,
0.40544644,
0.6272528172,
0.2100756168,
-0.0133367926,
0.0938679129,
0.1800402999,
0.1298221797,
0.3722444177,
-0.1461203247,
0.3540541828,
0.1991158426,
0.1898444742,
-0.2509310842,
0.5630121827,
-0.0468495563,
-0.1564090252,
0.1237195283,
-0.0159901716,
-0.1453781575,
-0.2201295644,
0.4053892493,
0.0232426971,
0.5421705246,
-0.1580371857,
-0.1846936792,
0.0027793022,
0.1580798924,
-0.6352775693,
-0.287543714,
-0.3459050655,
0.0451215543,
-0.1250388026,
0.520231843,
0.356515348,
-0.2196212262,
0.1207836568,
0.0627231374,
-0.3631261587,
-0.1046481356,
0.3527782857,
0.0758762509,
0.4675634205,
-0.0018831678,
0.0300993435,
0.3376193941,
0.2649235129,
0.3683657348,
0.1396362334,
0.0817186832,
0.294079423,
0.2760203183,
-0.0600531362,
-0.0974642634,
0.3331540525,
-0.0598650016,
0.0230616443,
0.133187294,
-0.1023270786,
-0.0971474722,
0.2347496599,
0.2408019602,
-0.2937855721,
-0.0199317224,
0.1705410331,
0.2132526934,
0.1521567255,
0.2788051367,
0.0625129789,
-0.0701542571,
-0.0998431966,
-0.1074584424,
-0.1125403047,
0.0609449297,
-0.1470347047,
-0.0614197776,
-0.1838051826,
-0.1485041827,
0.2996827662,
0.2301569879,
0.0253079459,
-0.1921527982,
0.1033579856,
0.2294240296,
0.0773208439,
-0.3032053709,
0.4586700201,
0.0294332616,
0.1733399034,
-0.333863318,
0.0488009304,
0.1534633487,
0.1618089527,
0.3029885888,
-0.2545303106,
-0.0548673309,
-0.1869947314,
0.0137977526,
0.2578772306,
-0.2314932644,
0.3605968654,
0.0254144147,
0.3670634031,
-0.2235783637,
-0.2853196859,
-0.459127605,
0.1197428852,
-0.2980468273,
-0.3935774863,
0.1862977743,
0.0347370356,
-0.2341651767,
-0.0807256252,
0.1144629121,
-0.6620378494,
0.2572255135,
-0.2190912962,
0.1985621005,
-0.2899951935,
-0.2044745386,
-0.287066102,
-0.0835915357,
0.4170090556,
0.1406827271,
-0.2434350997,
-0.2343944609,
-0.0062076645,
0.4222152531,
0.1802351028,
-0.1540426314,
-0.0480857156,
-0.2764587998,
-0.2796527743,
0.8423380256,
-0.3605710864,
-0.3386267126,
0.2357317805,
-0.5933494568,
-0.0297027007,
0.0297484212,
0.1759302914,
-0.4824041724,
-0.1026941463,
0.0016464125,
0.7592406869,
0.2715173364,
0.0893570483,
-0.3122756183,
-0.2933424711,
0.2515256107,
-0.1365661621,
-0.0736148506,
0.0352934077,
-0.0354195684,
0.5409154296,
0.5577622056,
0.2064170539,
0.0009641685,
0.3216338456,
0.5599620342,
0.1643118411,
-0.2873654366,
-0.1587283015,
0.0221836045,
-0.0262413397,
-0.8242056966,
0.1558987647,
-0.5611914992,
-0.2444066405,
-0.1042345688,
0.1596707404,
0.1721693426,
0.0421344638,
-0.0329916552,
0.2325980812,
0.053253755,
0.1800093055,
-0.1128172874,
0.4923970699,
-0.1988108456,
0.1270056516,
-0.1246594638,
0.1227729395,
-0.1702817231,
-0.1980680823,
-0.075983867,
0.4849921167,
0.1350174993,
-0.16885975,
0.028745465,
0.5297238231,
-0.1243886352,
0.1040724814,
-0.3746861517,
0.0564401075,
0.3235250711,
0.0784608424,
0.0241335873,
0.2581124306,
-0.047250133,
0.096939072,
-0.1821472943,
0.1139853597,
-0.3834557235,
0.0970008373,
0.1707638651,
0.2843179703,
0.0330999717,
0.0538514629,
-0.1857932508,
-0.0878795981,
0.2567599714,
0.2180498391,
0.4287630916,
0.3316881657,
0.0636420697,
-0.0161924548,
0.3589438796,
0.176491484,
0.1916054785,
0.0315665528,
0.0414776281,
-0.0381781124,
0.0644056201,
0.2960359454,
0.1246424466,
0.014352385,
0.1151519716,
0.2016739547,
-0.3002694845,
-0.1990069151,
0.1411170065,
0.0025231522,
0.2384323478,
0.1320197731,
0.1465293914,
-0.1913348436,
-0.1618212759,
0.0934488922,
0.0271345973,
0.1175144911,
-0.3330698609,
0.0698606893,
-0.5621653795,
-0.3618032336,
-0.6628376245,
0.1497510225,
-0.5517080426,
-0.2448141575,
-0.1711101085,
0.3884856999,
-0.0105236322,
-0.0930339843,
-0.0906841308,
-0.0874291807,
-0.4265939295,
-0.5033163428,
0.004608199,
0.0666369349,
-0.3229707479,
-0.152954191,
0.2286382467,
0.1363467574,
0.1716546118,
-0.045200631,
-0.4361419082,
0.0560571924,
-0.2377966046,
0.0539311729,
0.3451831639,
0.150955528,
0.2952370942,
-0.1043477207,
0.0696858466,
-0.2573922276,
0.0952087417,
-0.1837004721,
0.10741283,
0.0065734833,
0.1170466021,
0.0201883465,
-0.3216094673,
-0.1229665279,
-0.4191869795,
-0.4624138772,
0.3291350007,
0.2890336812,
0.3142391145,
0.8311158419,
-0.1648728848,
-0.033630237,
-0.1469503939,
0.1657741964,
0.2017698586,
0.0353781134,
0.2355373353,
-0.2269583046,
-0.127161473,
-0.1235050857,
0.0183706358,
0.0342531726,
0.3961653411,
-0.2507335246,
-0.0285672322,
-0.0195146985,
0.0932910517,
0.1410598904,
0.0818156376,
0.5214659572,
0.2803617418,
0.1878140569,
-0.0578506216,
-0.2552233338,
0.2268040627,
0.5429318547,
0.4109838903,
0.3115751743,
0.1435751468,
0.228055343,
0.3124432564,
-0.0587830879,
0.2156495899,
0.2798894942,
0.0142920408,
0.5034252405,
0.0840159357,
-0.0997241661,
-0.2104309797,
0.0449193753,
-0.1026227772,
0.0505262651,
0.1502986401,
-0.204491213,
-0.1009590924,
0.0065247491,
-0.0164251328,
-0.4088556468,
-0.0000800118,
0.0778421611,
0.3124976456,
0.1592760384,
0.0183890611,
-0.1750244796,
-0.2969207764,
0.2853291631,
0.3175712228,
-0.0364397578,
0.3643210828,
-0.2949272096,
0.0840502977,
-0.3474314809,
-0.3622846901,
-0.1021943763,
0.3788608015,
-0.095683448,
-0.034748964,
0.1840119213,
-0.1730255485,
0.1940385848,
-0.0323348194,
0.2411330044,
0.1861348152,
-0.0809579641,
-0.5383051038,
-0.0579552352,
0.0232510492,
0.4209907949,
0.2981883883,
-0.0265953802,
-0.0678072646,
-0.0809440315,
0.0771070793,
0.0870719925,
-0.1857691705,
-0.1419163644,
-0.2486446202,
-0.4294267297,
-0.2259023935,
0.0182488896,
0.0481491163,
0.1710205972,
0.2736574113,
-0.0725754276,
0.1815527976,
-0.123274684,
-0.0012340061,
-0.2885779142,
0.2251171619,
0.0260172263,
0.117623046,
-0.1872000396,
0.151719138,
-0.2706375718,
0.3499566913,
-0.0563942604,
0.1945848465,
-0.2746191323,
-0.1085656285,
0.3069174886,
0.5553453565,
-0.0173657443,
0.0041445307,
-0.0703744888,
-0.4417223632,
0.0162290409,
0.1672561169,
-0.1102345139,
-0.3083550334,
0.1300925165,
-0.2451556921,
0.2288478613,
0.1490294635,
-0.2295273542,
0.3482480049,
-0.1365226358,
-0.4190997183,
0.1697120816,
0.2539564967,
1.1712487936,
-0.1969949603,
0.2815850377,
0.1326329112,
-0.3524651825,
0.2458817214,
-0.2211244404,
0.2156592607,
-0.4574080706,
0.2059419304,
0.0255226046,
-0.3373978436,
-0.0713271499,
-0.0014053257,
0.1140249446,
0.2740054727,
-0.0155983493,
-0.2410453856,
0.1407308131,
0.3760003448,
-0.1842531413,
-0.2960530519,
-0.3168865442,
-0.0070973858,
-0.0626245514,
0.4845359027,
-0.1284253746,
-0.135467574,
0.1815513372,
-0.1964695752,
0.0611097217,
0.1740921438,
-0.3605586588,
0.0303991847,
-0.0645298511,
-0.0112135708,
-0.0753136724,
0.1288611889,
-0.0585708693,
0.5918825865,
0.0438420475,
-0.2288341224,
-0.06508708,
-0.321873337,
0.2259808183,
0.431986928,
0.0406468585,
-0.0441013239,
0.0701377392,
-0.4085544944,
-0.0115293935,
-0.3302131891,
0.2451235354,
-0.0626670122,
0.2489247322,
0.1979110539,
-0.3371073604,
0.0066836663,
-0.0553534627,
-0.0486360565,
0.0257921107,
0.1970211864,
-0.0889835209,
0.1510650963,
-0.1321661174,
-0.3250289559,
-0.0004809829,
0.5062594414,
-0.0911183655,
-0.0040777549,
0.2302749455,
-0.269975245,
-0.2008229494,
-0.1109198928,
-0.1100172326,
-0.4767767489,
-0.1046169102,
-0.0447583795,
-0.1264210939,
-0.3185575306,
-0.3657461405,
0.1917571425,
0.2100533247,
0.0222781897,
-0.3211942017,
-0.3799038231,
-0.1829974651,
0.0717362612,
0.2581097782,
0.0427504852,
0.2226271331,
0.0151269622,
0.0743718371,
-0.1445609927,
-0.1302338839,
-0.0362578705,
-0.2002600431,
0.0194211937,
0.127139613,
-0.0886328369,
0.0229697749,
0.0672451258,
0.0520130247,
0.1141587123,
0.023423072,
-0.0221919809,
-0.2302060425,
0.219851315,
0.1898467839,
-0.1088622212,
0.3095650971,
-0.4991005063,
-0.070901528,
-0.0725435764,
-0.2607086003,
0.3616879284,
0.1469866037,
-0.0562007241,
0.1453669071,
0.295206219,
-0.0725112334,
0.3919354975,
-0.3460869491,
-0.1231108606,
0.0871470496,
0.1882703006,
0.0799202472,
0.0217772797,
-0.2326597422,
-0.2026039511,
-0.1023405045,
0.0921721458,
0.1676259786,
-0.1014088988,
-0.16618976,
0.2122141719,
-0.0144444145,
0.4409589469,
0.0101479515,
-0.1732782125,
0.5315678716,
-0.0094004953,
-0.064459756,
-0.05264882,
0.1593039632,
-0.1526354551,
-0.0896262079,
0.4959959388,
0.2406764627,
-0.1837691218,
0.2814669907,
0.0132147484,
0.2747504115,
-0.0646927953,
0.0044510365,
0.0725781247,
-0.0291590393,
0.0261630323,
-0.180253759,
0.0871638954,
0.2104329467,
-0.0309833512,
-0.0112776197,
-0.1122565866,
-0.0845984668,
-0.100011386,
-0.0228976421,
-0.0171515122,
-0.2293916196,
-0.0004271418,
0.5783107281,
0.2522877753,
0.1274653971,
0.2180201113,
-0.4098689556,
0.0148700103,
0.1073544398,
0.1376322657,
-0.1755609512,
-0.0841578096,
-0.1308662891,
-0.0059651881,
-0.3466266096,
0.0762873143,
-0.0426451266,
-0.3841933012,
0.3021790385,
0.1633163393,
0.2153520882,
-0.3880352378,
0.0622003973,
0.163443774,
0.0764947757,
-0.2189602703,
0.1823269725,
-0.4646621346,
0.3215571642,
-0.2943389416,
-0.2306811064,
0.1723096818,
0.3910273314,
-0.044332698,
-0.1313530356,
0.034427084,
0.1779142916,
-0.2737198472,
-0.0303485319,
-0.0306203775,
0.7043982148,
0.2260795981,
-0.0646465644,
-0.0352959372,
-0.3021514714,
0.1523115784,
-0.2608656287,
0.0255338326,
0.6669097543,
0.3248121738,
-0.0320296362,
-0.0970164239,
0.1974642575,
-0.1992257535,
0.0635697842,
0.1859071851,
0.2536441088,
0.1504038423,
0.0420225635,
0.041113358,
0.3362579346,
0.3680168688,
0.0272583142,
0.1248696819,
0.076766327,
-0.0560113639,
-0.5223754048,
0.2010113001,
-0.2894558311,
-0.321692735,
-0.0663249642,
-0.1535450518,
-0.0217736512,
0.0549932458,
-0.2688910663,
0.2122062147,
0.1099134162,
-0.1559634656,
0.1555140167,
-0.0372735262,
0.1464592665,
-0.0764597654,
0.0325150564,
-0.407803297,
-0.0673259348,
-0.1238288507,
-0.0607972592,
-0.1650112271,
-0.126401037,
-0.1908153594,
0.2283847928,
0.1445747316,
-0.0110218255,
0.0957022458,
-0.0102242231,
-0.4300073981,
-0.3392842114,
0.0320978127,
-0.2900739014,
-0.0707346499,
-0.1296248138,
0.0680453479,
-0.2968205214,
-0.0652117878,
-0.6669613123,
0.3160367608,
-0.031404186,
0.1101648957,
0.3075266182,
0.0258794278,
-0.2131536305,
0.3563809693,
0.1680592597,
0.0822217762,
-0.0048589297,
0.0684650391,
-0.3529642224,
-0.1970295608,
0.4783397019,
-0.0554326996,
-0.1137766391,
0.1693717241,
0.3043698072,
-0.1784669608,
-0.0747171491,
-0.392665863,
-0.29596591,
0.1449329555,
-0.1627512425,
0.0588717163,
0.0743248463,
-0.3249978721,
0.0506933928,
-0.0538545623,
0.4153848886,
-0.0118989516,
-0.2158795297,
0.0724135861,
-0.3163644671
] |
https://github.com/huggingface/datasets/issues/554 | nlp downloads to its module path | > Indeed this is a known issue arising from the fact that we try to be compatible with cloudpickle.
>
> Does this also happen if you are installing in a virtual environment?
Then it would work, because the package is in a writable path. | I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`? | 44 | nlp downloads to its module path
I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`?
> Indeed this is a known issue arising from the fact that we try to be compatible with cloudpickle.
>
> Does this also happen if you are installing in a virtual environment?
Then it would work, because the package is in a writable path. | [
0.0366535038,
0.2192300558,
0.1506813467,
0.0508496016,
0.211217761,
-0.1820767373,
-0.0289929546,
-0.0183259472,
0.1189895868,
-0.038501516,
0.1243855208,
0.8661041856,
-0.4588844478,
0.1108861119,
0.3196983933,
0.3289009929,
-0.1757937819,
0.0106614307,
-0.3965616226,
-0.0296086818,
-0.2183234692,
0.357344538,
0.0127433538,
0.1647963375,
0.0456611216,
-0.2805224955,
0.1733622551,
0.2096365094,
-0.217550844,
-0.2872889936,
0.2205726802,
0.0743850246,
0.0198589563,
0.2375180721,
-0.0001259466,
-0.1407054365,
0.1535026133,
-0.2212115824,
-0.5928940177,
-0.4169440269,
0.09316542,
-0.1434484273,
0.301282227,
-0.2460529804,
0.3750994802,
0.6685693264,
0.1899010539,
-0.016812861,
0.0882839188,
0.1640617549,
0.120622769,
0.3852368593,
-0.1497680396,
0.3557633758,
0.1676639616,
0.1658293456,
-0.2532817125,
0.5958606005,
-0.0119573139,
-0.1434497088,
0.1398186684,
-0.0086900443,
-0.1408172101,
-0.22644943,
0.3995681405,
0.0035119578,
0.5514385104,
-0.1675672531,
-0.1826414764,
0.0073817167,
0.1411035359,
-0.6322299242,
-0.2780584097,
-0.3192712963,
0.0291319862,
-0.1716596186,
0.5286077857,
0.3607383072,
-0.1927577257,
0.1301568151,
0.053591378,
-0.3320131004,
-0.0955117345,
0.3633771837,
0.0818294883,
0.4446322918,
-0.0106203873,
0.0516017489,
0.3404613137,
0.2785789967,
0.3875051737,
0.1370326132,
0.0653612614,
0.2994748652,
0.260242641,
-0.0839846879,
-0.0904841721,
0.379472971,
-0.0536354855,
0.0383316204,
0.1583431363,
-0.1037541628,
-0.0878679901,
0.2539206743,
0.2582300901,
-0.2878366709,
-0.0004797317,
0.1772269905,
0.221204266,
0.1572444737,
0.2616729736,
0.0639050603,
-0.0639305934,
-0.1136355251,
-0.1016859412,
-0.0943213701,
0.0696436241,
-0.1461735219,
-0.0453582108,
-0.1766074896,
-0.15824911,
0.2957184911,
0.2490972877,
0.029546123,
-0.2259834409,
0.091073215,
0.2279287577,
0.0887092575,
-0.28389588,
0.4456202388,
0.0281989761,
0.1882093996,
-0.325899303,
0.0462065563,
0.1481839716,
0.1662860215,
0.3029671013,
-0.2558455169,
-0.0446326733,
-0.2242345661,
0.0183960646,
0.2445896566,
-0.2678958178,
0.3420211971,
0.0084262416,
0.3626170158,
-0.2094112635,
-0.3191003203,
-0.4718197286,
0.1166784614,
-0.3187378645,
-0.3920499086,
0.1970467418,
0.0210623592,
-0.228959024,
-0.060970474,
0.068465516,
-0.651692152,
0.2548455596,
-0.2134691477,
0.2012832612,
-0.2950476706,
-0.2088280022,
-0.2876151502,
-0.0944902301,
0.4052163363,
0.0979430825,
-0.2253577709,
-0.2361150533,
-0.0100293718,
0.4378159344,
0.1829201579,
-0.1644484401,
-0.0694435984,
-0.2827441692,
-0.2839721143,
0.8306162953,
-0.3531686664,
-0.3510719836,
0.2237608135,
-0.5753189325,
-0.0287065879,
0.0173471533,
0.1781504303,
-0.4840007424,
-0.1129441485,
-0.0302460119,
0.7501537204,
0.2822032869,
0.0799725726,
-0.3088636696,
-0.2825058103,
0.2679073811,
-0.1572667658,
-0.0863527507,
0.0594156347,
-0.0044739097,
0.520802319,
0.5502971411,
0.2040087283,
0.0037947781,
0.3402994275,
0.5700147748,
0.1585389227,
-0.2889187932,
-0.1508492231,
0.0248758718,
-0.0060575306,
-0.8670705557,
0.1508539915,
-0.5519194603,
-0.264341712,
-0.1136565804,
0.1670418084,
0.1619915664,
0.0302643515,
-0.045391269,
0.2329876423,
0.0791508332,
0.1598996222,
-0.1085994318,
0.4938773811,
-0.2198511362,
0.1199445426,
-0.0938763246,
0.1448795646,
-0.1620065868,
-0.2122443765,
-0.0867496878,
0.4839964509,
0.16699332,
-0.1863855422,
0.0142853335,
0.5541189909,
-0.0953233987,
0.1087048054,
-0.3931441009,
0.0577352792,
0.3354282975,
0.0622464269,
0.0238070693,
0.2601510882,
-0.0441210642,
0.0759528875,
-0.1567489505,
0.1118648723,
-0.3385421038,
0.0956279784,
0.1660504639,
0.2892734706,
0.0081915259,
0.0418534428,
-0.1756075621,
-0.0711489767,
0.2505502403,
0.2442206442,
0.4438203573,
0.3397473097,
0.0398136079,
-0.0256213248,
0.3678895831,
0.1768847853,
0.1766761541,
0.0291850101,
0.0396355055,
-0.0324894004,
0.0751422346,
0.2753883302,
0.1064354777,
-0.0058807097,
0.088465251,
0.2145641595,
-0.2644826174,
-0.174425751,
0.1345821917,
0.0081140362,
0.2572437525,
0.1063860059,
0.1480325311,
-0.1938436329,
-0.1374710202,
0.0872047991,
0.0261674672,
0.1036972404,
-0.3324562907,
0.0644228682,
-0.5570713878,
-0.3485969305,
-0.6488166451,
0.1673888117,
-0.5379947424,
-0.2224395126,
-0.1973872781,
0.4104062617,
0.0084689036,
-0.0865172967,
-0.0707253292,
-0.0870466083,
-0.4106624126,
-0.4929045141,
-0.0081483722,
0.077634573,
-0.3279863596,
-0.1643289924,
0.2320946604,
0.1121883392,
0.1739887446,
-0.0162181444,
-0.4475990534,
0.044014819,
-0.2547736168,
0.0593612529,
0.3497131467,
0.1541700214,
0.3018979132,
-0.0938143805,
0.018885687,
-0.2698602378,
0.1019590497,
-0.1533109844,
0.1081804857,
0.017467387,
0.1186528504,
0.0067653432,
-0.3135104477,
-0.1002922058,
-0.4666460752,
-0.445424974,
0.3380752802,
0.2953380644,
0.3194354773,
0.8248813152,
-0.16706945,
-0.0477719195,
-0.1108452529,
0.1610219777,
0.2100709081,
0.0273995697,
0.2210046053,
-0.2460755706,
-0.1387401372,
-0.1334888041,
0.005333893,
0.0291791745,
0.3621101975,
-0.2343436331,
-0.0264552943,
-0.0387185365,
0.1095170379,
0.1662642062,
0.0689311698,
0.5346989632,
0.2771005929,
0.2051318735,
-0.0717219859,
-0.2466054261,
0.2251664698,
0.5450428128,
0.4341472685,
0.3294352293,
0.1463757604,
0.2293755114,
0.2922334373,
-0.0532068685,
0.2080353051,
0.3000761569,
0.0098279174,
0.4775601923,
0.0632712841,
-0.1031334102,
-0.1840750724,
0.0382703468,
-0.110667035,
0.0486712568,
0.1554291695,
-0.2069405317,
-0.1123564467,
0.0205845907,
-0.0134230927,
-0.3826886714,
-0.0242396034,
0.1152459234,
0.295492053,
0.1438841522,
-0.0172860548,
-0.1485429406,
-0.2984662056,
0.2888367176,
0.325339973,
-0.0598452911,
0.379308641,
-0.295244813,
0.0804323703,
-0.3175660074,
-0.3601868153,
-0.110040158,
0.3189073205,
-0.0847238302,
-0.0077189356,
0.1835469007,
-0.1647549272,
0.1903630495,
-0.0420522243,
0.2470879257,
0.1847953796,
-0.0830093101,
-0.5184536576,
-0.0562219471,
0.0118576363,
0.3997412324,
0.3311589658,
-0.0270571634,
-0.0743682906,
-0.0558984689,
0.0720927864,
0.1218215004,
-0.1918268204,
-0.1297150999,
-0.2415908277,
-0.4315780401,
-0.2292883098,
0.0169672873,
0.0482568443,
0.2081788182,
0.2774985135,
-0.1133427918,
0.1775355339,
-0.1554812044,
0.0177860521,
-0.2938523293,
0.2565215528,
0.0394324437,
0.115561083,
-0.2399401367,
0.1914774626,
-0.2706721425,
0.3414495289,
-0.0520603769,
0.2024667263,
-0.2684368193,
-0.1000603586,
0.3315455914,
0.5321871042,
-0.0422710292,
-0.0014326945,
-0.0665894747,
-0.4235523939,
0.0184181109,
0.2030993104,
-0.1293684244,
-0.2833481133,
0.0891487598,
-0.229177922,
0.2372551858,
0.1529624909,
-0.2294103503,
0.3731083572,
-0.135573253,
-0.4136406183,
0.1329635978,
0.2320308387,
1.218569994,
-0.1927908808,
0.2424694598,
0.1323702335,
-0.3454880118,
0.2222171873,
-0.2093655765,
0.194203645,
-0.4596814513,
0.1864033937,
0.0184309781,
-0.3342028856,
-0.0501217917,
0.0156163508,
0.0858068615,
0.2734681368,
-0.0548892021,
-0.2729022503,
0.1304851621,
0.3938910365,
-0.1766328067,
-0.2877826691,
-0.2902982831,
-0.0144393556,
-0.0708494633,
0.5071827173,
-0.1328956038,
-0.1240095645,
0.1772802025,
-0.2180299163,
0.0646016449,
0.1719402671,
-0.4313862324,
0.0417571291,
-0.0812024027,
-0.0232604221,
-0.0917146206,
0.1257972717,
-0.0543600284,
0.6254203916,
0.0450862646,
-0.2213442326,
-0.0199677721,
-0.3359383643,
0.2360231876,
0.4263319075,
0.0375683457,
-0.0540940128,
0.0606903881,
-0.4100455642,
-0.0237480719,
-0.3558062017,
0.2627712488,
-0.0814678892,
0.2663204372,
0.1987921894,
-0.3775646687,
-0.0022956096,
-0.0742052048,
-0.0449430346,
0.0188488998,
0.1913132817,
-0.0827543736,
0.142454654,
-0.1317515671,
-0.3155429959,
-0.0179330148,
0.4924046099,
-0.0810654089,
-0.0276897587,
0.2180951834,
-0.2610343099,
-0.2072010189,
-0.1047285348,
-0.1250658929,
-0.437827915,
-0.0794601217,
-0.0168341473,
-0.1192843691,
-0.3126403689,
-0.3789980412,
0.2657550275,
0.2184224427,
0.0590274148,
-0.3117889762,
-0.3809974492,
-0.2000358105,
0.0596003756,
0.2855504751,
0.0585903451,
0.2201536596,
0.0488655977,
0.0752918646,
-0.1427544951,
-0.1186264679,
-0.0432412662,
-0.1961244345,
0.0132063795,
0.1663920283,
-0.0661024675,
0.0380036868,
0.092119433,
0.0476539135,
0.1081352904,
0.0495402105,
-0.0159849785,
-0.2462608665,
0.2203298807,
0.1938227415,
-0.1061010137,
0.3261568248,
-0.5064131021,
-0.0603541732,
-0.0794408172,
-0.2599948645,
0.3589473367,
0.1483558714,
-0.0642249882,
0.1416976154,
0.2805728912,
-0.0697368607,
0.3821465671,
-0.3312805891,
-0.1311708242,
0.0909303427,
0.1834526658,
0.1079594344,
0.0135393664,
-0.2332734019,
-0.2288062423,
-0.1173922345,
0.0736959353,
0.1737177074,
-0.0846079439,
-0.1903155446,
0.2157861292,
-0.0114624798,
0.4147981703,
0.0116168223,
-0.1680880785,
0.5315957665,
-0.020754328,
-0.0809613392,
-0.0303736776,
0.1509319842,
-0.1358060837,
-0.0864589736,
0.4999101758,
0.2174675763,
-0.1777749062,
0.2785035372,
-0.0077348538,
0.2855088711,
-0.0789787769,
0.0152459666,
0.0686135143,
-0.0156891197,
0.0382690504,
-0.1985383481,
0.0799783617,
0.1973945349,
-0.0911451131,
-0.0596215762,
-0.1054478362,
-0.0969787687,
-0.1131463796,
0.0123386253,
-0.0061767101,
-0.1821182668,
0.0117926076,
0.5577601194,
0.2451311201,
0.1157416999,
0.2604151368,
-0.4050932825,
0.0172736757,
0.0973193794,
0.1587706506,
-0.1763162762,
-0.0776243508,
-0.1327061355,
0.0050085038,
-0.3201334476,
0.0667281598,
-0.0235008188,
-0.3816546202,
0.3042974472,
0.1711450219,
0.215649277,
-0.3642811179,
0.0291382559,
0.1577450931,
0.0846865997,
-0.2103461325,
0.1931723058,
-0.4692876339,
0.3293416202,
-0.3130380511,
-0.2359607816,
0.1507844925,
0.3931719959,
-0.0258136429,
-0.1243560091,
0.0420359522,
0.1673624963,
-0.2569440007,
-0.0492859483,
-0.0278104953,
0.6900132895,
0.2152641714,
-0.0723850653,
-0.032329388,
-0.3037872314,
0.1399617344,
-0.2608659565,
0.0227422193,
0.6522597075,
0.3411043584,
-0.0329661295,
-0.1084376425,
0.1946371645,
-0.1974801421,
0.0580815673,
0.1874317527,
0.2687219083,
0.1433307081,
0.0451929346,
0.0345934853,
0.3350119293,
0.333910197,
0.0289561227,
0.1249740794,
0.0653204918,
-0.0672076195,
-0.5001580715,
0.2113158107,
-0.2728497684,
-0.3391138911,
-0.0894575715,
-0.1681344807,
-0.0510368831,
0.0347783305,
-0.2608061731,
0.2128749192,
0.1028027683,
-0.1845180988,
0.1697339714,
-0.0235912409,
0.1486403197,
-0.0666165501,
0.0496674255,
-0.3975169957,
-0.0580951385,
-0.1013714224,
-0.0718593672,
-0.1895341426,
-0.1180964336,
-0.2055094242,
0.2318047285,
0.149564594,
-0.0137111014,
0.1201321781,
-0.0202560127,
-0.4006498158,
-0.326546371,
0.0394968875,
-0.2855168581,
-0.0721663982,
-0.1215194017,
0.0657677799,
-0.2892020643,
-0.0710093454,
-0.6805310845,
0.3385009468,
-0.0133687183,
0.0413132608,
0.3364285231,
0.0010189451,
-0.2102989256,
0.3626985252,
0.1839003414,
0.0758147463,
-0.0131364726,
0.0722304955,
-0.3415881097,
-0.2063353956,
0.4868860543,
-0.0624350011,
-0.102550678,
0.175573796,
0.312661469,
-0.2155446112,
-0.0865193531,
-0.3908150494,
-0.308558017,
0.1468040347,
-0.1520266384,
0.058739692,
0.0959873646,
-0.3098626435,
0.0675997138,
-0.0619197711,
0.4163596928,
0.0033588372,
-0.21353589,
0.0681846514,
-0.3184006214
] |
https://github.com/huggingface/datasets/issues/554 | nlp downloads to its module path | > If it's fine for you then this is the recommended way to solve this issue.
I don't want to use a virtual environment, because Nix is fully reproducible, and virtual environments are not. And I am the maintainer of the `transformers` package in nixpkgs, so sooner or later I will have to package `nlp`, since it is becoming a dependency of `transformers` ;). | I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`? | 63 | nlp downloads to its module path
I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`?
> If it's fine for you then this is the recommended way to solve this issue.
I don't want to use a virtual environment, because Nix is fully reproducible, and virtual environments are not. And I am the maintainer of the `transformers` package in nixpkgs, so sooner or later I will have to package `nlp`, since it is becoming a dependency of `transformers` ;). | [
0.046116069,
0.2342475653,
0.1267918348,
0.0296459291,
0.2665548921,
-0.1070265621,
-0.0495608896,
0.0400402918,
0.0180515572,
-0.0675463527,
0.1628864706,
0.9074664712,
-0.3907275796,
-0.0176895298,
0.408822298,
0.3253488541,
-0.1871854812,
-0.0095228739,
-0.5038885474,
-0.0106997639,
-0.1629399657,
0.3172167838,
0.092808634,
0.2191580087,
0.1168669015,
-0.2506914139,
0.1828441024,
0.2161523253,
-0.1610961109,
-0.3452347517,
0.2223353386,
0.0172992647,
0.0685412064,
0.2536880076,
-0.0001222753,
-0.1021163464,
0.0917156041,
-0.2052919865,
-0.5588752031,
-0.4225789905,
-0.0087844804,
-0.0983678848,
0.2872700095,
-0.2772431672,
0.4104698896,
0.4269274473,
0.2175242156,
-0.0110210851,
0.103275165,
0.2261539102,
0.1559866965,
0.2630304694,
-0.0832817107,
0.3464417458,
0.2658578455,
0.2363733798,
-0.1880696118,
0.4692552686,
-0.1956578195,
-0.1145171076,
0.1284812242,
0.0002660267,
-0.1881885231,
-0.2027672529,
0.4447343051,
0.049065765,
0.547070682,
-0.1268077791,
-0.1714591086,
-0.0210754741,
0.2907857001,
-0.645606637,
-0.2828426957,
-0.3490971923,
0.0826935247,
0.0141361132,
0.4553362727,
0.3532888591,
-0.2175493985,
0.1618500948,
0.1237480789,
-0.4460405707,
-0.1745376289,
0.3221801221,
0.1006395221,
0.5085071325,
0.0016214624,
0.0043662358,
0.3273638189,
0.2121273875,
0.3102003634,
0.1156195179,
0.0912708938,
0.2791389227,
0.2880521715,
-0.0127188489,
-0.1458795518,
0.1641430557,
-0.0892469138,
-0.1161926612,
0.0068754032,
-0.109433651,
-0.1109068543,
0.168402046,
0.120175533,
-0.2382376939,
-0.0224938951,
0.1595932841,
0.182862699,
0.1355991662,
0.2601260841,
0.0825099796,
-0.0887879878,
-0.0337353684,
-0.1104378924,
-0.1315761656,
0.0319397375,
-0.1234216094,
-0.0872783884,
-0.2416336536,
-0.1768741608,
0.3003259301,
0.1871355027,
-0.0008028634,
-0.0788414329,
0.1259500831,
0.2351888716,
0.0070647206,
-0.2724526525,
0.4379550517,
0.0300906226,
0.156027168,
-0.2845475376,
0.0785731599,
0.1313216537,
0.0799526125,
0.3415977359,
-0.2330344319,
-0.142031461,
-0.1482820362,
-0.005402144,
0.2611581385,
-0.1751808226,
0.3640357554,
0.0621220767,
0.4477259517,
-0.2313424647,
-0.1666176021,
-0.4070671797,
0.1336166263,
-0.2844531834,
-0.4027180672,
0.1284138858,
0.0695235133,
-0.2449091077,
-0.1395528913,
0.1998023242,
-0.6429064274,
0.2645702362,
-0.179700613,
0.2135899365,
-0.1824083477,
-0.1748670042,
-0.300873369,
-0.0477913693,
0.4714627862,
0.2294433117,
-0.2523702979,
-0.246240586,
0.0334515236,
0.3789208829,
0.1632091552,
-0.1067513302,
0.1069482118,
-0.2686390281,
-0.291962266,
0.904715836,
-0.3906838596,
-0.3091708124,
0.2915873826,
-0.6707191467,
-0.0479655825,
0.1383056939,
0.1629866362,
-0.4714466333,
-0.0274624173,
0.072186783,
0.7123206854,
0.2875203192,
0.1052504331,
-0.262753129,
-0.3227462769,
0.2228294164,
-0.1149720252,
-0.0389802903,
-0.0065272786,
-0.1487010717,
0.597994864,
0.5738627911,
0.1660399884,
-0.0016500317,
0.3604933619,
0.537553072,
0.1958606988,
-0.2791606188,
-0.2807011902,
0.0373024531,
-0.0844610482,
-0.6614476442,
0.1078973785,
-0.5330188274,
-0.1834051013,
-0.1122300625,
0.1228865087,
0.1824478656,
0.0702089667,
-0.0002823323,
0.2549105585,
0.002228044,
0.2083616555,
-0.0960030556,
0.4704716206,
-0.1031579226,
0.1595538557,
-0.1962209344,
0.027526848,
-0.1773790717,
-0.1482793093,
-0.0782646388,
0.4618582726,
0.1250404119,
-0.1315958649,
0.0474953726,
0.4746610224,
-0.2096367031,
0.0889923498,
-0.2957384586,
0.0512467474,
0.2695709169,
0.0370292142,
0.0307874624,
0.2559651136,
-0.0543592162,
0.2016522139,
-0.2219353914,
0.0880997628,
-0.4998499453,
0.0931910723,
0.1127003357,
0.2424835116,
0.0632279143,
0.0523399487,
-0.1859666705,
-0.1516281217,
0.2849550545,
0.1004782915,
0.288539499,
0.3445745707,
0.161163941,
-0.0273501091,
0.3952872157,
0.1889088899,
0.1825788915,
0.0537535362,
0.0252609011,
-0.0148767084,
0.0243097078,
0.3026710749,
0.1458865106,
0.0798740685,
0.1653844416,
0.1257156432,
-0.2704929113,
-0.1986795366,
0.2159198821,
-0.020898886,
0.1586563885,
0.1924607158,
0.1183891594,
-0.2064064145,
-0.1669660807,
0.1002589092,
0.062779054,
0.1682896763,
-0.3316837549,
0.0543965772,
-0.5954182744,
-0.3879923224,
-0.7191824317,
0.0387099087,
-0.5594834089,
-0.22673136,
-0.149315387,
0.2969734371,
0.0152958333,
-0.098868534,
-0.0740791857,
-0.2233137041,
-0.4157320857,
-0.6021324992,
0.0733014047,
0.0929684341,
-0.2873402834,
-0.1341561079,
0.1705034375,
0.196279645,
0.161069721,
-0.1474250257,
-0.397949338,
0.0391922295,
-0.1772091687,
0.0086358152,
0.319134891,
0.1386233717,
0.2791623771,
-0.1118232831,
0.2117311358,
-0.1992690265,
0.0509130582,
-0.2132971436,
0.111740455,
-0.0033850633,
0.1127618924,
0.0831768811,
-0.3443768919,
-0.1749545336,
-0.2530122399,
-0.5123716593,
0.2787905335,
0.2166330516,
0.3217906952,
0.7720034719,
-0.1027075797,
0.0091883242,
-0.2122786194,
0.1872673333,
0.2074651718,
0.0008654371,
0.2596406043,
-0.1210705563,
-0.0713395774,
-0.0554442294,
0.0765061378,
0.0564995594,
0.4624172151,
-0.2822107077,
-0.0332991853,
0.0441932455,
0.0560682863,
0.0114328591,
0.1246182024,
0.457680285,
0.2989198864,
0.1324489117,
-0.0288430229,
-0.2807245553,
0.2095328718,
0.5572180748,
0.3679830432,
0.3192530274,
0.205629319,
0.1678476334,
0.3601109087,
-0.042243626,
0.2414948493,
0.2436209768,
0.0004380438,
0.5296663046,
0.180127427,
-0.1364840865,
-0.3310608864,
0.1352424622,
-0.0557459407,
0.0593941063,
0.1209199876,
-0.107251212,
-0.0436490364,
-0.1165578663,
-0.0315995701,
-0.4213883877,
0.0777136013,
-0.1087859794,
0.3773984611,
0.154432267,
0.0712347478,
-0.2669484019,
-0.2816902101,
0.295891434,
0.3518321216,
-0.0145985931,
0.3535120487,
-0.2625627518,
0.0431505591,
-0.3250791728,
-0.3262543976,
-0.0907095596,
0.4920758605,
-0.1388786137,
0.0124440715,
0.205597192,
-0.1642446816,
0.3126723468,
0.0556152239,
0.272423178,
0.1352793574,
-0.110595271,
-0.5586287379,
-0.016193606,
0.0890266001,
0.4203822017,
0.1868071556,
0.0291640814,
-0.0691928044,
-0.1596042067,
0.0529489294,
0.0410506539,
-0.189956218,
-0.2359450161,
-0.2774137557,
-0.4206418395,
-0.2196001709,
-0.0223430507,
0.0532165021,
0.1078632921,
0.3100391626,
-0.0294072032,
0.1922952086,
-0.0837135613,
-0.0561041795,
-0.2963165641,
0.1525682658,
-0.0145511776,
0.0554604232,
-0.0099012488,
0.0194990523,
-0.2392359525,
0.3431770802,
-0.0608597063,
0.1602143496,
-0.3263556659,
-0.0778776556,
0.2681139708,
0.51323843,
-0.017317811,
0.0941149592,
-0.0443653837,
-0.4583926201,
0.0289777666,
0.0902791917,
-0.0448995307,
-0.3154546618,
0.2143295258,
-0.3476986587,
0.1942737252,
0.1721809059,
-0.2360341847,
0.2812295258,
-0.1251882762,
-0.4402067661,
0.2245354056,
0.2467523217,
1.0411496162,
-0.2406394631,
0.3079275787,
0.0620333478,
-0.424690485,
0.3527471721,
-0.2589765191,
0.2856740057,
-0.4480980039,
0.2983216643,
0.0141479373,
-0.3479579091,
-0.144400984,
-0.0763518363,
0.1431494951,
0.2755274177,
0.1016655862,
-0.0920100361,
0.1635090709,
0.3466674685,
-0.1620667279,
-0.3068077862,
-0.4199128151,
0.0050216466,
-0.1081734598,
0.4152197838,
-0.0962902755,
-0.2268124074,
0.1657084972,
-0.1818030775,
0.0427781492,
0.1374815255,
-0.1462483406,
-0.0159095712,
-0.0259536151,
0.0052117556,
0.0082681775,
0.179696694,
-0.1161815971,
0.5233231187,
0.0573030412,
-0.2882833183,
-0.2484290898,
-0.2790074348,
0.0953649282,
0.4519753754,
-0.0233348757,
-0.0182634555,
0.0098475665,
-0.3462280035,
0.0162904225,
-0.2407744527,
0.2226671129,
-0.0171748511,
0.2993597686,
0.1988863051,
-0.1895718426,
-0.0237379596,
0.0130642802,
-0.0495460816,
0.0436640643,
0.1888686121,
-0.157639578,
0.1557564586,
-0.13050583,
-0.3345092833,
0.0824718177,
0.5543605089,
-0.0746296868,
0.1600213945,
0.3207659721,
-0.3217405677,
-0.1697057784,
-0.1431186199,
-0.0554565191,
-0.6015008688,
-0.1151206642,
-0.0963398963,
-0.1213521063,
-0.3210610151,
-0.2910472453,
-0.0775585994,
0.1969452649,
-0.0023173075,
-0.3674697876,
-0.3863798082,
-0.1825270653,
0.0739892721,
0.1213883162,
0.0162809864,
0.2047554851,
-0.0354262777,
0.0652361363,
-0.1276286989,
-0.14982602,
-0.0104738474,
-0.2554085255,
0.0443133079,
0.0218518339,
-0.1618615687,
-0.0238603316,
0.0383477993,
0.0634286702,
0.1393600702,
-0.0296470337,
-0.0312708728,
-0.2038728297,
0.2177937329,
0.1383655071,
-0.0901475176,
0.2828360796,
-0.4831415415,
-0.075783655,
-0.0179248899,
-0.2721806765,
0.3135300279,
0.192670837,
-0.0050090477,
0.1831314117,
0.2811109424,
-0.1277562529,
0.4007118046,
-0.4252769649,
-0.1594094038,
0.0407716222,
0.2038411796,
-0.0085452572,
0.0371546894,
-0.3193020225,
-0.2105538696,
0.0072943754,
0.1087434888,
0.1364181936,
-0.1161946058,
-0.1163049042,
0.2203352451,
-0.0702841654,
0.4635686874,
0.0348292515,
-0.1621771157,
0.5252122879,
0.0169342998,
-0.0106112175,
-0.0934464484,
0.1800397635,
-0.22951895,
-0.0997847691,
0.489600718,
0.2366862744,
-0.2012722194,
0.2985927463,
0.0290997997,
0.3146237135,
0.0366921052,
-0.0767068714,
0.1179005057,
-0.0764639527,
-0.010318581,
-0.0833896995,
0.1200808883,
0.2478791624,
0.1602068096,
0.1114097089,
-0.1465800256,
-0.0346581191,
-0.1222593933,
-0.0823650882,
-0.0275231153,
-0.3606383204,
-0.0261694267,
0.65040344,
0.2208254784,
0.1487554312,
0.0993582606,
-0.3846205473,
0.0283916183,
0.2322046906,
0.1140446216,
-0.1544250548,
-0.0870945603,
-0.1269280612,
0.0022707433,
-0.3815243542,
0.068217054,
-0.0539317392,
-0.4044136703,
0.2946985662,
0.1397664547,
0.1493923068,
-0.4419018924,
0.093709819,
0.2336538732,
0.0106885321,
-0.2599323392,
0.1499803513,
-0.421528697,
0.311658442,
-0.2540074587,
-0.2645659149,
0.18342942,
0.3263543248,
-0.1018267423,
-0.1180416718,
0.0643481612,
0.1971222311,
-0.2784438133,
-0.0048411675,
-0.0827914029,
0.7673552036,
0.1852515191,
-0.0460717604,
-0.0333992578,
-0.3045262098,
0.18526344,
-0.2388022244,
0.0338931344,
0.695524931,
0.2933858335,
-0.0328545943,
-0.111007899,
0.191816628,
-0.2263253629,
0.0878710523,
0.1948157996,
0.18795982,
0.1277548075,
0.0391899683,
0.047143437,
0.2659831941,
0.4307883978,
0.0303581581,
0.1286848187,
0.0742658898,
0.0063470304,
-0.5193610787,
0.1597164869,
-0.3004152477,
-0.2859911919,
-0.0224701725,
-0.1499350667,
0.0279867873,
0.0471541435,
-0.2507306933,
0.1889442205,
0.1043843701,
-0.0416015536,
0.1218469962,
-0.0788701922,
0.1488675624,
-0.0447727889,
0.0158180967,
-0.4161305428,
-0.0405766368,
-0.1405722201,
-0.0442639478,
-0.1302484125,
-0.1614126712,
-0.1218258888,
0.195751369,
0.149196893,
-0.0281866062,
0.0009699501,
0.0020339787,
-0.4676627219,
-0.3582862616,
0.0724991262,
-0.3128369451,
-0.1168835908,
-0.0736810714,
0.0873698145,
-0.342038095,
-0.0710327476,
-0.6486017704,
0.287296176,
-0.0913354307,
0.2416807562,
0.2341254056,
0.0768069401,
-0.2380165756,
0.2777352929,
0.1358682364,
0.1072092801,
0.0465158373,
0.0664104223,
-0.4112356305,
-0.1507119983,
0.4748026133,
-0.0082357265,
-0.1946942657,
0.1906995624,
0.2914387882,
-0.0253206864,
-0.0612464771,
-0.3841491938,
-0.2049589008,
0.2253570557,
-0.1689433455,
0.1071543992,
0.0274887085,
-0.318112731,
0.0742193684,
0.0069107041,
0.3483942449,
-0.0557657667,
-0.1817989349,
0.0871191323,
-0.3096681535
] |
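The record above floats overriding `force_local_path` of `prepare_module` as a possible workaround. A minimal sketch of what that call could look like follows; the function name, the `dataset` argument, and the returned pair come from the traceback shown there, but the exact behaviour of `force_local_path` (and whether it alone is enough on a read-only install) is an assumption, not something verified against the library.

```python
from nlp.load import prepare_module

# Assumed user-writable target for the dataset script (example path only).
writable_target = "/tmp/nlp_datasets/squad"

# Sketch only: prepare the 'squad' dataset script under a writable location
# instead of the read-only nlp module path. prepare_module returns the
# module path and a hash, as in the traceback above.
module_path, module_hash = prepare_module(
    "squad",
    dataset=True,
    force_local_path=writable_target,
)
```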
https://github.com/huggingface/datasets/issues/554 | nlp downloads to its module path | Ok interesting. We could have another check to see if it's possible to download and import the datasets script at a location other than the module path. I think this would probably involve tweaking the Python system path dynamically.
I don't know anything about Nix, so if you want to give this a try yourself, we can guide you, or you can give us more information on your general project and how this works.
Regarding `nlp` and `transformers`, we are not sure `nlp` will become a required dependency for `transformers`. It will probably be used a lot in the examples, but I don't think it will be a required dependency for the main package, since we try to keep it as light as possible in terms of dependencies.
Happy to help you make all these things work better for your use-case | I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`? | 141 | nlp downloads to its module path
I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`?
Ok interesting. We could have another check to see if it's possible to download and import the datasets script at a location other than the module path. I think this would probably involve tweaking the Python system path dynamically.
I don't know anything about Nix, so if you want to give this a try yourself, we can guide you, or you can give us more information on your general project and how this works.
Regarding `nlp` and `transformers`, we are not sure `nlp` will become a required dependency for `transformers`. It will probably be used a lot in the examples, but I don't think it will be a required dependency for the main package, since we try to keep it as light as possible in terms of dependencies.
Happy to help you make all these things work better for your use-case | [
-0.0510633402,
0.3298893869,
0.1181987673,
0.0430238582,
0.2496244609,
-0.1069277227,
-0.0243600346,
0.0844146386,
0.0435672328,
-0.1043165475,
0.1716201305,
0.8944585323,
-0.3751600385,
0.0496555045,
0.4546217024,
0.3453480005,
-0.2024182677,
-0.0113469251,
-0.5702017546,
-0.0118617713,
-0.1689659059,
0.3025685251,
0.0895463824,
0.2100647688,
0.0889092386,
-0.238259539,
0.1604039073,
0.2349801809,
-0.1778025925,
-0.3359405994,
0.2526457608,
0.0436159112,
0.1131191552,
0.3050607741,
-0.000122344,
-0.1219699606,
0.1226147562,
-0.2054618001,
-0.5831146836,
-0.4315142632,
-0.0276231617,
-0.1339646876,
0.2938423455,
-0.3086283803,
0.4444719553,
0.4372785985,
0.2576223612,
-0.0503476188,
0.1188962832,
0.2702377141,
0.1533282697,
0.3109491467,
-0.1096136272,
0.2995908856,
0.2616928518,
0.2531506717,
-0.1965017617,
0.3940302134,
-0.1449603587,
-0.1727854609,
0.1455774903,
-0.0195417311,
-0.1950077564,
-0.2205272913,
0.4399978518,
0.0483361818,
0.4507362247,
-0.1525627375,
-0.146465078,
-0.0134352818,
0.3437532485,
-0.6811643839,
-0.3214659095,
-0.3661264479,
0.1135827973,
0.010540776,
0.4246501327,
0.3493975401,
-0.2088426799,
0.1319355965,
0.0785172582,
-0.4541172087,
-0.1879278868,
0.3013217449,
0.0854254812,
0.4778109193,
0.013261456,
0.0066709705,
0.2882021368,
0.2142532766,
0.3760665059,
0.0814089328,
0.1321114004,
0.2844474316,
0.2851186395,
-0.0205892771,
-0.1099530905,
0.1793449372,
-0.0300104022,
-0.0847233608,
0.0232696645,
-0.1208943129,
-0.149371326,
0.1514127254,
0.2035764009,
-0.2474545836,
0.008623559,
0.1453721821,
0.199593544,
0.1786697805,
0.2614901364,
0.0819997117,
-0.1421169043,
-0.0424143597,
-0.1242886633,
-0.1236253381,
0.0732089877,
-0.1548984647,
-0.1208457872,
-0.2121137083,
-0.1051157415,
0.2327279747,
0.1520647705,
0.0448908843,
-0.0963122547,
0.1335331202,
0.2258944213,
0.0212668628,
-0.2801139355,
0.3634818196,
0.0146193653,
0.1916172206,
-0.285923034,
0.0753070489,
0.1577266157,
0.1072306708,
0.3415999115,
-0.2339809984,
-0.1381296515,
-0.1220097691,
-0.0012775287,
0.2555346191,
-0.1804873347,
0.3423956335,
0.1419741213,
0.439419657,
-0.255059123,
-0.1631758511,
-0.4166957736,
0.1174448058,
-0.2558012009,
-0.3332112134,
0.1345807165,
0.0606481507,
-0.297681123,
-0.1739301533,
0.1332337856,
-0.6403433084,
0.2606747448,
-0.2005108297,
0.1565440744,
-0.2298748642,
-0.1998522282,
-0.2740627527,
-0.0539590158,
0.553596437,
0.1542537361,
-0.2462254316,
-0.2222321332,
0.0164716803,
0.3207408786,
0.1756647527,
-0.1314172596,
0.1409682184,
-0.2844080627,
-0.2637609839,
0.9092648029,
-0.438229233,
-0.3080138862,
0.3058412969,
-0.6308795214,
-0.0351979025,
0.1344605386,
0.1774562001,
-0.4078785181,
-0.0254899971,
0.0482033007,
0.7276786566,
0.2470006794,
0.0803027153,
-0.2167286426,
-0.3240263462,
0.2392051518,
-0.0756336674,
-0.0015353691,
-0.0259256065,
-0.1296989024,
0.5696966052,
0.5710896254,
0.1803893894,
0.0185483173,
0.3647147715,
0.4654218256,
0.1909323037,
-0.2792431116,
-0.3242797256,
-0.0685012937,
-0.0869884863,
-0.6172343493,
0.0831272006,
-0.5409405828,
-0.1930329055,
-0.1252245605,
0.1388613582,
0.1718780547,
0.0744482949,
-0.0156724937,
0.2380428165,
-0.0023275837,
0.2084996402,
-0.1346631348,
0.4996418059,
-0.0731389895,
0.1621952802,
-0.2590226233,
0.058038421,
-0.1896460354,
-0.1445164531,
-0.011588037,
0.4375536144,
0.1188166738,
-0.1396407932,
0.0758900121,
0.4306361079,
-0.18022421,
0.0920957774,
-0.229440406,
0.0527637154,
0.2509596646,
0.061194934,
0.0663018525,
0.2502961755,
-0.0160636138,
0.176063925,
-0.2467819303,
0.0931271091,
-0.5122022033,
0.1198065951,
0.1599705517,
0.2397342622,
0.0859845802,
0.0437983721,
-0.1961194873,
-0.1598123908,
0.2977915704,
0.130141601,
0.2628801167,
0.3170698285,
0.1482846886,
0.0396015942,
0.4440465868,
0.1459557712,
0.1853349507,
0.0026252773,
-0.0029063374,
0.0064197853,
0.0199257117,
0.3390772939,
0.1651230454,
0.0997666493,
0.1520308703,
0.1469428092,
-0.27741763,
-0.21398592,
0.1724487543,
-0.0141219832,
0.1878109574,
0.1951906979,
0.0787310824,
-0.1810028553,
-0.181981504,
0.0489850044,
0.0554882288,
0.1607275158,
-0.3464356065,
0.0695947781,
-0.6128370166,
-0.4567551613,
-0.7045783401,
0.1096615195,
-0.5078395605,
-0.2648010552,
-0.1699566692,
0.2932732105,
0.0135285631,
-0.1058644131,
-0.1143264621,
-0.1898261458,
-0.4194675684,
-0.5510391593,
0.0548676848,
0.054934077,
-0.2664962709,
-0.1325834095,
0.2395806164,
0.2251582146,
0.1956442147,
-0.1559056342,
-0.3618601859,
0.0104886219,
-0.1625342816,
0.0291623939,
0.2996200621,
0.1283691376,
0.2688086927,
-0.0790973753,
0.2188512534,
-0.1453213096,
0.0332650244,
-0.2137674391,
0.075983271,
-0.0186205134,
0.1018567532,
-0.0267566759,
-0.3278630078,
-0.2171004862,
-0.2940073311,
-0.5226595998,
0.2773311734,
0.2654494941,
0.3282834589,
0.7198588848,
-0.1003149003,
0.0236580074,
-0.2349623293,
0.1796554327,
0.1849346906,
0.0104819164,
0.3286429048,
-0.1557528377,
-0.1278235018,
-0.0141997486,
0.0795354396,
0.0719749033,
0.4339600801,
-0.3098497391,
-0.0570610054,
0.0136777069,
0.0851120204,
0.0483783707,
0.1605342478,
0.4640448391,
0.2874493003,
0.1406029761,
-0.0581678599,
-0.2852256298,
0.1471215487,
0.5211665034,
0.3329202235,
0.2887784541,
0.2531728745,
0.1671908349,
0.3276451826,
0.0030273572,
0.2196825743,
0.2446997017,
-0.0253734682,
0.5758996606,
0.2007049024,
-0.1775512248,
-0.3279164135,
0.1077799574,
-0.0169587433,
0.056640245,
0.08138448,
-0.0753143057,
-0.073992312,
-0.1050529331,
-0.0269178972,
-0.423756808,
0.0826269314,
-0.0962540582,
0.3312709332,
0.2238527983,
0.1235561594,
-0.2656934857,
-0.2641184926,
0.2756396532,
0.3231791556,
0.0153705515,
0.3640688658,
-0.3138629794,
0.0321954489,
-0.3797558546,
-0.3239935637,
-0.1088467389,
0.4844843149,
-0.1609845757,
0.030500941,
0.2107772976,
-0.1735965908,
0.3358689547,
-0.0021740235,
0.2514638305,
0.155941993,
-0.0984620079,
-0.6052797437,
-0.0064763203,
0.029450424,
0.4140464365,
0.2112329602,
0.0046945065,
-0.0695411488,
-0.1452971101,
0.0879954547,
0.0664717555,
-0.2027294636,
-0.2055309266,
-0.2619062662,
-0.4269974828,
-0.2829477787,
-0.0546451397,
0.0684246421,
0.0742828175,
0.2629223764,
-0.0338505246,
0.1958602667,
-0.0462717228,
-0.039327316,
-0.2717517316,
0.1334656775,
0.0085924082,
0.1045138687,
-0.03164915,
0.0204998814,
-0.1956699491,
0.4103148878,
-0.0548784025,
0.1217808202,
-0.3207539022,
-0.0974598825,
0.3002342582,
0.5241622925,
0.0014360063,
0.0281787924,
-0.045609504,
-0.4427624345,
0.0347296111,
0.0793784112,
-0.0407571457,
-0.302377522,
0.2004399747,
-0.4026098847,
0.1797270477,
0.1817047447,
-0.2110960484,
0.2459633201,
-0.1306909025,
-0.3887244463,
0.215353936,
0.1915545315,
1.0236369371,
-0.2396775037,
0.3599081337,
0.0731763244,
-0.3844379187,
0.4012411535,
-0.2609765828,
0.2450613678,
-0.4522325099,
0.2662662268,
-0.0015369654,
-0.3288940191,
-0.1334096044,
-0.0633258075,
0.133866936,
0.3303493857,
0.0987855792,
-0.0731310248,
0.156696558,
0.3976759613,
-0.1640678942,
-0.3028287888,
-0.3594768047,
0.0098462701,
-0.1150180325,
0.4567201138,
-0.0818569586,
-0.2235650718,
0.1479724199,
-0.1842610538,
0.0058867782,
0.2117664665,
-0.1513444483,
0.0323051848,
-0.0858149976,
0.0216812044,
0.0550216287,
0.233132422,
-0.0863150358,
0.5087726712,
0.0393290147,
-0.2616963089,
-0.2504292727,
-0.298161298,
0.1080785021,
0.4550250471,
0.0318251848,
-0.0254155658,
0.0077721477,
-0.3527023792,
0.06633389,
-0.2313762605,
0.1986923963,
0.0110733677,
0.2796638012,
0.1670224965,
-0.2244541198,
-0.0015969872,
0.0190634504,
-0.0449288413,
0.0379767343,
0.2209534943,
-0.1191038117,
0.1960102022,
-0.0633235425,
-0.3193190098,
0.0726590306,
0.5721095204,
-0.1113176942,
0.1076376289,
0.3150822818,
-0.2308073491,
-0.1910786033,
-0.1671550572,
-0.0601864159,
-0.5337633491,
-0.1317316592,
-0.1004975662,
-0.1209858507,
-0.2803729773,
-0.2866315544,
-0.1232301146,
0.1319796145,
-0.0445768982,
-0.3509905338,
-0.4225004613,
-0.1497666538,
0.1053398401,
0.1956517845,
-0.0248332154,
0.2506741881,
-0.0674876273,
0.0741260573,
-0.1372200251,
-0.1543278694,
-0.02142369,
-0.1949544698,
0.0507558808,
0.1204763949,
-0.1465535313,
-0.0282833353,
0.006455496,
0.0490756705,
0.1193576381,
-0.0334135033,
-0.0371216163,
-0.1996514648,
0.2175134569,
0.154966414,
-0.083743751,
0.2546330988,
-0.4744588137,
-0.088923946,
-0.0554785095,
-0.2485375404,
0.3140679598,
0.1804845929,
0.0298968926,
0.1788474768,
0.2484348267,
-0.1243827045,
0.4295687675,
-0.4073115587,
-0.1309646815,
0.0366648994,
0.2387415022,
0.0165812075,
0.0533032492,
-0.3374196291,
-0.1995885521,
0.0303790942,
0.1637501419,
0.1828976274,
-0.13556014,
-0.0794109777,
0.2315356135,
-0.0689691752,
0.5128160715,
-0.0072585717,
-0.1286970079,
0.5030200481,
0.0117334221,
-0.045893684,
-0.0972714573,
0.2089648247,
-0.1990009397,
-0.1058806032,
0.4660561085,
0.2210087776,
-0.1808511168,
0.3089651167,
0.0340251997,
0.3652095795,
0.0222476013,
-0.0381954163,
0.1044050306,
-0.0497451909,
-0.0288079455,
-0.1028468981,
0.1517294496,
0.2119783163,
0.2232205421,
0.076206252,
-0.1425382346,
-0.018712543,
-0.1013327241,
-0.0659231693,
-0.0458264127,
-0.3608074784,
-0.0269390494,
0.6276561022,
0.2042370439,
0.1547160149,
0.0871562958,
-0.342946887,
0.055411119,
0.1697135419,
0.1552277356,
-0.1366814971,
-0.0642932653,
-0.1878052354,
0.0336258262,
-0.3584003448,
0.095984444,
-0.0487387031,
-0.4258346856,
0.2607396245,
0.1607657373,
0.1171656102,
-0.4328007996,
0.1650246382,
0.1692381203,
-0.0207435898,
-0.238696754,
0.1550819427,
-0.4366210997,
0.3467750549,
-0.2618817985,
-0.2523039877,
0.2180019468,
0.3444643319,
-0.0700986534,
-0.0922984779,
0.0812104493,
0.1643690169,
-0.3064894974,
-0.0062732995,
-0.1060756445,
0.7569425702,
0.2321662903,
-0.035925176,
-0.0273308121,
-0.3327260017,
0.2151863575,
-0.2924383581,
0.0483578518,
0.6891188025,
0.2632905841,
-0.0477684028,
-0.1519180685,
0.1965536922,
-0.2658857703,
0.0746656209,
0.2499925196,
0.1845076531,
0.1303054541,
-0.0365407839,
0.0434276424,
0.2541339397,
0.4254163206,
0.0584160239,
0.1500280499,
0.0744768083,
0.0282470584,
-0.5763671994,
0.1385063827,
-0.3032062054,
-0.3131724,
-0.0199990347,
-0.1014503539,
-0.0216269083,
0.0585196018,
-0.2070566565,
0.1820064038,
0.1419277936,
-0.0399256758,
0.0783201158,
-0.099276945,
0.1515910029,
-0.0067263413,
0.0335789099,
-0.4286132455,
-0.0621461459,
-0.21827057,
-0.0557155386,
-0.1428427547,
-0.1603760123,
-0.1300808787,
0.1754812598,
0.1766573489,
-0.079376027,
0.0945206806,
-0.0011484995,
-0.4309162498,
-0.4071859717,
0.0395413116,
-0.3112214804,
-0.0475024134,
-0.0774855763,
0.0218865797,
-0.3503715396,
-0.0836385339,
-0.5981820226,
0.2577427328,
-0.108034566,
0.2534947991,
0.2370180339,
0.0655616,
-0.2741667926,
0.3343587816,
0.1108228415,
0.051928971,
0.008031534,
0.0687488839,
-0.3790955544,
-0.1683958471,
0.4932088852,
-0.0010689795,
-0.169028461,
0.1351237595,
0.2747317851,
-0.0612602308,
-0.1200228408,
-0.3803346753,
-0.1763636172,
0.1848459691,
-0.1816513091,
0.1024645492,
0.0878539234,
-0.3482233882,
0.0780845881,
-0.0038824081,
0.3832035065,
-0.075927496,
-0.238291353,
0.0604647622,
-0.2934118211
] |
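The record above mentions "tweaking the Python system path dynamically" as a way to let downloaded dataset scripts live outside the read-only package tree. A minimal illustration of that mechanism, assuming any user-writable directory for the scripts (the path below is only an example):

```python
import os
import sys

# A user-writable directory that can hold downloaded dataset scripts.
writable_modules = os.path.expanduser("~/.cache/nlp_modules")
os.makedirs(writable_modules, exist_ok=True)

# The "system path" tweak: once a dataset script (e.g. squad/squad.py) is
# placed under this directory, Python can import it without writing into
# the read-only site-packages tree.
sys.path.insert(0, writable_modules)
```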
https://github.com/huggingface/datasets/issues/554 | nlp downloads to its module path | @danieldk modules are now installed in a different location (by default in the cache directory of the library, in `~/.cache/huggingface/modules`). You can also change that using the environment variable `HF_MODULES_PATH`.
Feel free to play with this change from the master branch for now, and let us know if it sounds good for you :)
We plan to do a release in the next coming days | I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`? | 65 | nlp downloads to its module path
I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`?
@danieldk modules are now installed in a different location (by default in the cache directory of the library, in `~/.cache/huggingface/modules`). You can also change that using the environment variable `HF_MODULES_PATH`.
Feel free to play with this change from the master branch for now, and let us know if it sounds good for you :)
We plan to do a release in the next coming days | [
0.0594275147,
0.2854040563,
0.1192257404,
0.0074427472,
0.2506990433,
-0.1265824586,
-0.0366812237,
0.0295479409,
0.0179712847,
-0.1192385703,
0.1451895535,
0.9340403676,
-0.3485209942,
-0.0550443307,
0.4481516778,
0.3217201531,
-0.1729901284,
-0.0191879682,
-0.542478323,
-0.0087752417,
-0.1322143525,
0.343559742,
0.0899612308,
0.2194030583,
0.1086245775,
-0.2739526629,
0.1533592641,
0.183522597,
-0.1502883434,
-0.36144045,
0.220069766,
0.0330721363,
0.0396202691,
0.2460677028,
-0.0001222594,
-0.0723030716,
0.0618372858,
-0.1991174519,
-0.5908343792,
-0.4036875367,
0.0073661208,
-0.1110638529,
0.3317893744,
-0.2588050961,
0.4212763906,
0.4604953825,
0.2072208822,
-0.0632978603,
0.0779820234,
0.1990578771,
0.1536842287,
0.2625417411,
-0.1157120168,
0.3399080336,
0.2558582127,
0.2060862184,
-0.2066239417,
0.4231727421,
-0.1807649136,
-0.1173193753,
0.1137164533,
0.0151540004,
-0.1605420113,
-0.2232349366,
0.4218440652,
0.0965172797,
0.506522119,
-0.1041273475,
-0.1783573925,
-0.0576235726,
0.248817414,
-0.6838286519,
-0.3021194935,
-0.3956839442,
0.0583832972,
-0.0110458583,
0.4367426634,
0.2950362563,
-0.1969234198,
0.1277931333,
0.1174732968,
-0.4387531579,
-0.1516217589,
0.2879822552,
0.0643962324,
0.5356556773,
-0.0010864343,
-0.0201422051,
0.3124946356,
0.223344028,
0.3028104901,
0.1413411349,
0.1304738522,
0.2637380958,
0.3066149354,
-0.0090491101,
-0.1266915798,
0.1287929863,
-0.107670173,
-0.086409986,
0.0486151762,
-0.1003775299,
-0.1001465321,
0.1405427307,
0.1284126043,
-0.2314267606,
-0.0331276618,
0.1172136366,
0.1847463399,
0.1608625799,
0.237539351,
0.048755154,
-0.0829557031,
-0.0250670798,
-0.0955916122,
-0.1017730981,
0.0238331482,
-0.1320400536,
-0.1010761634,
-0.2369565219,
-0.1393792778,
0.2759145498,
0.1524430215,
0.0107963625,
-0.0849052519,
0.1406838298,
0.2397924066,
0.0124134645,
-0.3015606105,
0.4846945107,
0.0200684033,
0.1443137825,
-0.315993607,
0.0632193238,
0.1285135448,
0.0914972425,
0.363730669,
-0.2417957485,
-0.1056755483,
-0.1247903258,
-0.0109361708,
0.2845504582,
-0.1785973012,
0.3613258898,
0.0448549762,
0.42775172,
-0.2470032275,
-0.142370984,
-0.4103897214,
0.1094858944,
-0.265501827,
-0.3513542414,
0.1702570766,
0.0741486698,
-0.2385265976,
-0.1675655544,
0.2285507321,
-0.6515526772,
0.2796331048,
-0.1777371019,
0.1845385283,
-0.2027803063,
-0.1684631407,
-0.2977360189,
-0.0550171882,
0.4658381343,
0.2440616488,
-0.2764919996,
-0.2273944318,
0.0714985728,
0.3334228992,
0.1575911641,
-0.0960773304,
0.1057571918,
-0.2538481951,
-0.2903706729,
0.9286291003,
-0.4024107754,
-0.2970967889,
0.2756409049,
-0.694303751,
-0.0525970683,
0.1200877652,
0.1547678262,
-0.4531957507,
-0.0136170089,
0.077731818,
0.7065263987,
0.2650772631,
0.1286934912,
-0.254938364,
-0.3223256469,
0.2341713607,
-0.1441070139,
-0.0270408373,
-0.0166179352,
-0.1293729395,
0.6533603072,
0.5752850175,
0.1956339926,
0.0006466322,
0.3474602103,
0.5321193337,
0.2032573819,
-0.2729884982,
-0.2872486413,
0.0032282416,
-0.0935835391,
-0.675114274,
0.1077949107,
-0.5177381039,
-0.1886841059,
-0.1189369708,
0.1137022749,
0.174747631,
0.0378866456,
0.0076453313,
0.2549014986,
0.0228086039,
0.2261761129,
-0.1326500326,
0.4574881494,
-0.0661611706,
0.1869138181,
-0.1980348676,
0.0420324504,
-0.2060027272,
-0.1463299394,
-0.0704802722,
0.500362277,
0.1262577623,
-0.1380063742,
0.0808470324,
0.4534176588,
-0.2065955997,
0.0868908614,
-0.2848191261,
0.0214222223,
0.3208906949,
0.0965325013,
0.1042212173,
0.2300525308,
-0.0680667683,
0.1967656463,
-0.2024433911,
0.1004066914,
-0.4833720624,
0.0858843401,
0.1622383744,
0.2138824016,
0.0588475689,
0.0408265218,
-0.1509971619,
-0.1954517365,
0.2688578367,
0.1386808753,
0.2791649699,
0.3665447831,
0.1968340129,
0.0444832966,
0.3903587759,
0.2215690464,
0.1931091398,
0.0381871462,
0.0259095505,
-0.0182068534,
0.0148507524,
0.2840833366,
0.1803555191,
0.0792314485,
0.1897486448,
0.1379859,
-0.2874858677,
-0.2119709551,
0.2023923397,
-0.0441416763,
0.1804205477,
0.2209618688,
0.1378940493,
-0.1611261964,
-0.1813827902,
0.1015158817,
0.0782350376,
0.1337402314,
-0.2980786562,
0.0790634453,
-0.6222836971,
-0.3927907348,
-0.7182301879,
0.0430183932,
-0.548984766,
-0.2475867122,
-0.1140794605,
0.2927652001,
0.0120861754,
-0.1029174253,
-0.0963459089,
-0.1998204589,
-0.4088549018,
-0.5224941373,
0.0758769661,
0.0744950026,
-0.3028309941,
-0.121731773,
0.1361252964,
0.1972018778,
0.1763102859,
-0.1033911929,
-0.3834488988,
0.0270641074,
-0.2044579834,
0.0011425372,
0.3371043801,
0.1227915064,
0.2648777366,
-0.1386474818,
0.203850314,
-0.1997370124,
0.0460611247,
-0.2331741154,
0.0697177574,
-0.0082562938,
0.1032613292,
0.074944362,
-0.3153725863,
-0.2225982845,
-0.2827697396,
-0.4827473164,
0.297162801,
0.2415864766,
0.3336548209,
0.767028749,
-0.1446877718,
-0.0249776784,
-0.2470213771,
0.1891745031,
0.1809889972,
0.025269866,
0.2319225222,
-0.1236328557,
-0.0485380292,
-0.0837115869,
0.0633225515,
0.059876956,
0.479137063,
-0.2922407389,
-0.0563171171,
0.0676810443,
0.0439295322,
0.0678464919,
0.1174628362,
0.4729620814,
0.2859097421,
0.1283735037,
-0.0220057517,
-0.3026000857,
0.1876471788,
0.5915963054,
0.3701995611,
0.2775199115,
0.1701225042,
0.2223976254,
0.3195581734,
-0.0285747796,
0.2068916857,
0.2589375973,
0.0239454489,
0.5233564377,
0.176519841,
-0.1383139044,
-0.3046540022,
0.1189450175,
-0.0527246967,
0.0406037644,
0.1409619004,
-0.1012793854,
-0.024823904,
-0.0608642176,
-0.0413630195,
-0.4126470089,
0.0877082571,
-0.1293939203,
0.3402693272,
0.1934720278,
0.0796751007,
-0.2661870718,
-0.2592256963,
0.2928960025,
0.3117831945,
0.006999068,
0.3332321942,
-0.2916761041,
0.0138886273,
-0.3250702322,
-0.3397918046,
-0.0627193451,
0.4760495424,
-0.1372724771,
-0.0144195706,
0.2336530685,
-0.143197909,
0.2587842345,
0.0479316711,
0.231487453,
0.1666575372,
-0.1175596714,
-0.5710587502,
-0.0377744623,
0.0776695162,
0.4122166634,
0.1997946799,
0.0090892036,
-0.0728258193,
-0.1482962668,
0.0782655627,
0.0250604674,
-0.210338667,
-0.192157492,
-0.3132321835,
-0.4200138748,
-0.2545806766,
-0.0016289838,
0.0169380754,
0.0680179894,
0.2967764735,
-0.051097393,
0.2507218122,
-0.0551026389,
-0.0773961693,
-0.2792869508,
0.1545900106,
-0.0144263767,
0.0327638835,
-0.0014919047,
0.0513734147,
-0.2495862991,
0.3455180824,
-0.080132626,
0.1714082658,
-0.3153800368,
-0.0383838192,
0.260915041,
0.5330733657,
0.0268945321,
0.1530143321,
-0.0604953468,
-0.4698383808,
0.0383240059,
0.0581181571,
-0.0631177574,
-0.3346704245,
0.2662449479,
-0.3325277567,
0.179911539,
0.1377165616,
-0.2574651241,
0.2627463639,
-0.1769569516,
-0.4302586615,
0.2455015481,
0.2469619066,
0.9723243713,
-0.2307321131,
0.3222674727,
0.0736391395,
-0.4327773452,
0.3743034601,
-0.2826935053,
0.2952538133,
-0.4656217694,
0.3089419603,
0.0315965116,
-0.3489758968,
-0.1434809864,
-0.0949159265,
0.1160011142,
0.2775853872,
0.1108723581,
-0.0856861845,
0.154230237,
0.3288412988,
-0.166770041,
-0.3083037734,
-0.4076281786,
0.0087808371,
-0.0911316425,
0.48111552,
-0.1528787166,
-0.2010471225,
0.1455898434,
-0.1341897398,
0.0559143573,
0.1484690905,
-0.1410249025,
0.0270282254,
-0.0384529941,
-0.0294565484,
-0.0405972935,
0.1738118827,
-0.0472844578,
0.5029783249,
0.0930438936,
-0.2813098133,
-0.2325602621,
-0.281598717,
0.1130905747,
0.410628885,
-0.012892995,
-0.0304560214,
-0.0021414459,
-0.376860857,
0.0323992856,
-0.2656217813,
0.2018043697,
0.0022115372,
0.2960528433,
0.1907170117,
-0.13251324,
0.0046178885,
0.0488899499,
-0.0576300696,
0.0454693027,
0.2137836814,
-0.1503866762,
0.1730719358,
-0.0952474847,
-0.3525272012,
0.0890104026,
0.5636548996,
-0.0558687858,
0.1307430565,
0.2681157589,
-0.3183728158,
-0.1945942342,
-0.1448386312,
-0.1017144918,
-0.5875366926,
-0.0966898724,
-0.102417402,
-0.1494129151,
-0.3403918147,
-0.2876188457,
-0.1150793135,
0.190659374,
-0.0029872786,
-0.397616744,
-0.3932398558,
-0.1758128405,
0.1078059748,
0.116841495,
-0.0234045461,
0.2263485491,
-0.0421321541,
0.0578668565,
-0.123500593,
-0.1513016075,
0.0047079884,
-0.2057930529,
0.064661108,
0.0125344954,
-0.2309375703,
-0.0588081256,
0.0386310257,
0.0674866065,
0.165131256,
-0.0121786501,
-0.0329804607,
-0.1857110411,
0.2187646776,
0.1187469214,
-0.082100831,
0.2349190265,
-0.4461459517,
-0.0323821902,
-0.0367246494,
-0.2283815593,
0.2963554859,
0.2207996547,
-0.0084143952,
0.1888849288,
0.2444492728,
-0.1081677824,
0.3947142065,
-0.4404296875,
-0.1914958954,
0.0587918423,
0.2005116642,
-0.0385498255,
0.0262690932,
-0.301215291,
-0.1798776835,
0.01530076,
0.1391179264,
0.1552980095,
-0.1375177205,
-0.1048078984,
0.2277950495,
-0.0768609047,
0.4794977307,
0.0657888427,
-0.1560293734,
0.4810071886,
0.0181208067,
-0.0185830519,
-0.0961592868,
0.2216252238,
-0.2013338953,
-0.1024582833,
0.4729755819,
0.2948073447,
-0.2150395811,
0.3210659027,
0.0439506546,
0.3380325437,
0.0131287538,
-0.0645013452,
0.0703041255,
-0.0233831182,
0.0092221051,
-0.0728378445,
0.1210109666,
0.220123142,
0.1502482742,
0.1225648969,
-0.1162340119,
-0.0255551822,
-0.1099663824,
-0.0900650173,
-0.0236116759,
-0.3888562918,
-0.0020134971,
0.6453686953,
0.1920177937,
0.1655849814,
0.0818610042,
-0.3722474575,
-0.0144481733,
0.2004948258,
0.1283696443,
-0.2168526649,
-0.0728574023,
-0.1325114965,
0.0121030658,
-0.3438657522,
0.0954855084,
-0.0437258705,
-0.3884956539,
0.3022677302,
0.111462906,
0.1269278228,
-0.4695099294,
0.1270018518,
0.200374648,
0.0192160886,
-0.2104679644,
0.1976361573,
-0.4272417426,
0.2984758615,
-0.2829223275,
-0.2194395661,
0.1775115132,
0.3179078102,
-0.1029377058,
-0.0928038731,
0.0829831362,
0.1744610667,
-0.3017721772,
-0.0103357807,
-0.077597253,
0.7949565649,
0.1794159412,
-0.04315947,
-0.0475341938,
-0.3129453361,
0.2206238955,
-0.2494148314,
0.0387259349,
0.6695900559,
0.3283214569,
-0.0491796285,
-0.0740500912,
0.2219732702,
-0.261903882,
0.061901506,
0.2014832795,
0.144931227,
0.1322550774,
0.0189827159,
0.0547857359,
0.3298667967,
0.3813298643,
-0.0070943758,
0.1342584938,
0.0639848039,
-0.0140788704,
-0.5347215533,
0.1711657494,
-0.3061718643,
-0.2194456905,
-0.024555428,
-0.1132669821,
-0.0000204854,
0.0607563965,
-0.2270717174,
0.1703409106,
0.1308678091,
-0.0588850752,
0.0626301467,
-0.0209635459,
0.1846420169,
-0.0497379303,
0.0059670061,
-0.4095643163,
-0.0503587797,
-0.1564159244,
-0.0454777405,
-0.1061543822,
-0.1463034749,
-0.1245376915,
0.2262840867,
0.1479840279,
-0.0311846547,
-0.0018009506,
0.0141692013,
-0.4841277301,
-0.3667884767,
0.0356944501,
-0.2991975546,
-0.1208487451,
-0.117977649,
0.0703829676,
-0.3258705437,
-0.0642838031,
-0.6216781139,
0.2913424075,
-0.0578833111,
0.2728646696,
0.2130307257,
0.0763408616,
-0.2482999414,
0.2943146527,
0.1480212212,
0.1066601276,
0.0531157404,
0.0504270643,
-0.3799440861,
-0.1667603403,
0.4578513503,
0.0323450565,
-0.1924960166,
0.1639104187,
0.2948585451,
-0.042120412,
-0.075789541,
-0.3868122101,
-0.2089803815,
0.2247837782,
-0.1893405914,
0.1238151938,
0.0422795489,
-0.3343694508,
0.0257922262,
0.0008600429,
0.3600986302,
-0.0796989501,
-0.1856144071,
0.0677080229,
-0.32002756
] |
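The record above describes the new behaviour: module scripts go to `~/.cache/huggingface/modules` by default, and the location can be overridden with `HF_MODULES_PATH`. A short usage sketch, assuming the variable is read when `nlp` is imported (the target directory is only an example):

```python
import os

# Point the modules cache at a writable location *before* importing nlp.
os.environ["HF_MODULES_PATH"] = "/tmp/hf_modules"

import nlp  # imported after the environment variable is set

squad_dataset = nlp.load_dataset("squad")
```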
https://github.com/huggingface/datasets/issues/554 | nlp downloads to its module path | > Feel free to play with this change from the master branch for now, and let us know if it sounds good for you :)
> We plan to do a release in the next coming days
Thanks for making this change! I just packaged the latest commit on master and it works like a charm now! :partying_face: | I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`? | 58 | nlp downloads to its module path
I am trying to package `nlp` for Nix, because it is now an optional dependency for `transformers`. The problem that I encounter is that the `nlp` library downloads to the module path, which is typically not writable in most package management systems:
```
>>> import nlp
>>> squad_dataset = nlp.load_dataset('squad')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 530, in load_dataset
module_path, hash = prepare_module(path, download_config=download_config, dataset=True)
File "/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/load.py", line 329, in prepare_module
os.makedirs(main_folder_path, exist_ok=True)
File "/nix/store/685kq8pyhrvajah1hdsfn4q7gm3j4yd4-python3-3.8.5/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [Errno 30] Read-only file system: '/nix/store/2yhik0hhqayksmkkfb0ylqp8cf5wa5wp-python3-3.8.5-env/lib/python3.8/site-packages/nlp/datasets/squad'
```
Do you have any suggested workaround for this issue?
Perhaps overriding the default value for `force_local_path` of `prepare_module`?
> Feel free to play with this change from the master branch for now, and let us know if it sounds good for you :)
> We plan to do a release in the next coming days
Thanks for making this change! I just packaged the latest commit on master and it works like a charm now! :partying_face: | [
0.0607000887,
0.2831606269,
0.125246793,
0.0024894141,
0.2875500619,
-0.1234022453,
-0.0438721478,
0.0497940294,
0.0141631812,
-0.1077146083,
0.18563357,
0.9319190383,
-0.3687042296,
-0.0051251426,
0.4456729889,
0.3321512341,
-0.1585527658,
-0.0046790279,
-0.5357555151,
-0.0219643787,
-0.1532682925,
0.3509792089,
0.1030698866,
0.222655803,
0.0984960571,
-0.2493530959,
0.1723136902,
0.1874394268,
-0.1824508011,
-0.3391272426,
0.2290156782,
0.0284773074,
0.0377189592,
0.2479455471,
-0.0001226053,
-0.0975797102,
0.096115455,
-0.1961422563,
-0.5711399317,
-0.4336972237,
-0.0093920231,
-0.1244055107,
0.3227508068,
-0.276314646,
0.4087741375,
0.4453664422,
0.2210508585,
-0.0199711323,
0.0893472359,
0.1858505458,
0.154252708,
0.2747377753,
-0.110392198,
0.3242487609,
0.2490098178,
0.2277179062,
-0.2165385187,
0.4389006793,
-0.1975560337,
-0.1600364745,
0.1131768674,
0.0045973472,
-0.1425085068,
-0.2446831167,
0.4304544032,
0.0933975652,
0.5288751125,
-0.1197959632,
-0.1822492182,
-0.0372461528,
0.276124835,
-0.6750579476,
-0.287358284,
-0.3752099276,
0.0746398121,
0.0016227514,
0.4576094449,
0.3321341276,
-0.2250932455,
0.1359218657,
0.1363035887,
-0.42854321,
-0.1708864421,
0.3129567206,
0.0732557476,
0.5276491046,
0.0064697377,
-0.0034219418,
0.3391950727,
0.2261256874,
0.2892917693,
0.1331439614,
0.1072314531,
0.2623793483,
0.3028275371,
-0.029912211,
-0.1308885515,
0.1408720762,
-0.0703875348,
-0.1001490429,
0.0285718553,
-0.1111958623,
-0.1021449715,
0.1614953578,
0.1754118651,
-0.2487051785,
-0.0226980709,
0.1262590885,
0.1851589084,
0.1101535037,
0.2593807876,
0.0673630312,
-0.0900576115,
0.0018161125,
-0.1098760366,
-0.1200206876,
0.0314020067,
-0.1485595405,
-0.0833678097,
-0.2215358466,
-0.1621479392,
0.2782628834,
0.1563174427,
0.0119721927,
-0.0685708076,
0.1434450746,
0.2104524225,
0.0210197717,
-0.287229389,
0.4619426429,
0.0285879001,
0.1335114241,
-0.3166642189,
0.0505025983,
0.1373166144,
0.1125013828,
0.3570069075,
-0.2416788787,
-0.0993691832,
-0.1350386739,
-0.0010031536,
0.2678193152,
-0.1936894804,
0.3649392724,
0.0505856127,
0.4417572916,
-0.2347767502,
-0.1650623679,
-0.4237526357,
0.1138923466,
-0.2390213907,
-0.3653304875,
0.1551288962,
0.0705931559,
-0.2658124864,
-0.1641086191,
0.2171801925,
-0.6330212355,
0.2544858754,
-0.1724116504,
0.1936620176,
-0.1905899942,
-0.17884323,
-0.2897792459,
-0.0612880029,
0.4643810391,
0.2399649173,
-0.2634059787,
-0.2285621017,
0.0405275673,
0.3470146358,
0.1847644895,
-0.1119701564,
0.1091835722,
-0.2437647879,
-0.2768905163,
0.9239946604,
-0.3892822266,
-0.2849108577,
0.2982856631,
-0.6965085864,
-0.0430954248,
0.0987514108,
0.1680968553,
-0.4586607218,
-0.0215315372,
0.0751162171,
0.7128701806,
0.2529001236,
0.1142330319,
-0.2597277761,
-0.3190844059,
0.2476549447,
-0.1181304306,
-0.0456686169,
-0.0101615563,
-0.1488675773,
0.645514667,
0.5735570192,
0.1820346415,
-0.0125393886,
0.3438822329,
0.5681897998,
0.1878332496,
-0.2743802071,
-0.2664021552,
-0.0003902093,
-0.0994902626,
-0.6493753195,
0.1542907804,
-0.5045360923,
-0.1930664778,
-0.1210341305,
0.1389089227,
0.1891828179,
0.0615256727,
0.0021859482,
0.2332505137,
-0.0239610225,
0.2157775611,
-0.1198189482,
0.461818248,
-0.1038129777,
0.1886588186,
-0.1736332774,
0.0234375447,
-0.2014029622,
-0.1311799884,
-0.055763308,
0.4927198589,
0.118143104,
-0.137855947,
0.0671340451,
0.4556127787,
-0.1938719749,
0.0698334798,
-0.2731493711,
0.0493068546,
0.2993724048,
0.0655335784,
0.0606359392,
0.2522664964,
-0.048627723,
0.1616556346,
-0.2174682617,
0.1048418805,
-0.4904000759,
0.0865467936,
0.1171606034,
0.2364953756,
0.066782929,
0.0605360717,
-0.1961005628,
-0.1744933128,
0.2643867135,
0.1245058477,
0.2692699134,
0.3537789285,
0.1622395366,
0.0071044788,
0.4008793533,
0.1960747242,
0.1870090961,
0.0447346419,
0.036752753,
-0.0143876858,
0.0241522882,
0.2787658572,
0.1582866162,
0.0996202677,
0.1799614429,
0.1286292821,
-0.3068121374,
-0.2155839056,
0.2058728784,
-0.0370233171,
0.1797151566,
0.1938581616,
0.119446218,
-0.1883073747,
-0.1963339448,
0.124844566,
0.0569101721,
0.1447528899,
-0.3052256405,
0.066362761,
-0.5841505527,
-0.4200508595,
-0.6971717477,
0.0577422976,
-0.5651500821,
-0.2432686687,
-0.131567806,
0.2803405225,
0.0072183162,
-0.1125030518,
-0.0989017189,
-0.1876922846,
-0.4081625044,
-0.5757268667,
0.0602041185,
0.0518220551,
-0.2901477516,
-0.1236890107,
0.1656204462,
0.2025419176,
0.1851046681,
-0.0894981772,
-0.3911503553,
0.0396244377,
-0.2137360722,
0.014832763,
0.2922559977,
0.1402155608,
0.2720707655,
-0.1263947487,
0.1923097074,
-0.1844569594,
0.0526948348,
-0.2295179665,
0.0896230415,
-0.0221745148,
0.1062932089,
0.0566952638,
-0.3506988287,
-0.1971297413,
-0.2604808807,
-0.4971485138,
0.3187969327,
0.2221005261,
0.3196675777,
0.7840188146,
-0.131513387,
-0.0195279792,
-0.2262927592,
0.1622526646,
0.1848050654,
0.035859853,
0.258192867,
-0.1146695614,
-0.0830993354,
-0.0601105504,
0.0697413832,
0.0492027998,
0.4465399086,
-0.2789008617,
-0.0321910232,
0.0596205443,
0.0530920103,
0.0721438453,
0.1278875321,
0.4659276605,
0.291344136,
0.1366932094,
-0.0429817289,
-0.3192870021,
0.2071257532,
0.5812686682,
0.3781488538,
0.2848645151,
0.1985884905,
0.1895770282,
0.3258745074,
-0.0334561914,
0.2295015007,
0.2446359694,
0.0202222597,
0.5292627215,
0.1749499291,
-0.141864717,
-0.3167338967,
0.1077736467,
-0.0488309301,
0.0461145379,
0.1397985816,
-0.1316753775,
-0.033536803,
-0.0543950573,
-0.0411399752,
-0.4204460382,
0.088098675,
-0.0913277864,
0.3229239583,
0.1878473312,
0.0720071197,
-0.242555663,
-0.2681699395,
0.2999161184,
0.3304553926,
-0.0219777413,
0.3394548297,
-0.2850990891,
0.0375345275,
-0.3237116635,
-0.3741633296,
-0.0990822166,
0.4699094296,
-0.1400242597,
-0.0109421164,
0.2265981734,
-0.1689765155,
0.3078709841,
0.061850749,
0.2381908298,
0.1564677656,
-0.1248223856,
-0.5593658686,
-0.0138592944,
0.0608833879,
0.4183634222,
0.2125427425,
0.015892908,
-0.0787870735,
-0.1674569845,
0.0450293645,
0.0172658153,
-0.1862704009,
-0.2216737568,
-0.3028727174,
-0.407414794,
-0.2037133873,
-0.004597988,
0.0468645468,
0.0947289914,
0.2849698067,
-0.0375723094,
0.231184274,
-0.0778233632,
-0.0490508564,
-0.2754965127,
0.1342766881,
-0.0307981949,
0.0601021796,
-0.0201706961,
0.0564733036,
-0.2783782184,
0.3375898898,
-0.0438490808,
0.1900485158,
-0.3183268309,
-0.0657876283,
0.2830539942,
0.5314484835,
0.0186984725,
0.0987236723,
-0.0702534616,
-0.4763906598,
0.0237489007,
0.0817923099,
-0.0662816167,
-0.3168520033,
0.2155398577,
-0.3315453231,
0.1977357119,
0.1427231133,
-0.2527742982,
0.2816804647,
-0.133166939,
-0.4260430932,
0.237169534,
0.2485247552,
0.9905762672,
-0.2436981052,
0.3380001485,
0.0949406624,
-0.4214067757,
0.3640106916,
-0.284044385,
0.2919516563,
-0.471768856,
0.327191323,
0.0250540152,
-0.3468217254,
-0.1333054155,
-0.0808615014,
0.1263563633,
0.2611092329,
0.1430905908,
-0.0839995891,
0.1720124036,
0.360396117,
-0.1622351557,
-0.3209449053,
-0.4041655362,
0.0072637089,
-0.075279437,
0.4611843824,
-0.1371788681,
-0.2295715809,
0.1437278092,
-0.1819766462,
0.0348363668,
0.1538995057,
-0.1240409613,
-0.0089385547,
-0.029609235,
-0.0094482601,
0.0088727251,
0.1726734638,
-0.0743673369,
0.5297319889,
0.0576445833,
-0.2772849798,
-0.2296255976,
-0.2684811652,
0.1140006036,
0.4309663475,
-0.0211038925,
-0.0338723548,
0.0209360421,
-0.3799842,
0.0266885571,
-0.2590020001,
0.1962018013,
0.0089065563,
0.2537412047,
0.19142735,
-0.1751878113,
0.0173281394,
0.0195890591,
-0.040992476,
0.0452312306,
0.2079639733,
-0.1550949961,
0.1690048575,
-0.0811459646,
-0.3309136033,
0.0828364864,
0.5577000976,
-0.0600155778,
0.1329279542,
0.2639299035,
-0.3103863895,
-0.1690563112,
-0.1576103568,
-0.0620954707,
-0.5718548894,
-0.0864450037,
-0.1173256189,
-0.1262333095,
-0.3579059541,
-0.2837914526,
-0.0830993652,
0.1901948899,
-0.0206206366,
-0.3535201252,
-0.3866328597,
-0.187447533,
0.1063270792,
0.1185438037,
-0.0102187842,
0.2149747908,
-0.0358134508,
0.0734796301,
-0.1466744989,
-0.1507854015,
-0.0146022756,
-0.2211664021,
0.0565899648,
0.0260117613,
-0.201083079,
-0.0164539348,
0.0352756977,
0.0666404814,
0.132596314,
-0.0256872941,
-0.0348102301,
-0.1901124716,
0.215998441,
0.1355570853,
-0.0695011392,
0.2680900693,
-0.443297416,
-0.0740880072,
-0.0213887319,
-0.2436882555,
0.2915948331,
0.2023130655,
0.0082688592,
0.1977009922,
0.2854664326,
-0.1080865562,
0.3883490264,
-0.4074508548,
-0.1672104299,
0.0570152216,
0.1897869408,
-0.0244021509,
0.0361053273,
-0.306273818,
-0.1726181209,
0.0013142861,
0.1376882046,
0.1503578871,
-0.1284584254,
-0.1020127833,
0.2298370153,
-0.0594987199,
0.4817071557,
0.0433276631,
-0.1527952254,
0.4984941483,
0.0178825576,
-0.0021819547,
-0.0885127783,
0.209396556,
-0.2022423148,
-0.1023379639,
0.4958070219,
0.2620564103,
-0.1986764073,
0.2865674198,
0.0292145126,
0.314740032,
0.0256732032,
-0.0574433655,
0.0836309344,
-0.0550508909,
0.0174757373,
-0.1141670942,
0.118450135,
0.2199362367,
0.1767617315,
0.1238601357,
-0.1511376947,
-0.0160610657,
-0.1166111603,
-0.0808949172,
0.0021121278,
-0.3904832602,
-0.0148650333,
0.6448429227,
0.1958513409,
0.1516013145,
0.1036150083,
-0.3628798127,
0.0036810711,
0.2076682448,
0.0933159441,
-0.1956071407,
-0.0703349635,
-0.161657095,
0.0210512578,
-0.3600159883,
0.079749167,
-0.0567712933,
-0.4050352573,
0.2851814926,
0.1234718114,
0.1535704434,
-0.4435423017,
0.1153310388,
0.1981544495,
0.0157192536,
-0.2311703563,
0.1667551696,
-0.4791301489,
0.3052453399,
-0.273385793,
-0.2546566427,
0.1634380221,
0.3410561085,
-0.102438949,
-0.1120229065,
0.0654759184,
0.171325624,
-0.2860588729,
0.0029464606,
-0.0672470629,
0.7874854803,
0.190453738,
-0.0534623638,
-0.0291884001,
-0.3258470297,
0.2208028585,
-0.2670590878,
0.0503326654,
0.6900346279,
0.3053017855,
-0.0314410925,
-0.0856996551,
0.2216228098,
-0.2574246824,
0.0926635712,
0.1983488053,
0.1986171305,
0.1361199021,
0.0112201236,
0.0524484441,
0.2934969068,
0.3950170279,
-0.0037639495,
0.1215990335,
0.0661528483,
0.0040798038,
-0.5283043385,
0.1576402336,
-0.3230125606,
-0.2439775467,
-0.0172326043,
-0.1373325586,
0.0265006721,
0.0736417621,
-0.2676175833,
0.1852429807,
0.1084584594,
-0.0692888349,
0.0906953514,
-0.045701731,
0.1658129692,
-0.0438650362,
0.0029080063,
-0.3991202712,
-0.0527483076,
-0.1488349736,
-0.0447635576,
-0.124591805,
-0.1464539021,
-0.1560395658,
0.2114462554,
0.1346189976,
-0.0255455058,
0.0093026906,
0.0284069553,
-0.4777072966,
-0.3569103479,
0.0273275487,
-0.3115733862,
-0.1141432971,
-0.1120086759,
0.0585226417,
-0.3168611526,
-0.0571991652,
-0.6203395128,
0.2801817656,
-0.0818860009,
0.2548211515,
0.2169453502,
0.0841783807,
-0.2436895967,
0.2907481194,
0.1096358821,
0.1155233458,
0.0551218241,
0.0580845401,
-0.3958210945,
-0.1631211042,
0.4747030139,
-0.0023118705,
-0.1922685057,
0.1842217743,
0.276714325,
-0.0266981386,
-0.0891863704,
-0.3841163218,
-0.2153950334,
0.2019433081,
-0.186256066,
0.0974383801,
0.0430321917,
-0.3200288117,
0.0321037397,
-0.0103071779,
0.35743922,
-0.0710976943,
-0.1901918352,
0.0787406713,
-0.334531188
] |
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | When you load a text file for the first time with `nlp`, the file is converted into Apache Arrow format. Arrow allows the use of memory-mapping, which means that you can load an arbitrarily large dataset.
Note that once the conversion has been done, the next time you load the dataset it will be much faster.
However, for a 1 TB dataset, the conversion can indeed take time. You could try to load parts of it in parallel, and then use `nlp.concatenate_datasets` to get your full dataset. | I made a simple Python script to check the NLP library's speed; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small (about 1 GB), but it doesn't scale.
It also uses a single thread during the data loading step.
```
import glob
import random

import nlp

# Collect and shuffle the raw text files before loading them.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing? | 88 | Very slow data loading on large dataset
I made a simple Python script to check the NLP library's speed; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small (about 1 GB), but it doesn't scale.
It also uses a single thread during the data loading step.
```
import glob
import random

import nlp

# Collect and shuffle the raw text files before loading them.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing ?
When you load a text file for the first time with `nlp`, the file is converted into Apache Arrow format. Arrow allows you to use memory-mapping, which means that you can load an arbitrarily large dataset.
Note that once the conversion has been done, the next time you load the dataset it will be much faster.
However for a 1TB dataset, the conversion can indeed take time. You could try to load parts of it in parallel, and then use `nlp.concatenate_datasets` to get your full dataset. | [
-0.2580244243,
-0.1429568529,
-0.0743959695,
0.2015653998,
-0.1192448661,
0.0688817501,
0.0702809095,
0.4482387006,
0.1473059058,
-0.2636478543,
0.1486455798,
0.2279233038,
-0.1069059297,
0.0611826777,
0.1534850597,
0.0742181763,
-0.020004753,
0.2010742128,
-0.1128232479,
-0.1518826038,
0.0267370939,
0.0252637174,
-0.1835508645,
-0.2006814629,
-0.1286403835,
-0.0606779829,
-0.0646087453,
0.0253586061,
-0.0933858231,
-0.279902637,
-0.0734303147,
-0.0075129727,
0.0779139996,
0.1638167202,
-0.0001190489,
-0.1963579059,
0.3564791083,
0.0899229422,
-0.2416128218,
-0.149906531,
0.2623204589,
-0.6448488235,
0.2374053001,
-0.1103649735,
0.0207279176,
-0.1262656301,
0.1428303719,
-0.1979646832,
0.135707587,
0.0908237249,
0.1139506772,
-0.0867790356,
-0.2106599212,
0.3208089173,
0.1221834123,
0.1906037331,
-0.0291804858,
0.2637993395,
0.2932349145,
-0.2001264691,
-0.3337850869,
0.0332036279,
-0.1195470393,
0.1559601277,
0.3333069384,
0.0464645475,
-0.089846544,
-0.0760487318,
0.118085295,
0.2848020196,
0.5425196886,
-0.0258224476,
-0.0519599058,
-0.451775223,
-0.0019576214,
-0.1291749328,
0.1924463212,
0.3310853243,
-0.2503350079,
-0.2429401726,
-0.2714087665,
-0.3056859076,
-0.1267965287,
0.2447670549,
0.0714996234,
0.059363775,
-0.0026141927,
0.2128316015,
0.236753121,
0.0035540657,
0.0124038905,
-0.1971567869,
0.2276112437,
0.458168447,
-0.3844108582,
-0.0025611594,
0.0357515067,
0.1673822552,
0.1374227703,
0.1301299334,
0.1820942461,
0.046435371,
-0.2015422434,
0.0214451626,
0.2738534808,
0.3421609104,
-0.2031335831,
-0.167660445,
0.3533453047,
-0.2663766146,
-0.1084760651,
0.2691605687,
-0.1542274356,
-0.1744620651,
0.03309986,
-0.4120716155,
-0.1778140962,
-0.3051295578,
-0.0733596906,
-0.0831515864,
-0.0601922609,
-0.2440091074,
-0.0297534168,
0.1259436011,
-0.3379232585,
0.2886788249,
0.1437386423,
-0.1046236455,
-0.3226684332,
-0.0085544102,
-0.0555739179,
-0.0189567301,
-0.1516161859,
0.3027736843,
0.5116080642,
0.03258745,
0.1436164528,
0.0451548025,
-0.2622655928,
-0.0435534716,
-0.1774938554,
-0.2493154258,
-0.065980576,
-0.0313736871,
-0.0227674767,
0.1939994991,
0.0823344439,
0.3259328008,
-0.4238596559,
0.1759598255,
-0.3380049169,
-0.059247274,
0.0427809954,
0.0781177804,
-0.3854232132,
-0.3015828729,
-0.3347424865,
0.4295699,
-0.0600888245,
-0.1445234418,
-0.265732944,
-0.0977048278,
-0.2564184368,
-0.0020689741,
0.0752362832,
0.1738008261,
-0.146603182,
-0.0385927409,
-0.038504377,
0.4130980372,
0.2468641847,
0.5918000937,
-0.1988861859,
0.2207202315,
-0.1733454168,
0.0261159092,
0.5016425252,
0.1247368902,
-0.3987206519,
0.5826672912,
-0.2134342492,
0.0198440924,
0.1535962373,
0.3465154171,
0.0105100162,
-0.021689022,
0.2502731979,
0.7005244493,
0.2258731574,
0.1835713536,
-0.5376798511,
-0.1484408677,
0.2145619541,
0.3103542924,
-0.1488063335,
0.0026408844,
0.0340139419,
0.0508762598,
0.4399497509,
0.1979087442,
-0.0152007584,
0.3547913432,
-0.3042240143,
-0.0217802711,
0.068004325,
0.2160370201,
-0.0974012837,
0.1468752623,
0.0538268015,
0.0879037231,
0.3113020062,
0.1676245183,
-0.2224717736,
-0.2288696468,
0.0802321807,
0.2670136988,
-0.0229947753,
0.0762376785,
0.0345355794,
0.1919611096,
-0.1195790023,
0.3874519467,
-0.0643281639,
-0.1230422109,
-0.4384574592,
-0.0962344259,
0.2748400271,
-0.0464417562,
0.4039375782,
0.233703509,
-0.1707220227,
0.0219947584,
-0.135986656,
-0.0038255034,
-0.3297896385,
0.2558678389,
0.0886444077,
0.0330989435,
0.0563280806,
-0.1082670093,
0.395526737,
0.2351757735,
0.3025709391,
-0.3483244479,
-0.2962737381,
0.4585193098,
-0.1163600981,
0.3341571093,
0.1840614229,
-0.3987696171,
0.088508375,
-0.0222316086,
0.0819799304,
0.3170703053,
0.8408858776,
0.0845444053,
0.6601695418,
0.2273520827,
-0.1195098162,
-0.0499759838,
0.4336680472,
-0.1112651974,
-0.2259488404,
0.5013072491,
-0.1831347644,
-0.4062348604,
0.0281974729,
-0.1438337862,
0.391063869,
0.2931970656,
0.1178413928,
0.0659758151,
-0.1019046605,
-0.1797465384,
0.2066102326,
-0.1422032416,
0.3015495837,
0.0961892009,
0.3848951757,
0.0827195272,
-0.4828438461,
-0.286098361,
-0.0393605009,
0.3457218111,
-0.0580389835,
0.1087856591,
-0.0568858273,
-0.3092048168,
-0.2312424183,
0.0379155651,
-0.317953378,
-0.2344308048,
-0.0101793613,
0.0390954055,
0.4102935195,
-0.1763645411,
-0.0082465187,
0.1628457308,
-0.0902130455,
-0.4379179776,
-0.2815778255,
-0.1494337022,
-0.4388256967,
-0.0467551127,
0.3423184752,
0.2407114953,
0.3009676337,
0.1920311302,
-0.1717657149,
0.2372121811,
-0.0423782691,
-0.1797999889,
-0.0757497475,
-0.0906863213,
-0.2423757613,
0.1320562363,
0.125511542,
0.0224761143,
0.0044578165,
-0.2911378145,
-0.0708212182,
0.2107930183,
-0.0324205793,
-0.0122497641,
0.0540514141,
-0.2657478452,
-0.1826429069,
-0.2270226628,
0.4667136371,
0.1411596984,
0.1653355062,
0.2913053036,
-0.1021565795,
0.2856540382,
0.1465474069,
0.2479121089,
-0.0135068819,
-0.1057492942,
0.3281098008,
0.2865341008,
-0.2873675525,
-0.3175395727,
0.0452959612,
0.1026077941,
0.1219862252,
-0.7564433813,
0.4603054523,
-0.4373182654,
-0.0935814008,
-0.1840258539,
0.1150642186,
0.1560969949,
-0.2291909009,
-0.0480019711,
0.2379163802,
-0.0687177256,
-0.0098810047,
-0.048065152,
0.2736463845,
-0.0779337138,
0.3746146262,
-0.0265050977,
0.167229116,
0.178838253,
-0.0635449067,
0.1850762069,
-0.0453567095,
0.0589770675,
-0.2291682065,
-0.1360010803,
0.1213739663,
-0.1993957162,
-0.0526304394,
0.3622395694,
0.0995582491,
-0.479442656,
-0.2688516676,
-0.1049255431,
-0.0438406989,
-0.1263777614,
0.2061015517,
-0.5250544548,
-0.0895584226,
0.031552963,
-0.1230123863,
-0.2496895492,
-0.7182971239,
-0.1159709245,
0.0851068571,
0.1345786452,
0.068619445,
-0.2203109562,
0.1092064381,
-0.8535845876,
0.1522728801,
-0.2927679121,
0.4942767024,
0.1012603194,
-0.2130984813,
0.3597967327,
-0.1164103597,
0.2636375725,
0.3609698415,
0.106467098,
0.1501388252,
-0.0586198382,
-0.5524620414,
-0.0408006869,
-0.1757725179,
0.2366239727,
0.4474133253,
0.3369254768,
-0.2200976312,
-0.1212980151,
0.003410494,
0.3156628013,
-0.0433469601,
-0.5378082395,
-0.4096363783,
-0.1558474004,
-0.0251948461,
0.0255661737,
-0.3112895489,
0.1965810359,
-0.1206168458,
0.0235488713,
-0.0312655345,
-0.1023794338,
0.2745973468,
0.0140291564,
-0.2206919491,
0.0643058792,
0.3005279899,
-0.0779171735,
0.3423228264,
-0.4795140922,
0.463601917,
0.3183021247,
0.0611686185,
0.0151961334,
0.0597669706,
0.3986094594,
0.2931851745,
0.0773351118,
0.0504429489,
-0.2372924387,
0.190273881,
-0.1804481149,
0.0638738722,
0.1414531767,
-0.0688967705,
-0.4214700162,
-0.3904386163,
0.9887105823,
0.1678911448,
0.0653035641,
0.4738784432,
-0.3927306533,
0.065194495,
0.3359813094,
-0.344171375,
0.8150564432,
-0.4780228138,
0.3495224714,
-0.0670740902,
0.0347132385,
0.4023120105,
-0.1422497928,
0.0654207915,
-0.3965913653,
0.222933203,
0.2521342039,
-0.0400776118,
-0.060766466,
0.3055632114,
-0.1380981505,
0.3873783648,
0.4162446856,
-0.3467719257,
-0.1516744196,
0.7324688435,
-0.1925999522,
-0.2279392481,
-0.5543439388,
0.0806716233,
-0.2827616334,
0.4859900177,
0.0569208898,
0.0594944209,
-0.1382341385,
-0.0232670456,
-0.2746176124,
0.0934718549,
-0.0221107751,
0.0251465403,
-0.1356393844,
-0.3379200697,
0.111544773,
0.0942987725,
-0.3294048011,
0.0775136575,
-0.282643944,
0.038752377,
-0.4085693359,
0.2268722057,
0.4534931779,
-0.0715887547,
0.1698086411,
-0.1747026443,
0.02782543,
-0.1191382557,
-0.0530400053,
-0.2230223417,
-0.0910927057,
0.1972077787,
0.1085299104,
0.1416529566,
-0.3056224585,
0.2965964675,
-0.036791794,
0.0183540136,
0.0337990858,
0.3615487218,
0.0726239681,
0.2760682106,
0.1567230523,
-0.4740464091,
0.0187788568,
0.5012875795,
0.0956101716,
-0.1507555246,
0.4508931041,
0.1583357155,
-0.1157473475,
-0.1204260141,
0.1693950444,
0.0386527963,
-0.2350003719,
0.0467656031,
0.0816516131,
0.1031813473,
0.2221589237,
0.0563782491,
-0.067580238,
-0.1099895462,
0.0928843319,
-0.4718826711,
-0.2744017541,
0.0782020465,
-0.0862438008,
-0.0269837752,
0.0571641177,
-0.0591561869,
0.2229403704,
0.3382496834,
-0.1543868929,
0.1469290853,
0.0962574184,
0.0974756479,
-0.099015601,
-0.2196983397,
-0.0071761049,
0.3180195093,
0.0732911676,
-0.0740762204,
0.1194785535,
-0.1856728196,
0.0396323279,
0.1955276728,
0.0100287702,
-0.028217718,
-0.0172090866,
-0.3136984706,
-0.0298531912,
-0.2065970898,
-0.1353012323,
-0.0746400356,
-0.0685596615,
-0.0881907344,
0.0215054713,
0.0047978442,
-0.0432663187,
0.2160214335,
-0.4406598806,
0.2270675004,
0.1653149724,
0.0807466209,
0.3621738553,
-0.0014524832,
-0.5631073713,
-0.0575424619,
0.1354709268,
0.0851344541,
0.1073212028,
-0.1598533988,
0.1474944204,
-0.1557083875,
0.1999239922,
0.5425596237,
0.0573216043,
-0.1970003545,
0.1085040867,
0.0493312553,
-0.2381012142,
-0.0085444897,
0.2233700454,
0.0445110016,
-0.0989393592,
0.107186839,
0.1314357221,
0.0226624161,
-0.3565068543,
-0.3570156693,
0.2992472351,
-0.1193620712,
-0.2936939597,
0.1449043751,
0.1741230488,
-0.1466108859,
0.2117779702,
0.0921042413,
0.0983329788,
0.4157552421,
0.1702167541,
0.3688650727,
0.1080681086,
0.4503049254,
-0.3991967142,
-0.3068243861,
-0.1270225644,
0.4518743455,
0.1279355735,
0.1989238858,
-0.0008421019,
0.2880346775,
-0.0131749045,
0.0271891579,
-0.1752699018,
0.0176182892,
0.0247532129,
-0.1582581252,
0.1484027505,
0.0871668756,
-0.2405379415,
0.2644077241,
-0.3471243382,
-0.0155323185,
-0.0242400169,
0.3398848176,
-0.0073970295,
-0.411871016,
0.1741111875,
0.0005493215,
0.1975171566,
-0.1807394028,
0.1701128483,
-0.2927470207,
0.1100098044,
0.0769690722,
0.1250457913,
0.2397671491,
0.3727667332,
-0.0875174999,
0.1372534335,
-0.1563857794,
-0.0505314097,
-0.2074502558,
0.3958814144,
0.2734568715,
0.3311546147,
0.1413857937,
-0.0814319551,
-0.0653947592,
0.1674977988,
-0.1196422279,
-0.3277772069,
-0.2014134377,
0.248580426,
0.1475753486,
-0.2981002927,
-0.5778744221,
0.1361487359,
-0.3145257533,
-0.2423122227,
0.1974418461,
-0.1156182736,
-0.0432790294,
-0.0786158442,
0.0222196467,
-0.3155362308,
0.6768493056,
0.4696149826,
0.3979501426,
-0.0979028419,
-0.2553619444,
-0.4972056448,
-0.0835587829,
-0.3635213971,
-0.0955653638,
0.0487686805,
0.3127393723,
0.1088961512,
0.1623490006,
-0.3987658024,
0.2395127863,
-0.1641736031,
-0.075496152,
-0.2630471587,
-0.0352834091,
-0.1491556764,
0.1646287441,
-0.2576057911,
-0.2837488055,
0.1094366461,
-0.16515176,
-0.1021281406,
-0.137277171,
0.0079126731,
0.0893053561,
0.4669981599,
0.311984688,
0.0175566841,
0.3307692707,
0.0810263753,
-0.1050682664,
-0.4045042992,
0.1325250119,
-0.1420275569,
0.2934334874,
0.0654426962,
-0.0204112865,
-0.065046981,
0.2211397439,
0.1623334289,
0.2604759932,
-0.2782455385,
-0.0273940545,
0.0541199744,
-0.3129332364,
-0.0963110924,
0.3950645924,
-0.0569717251,
0.4237586856,
-0.0561879948,
-0.1047647223,
-0.4859928191,
0.0311775133,
0.2326675951,
-0.3553596139,
-0.0956417769,
-0.0655776113,
0.3098989725,
0.1066005975,
0.2938703001,
-0.3976063132,
-0.122412324,
0.3562180996,
0.0164161026,
-0.0080449209,
0.4049393833,
0.0255446807,
-0.0593670532,
-0.0508875474,
0.3089422882,
-0.1203019619,
-0.1957066655,
-0.319883585,
-0.2703249753
] |
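A minimal sketch of the parallel-load-then-concatenate suggestion quoted in the record above, assuming the old `nlp` library; the shard count, file paths and `cache_dir` are placeholders, not values from the dataset.

```python
# Hypothetical sketch: convert text shards to Arrow in parallel, then concatenate.
# Paths, shard count and cache_dir are placeholders.
import glob
from concurrent.futures import ProcessPoolExecutor

import nlp

def convert_shard(files):
    # Each worker triggers the text -> Arrow conversion for its own shard of files;
    # the result lands in the shared cache, so nothing large is returned.
    nlp.load_dataset("text", data_files=files, cache_dir="xxx/nlp")
    return files

if __name__ == "__main__":
    train_files = sorted(glob.glob("xxx/*.txt", recursive=True))
    shards = [train_files[i::8] for i in range(8)]  # 8 roughly equal shards
    with ProcessPoolExecutor(max_workers=8) as pool:
        shards = list(pool.map(convert_shard, shards))
    # Reloading each shard now hits the Arrow cache, so it is fast.
    parts = [nlp.load_dataset("text", data_files=f, cache_dir="xxx/nlp")["train"] for f in shards]
    full_dataset = nlp.concatenate_datasets(parts)
```

Since the conversion is mostly I/O-bound, the speed-up from multiple processes depends on disk throughput; this sketch only parallelizes the one-time conversion, not the later memory-mapped reads.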
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | Humm, we can give a look at these large scale datasets indeed.
Do you mind sharing a few stats on your dataset so I can try to test on a similar one?
In particular, some orders of magnitude for the number of files, the number of lines per file, and the line lengths. | I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ? | 50 | Very slow data loading on large dataset
I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ?
Humm, we can give a look at these large scale datasets indeed.
Do you mind sharing a few stats on your dataset so I can try to test on a similar one?
In particular, some orders of magnitude for the number of files, the number of lines per file, and the line lengths. | [
-0.2486382425,
-0.2108439207,
-0.1305087358,
0.271597445,
-0.031259574,
0.0532567352,
0.1728604883,
0.3125980198,
0.3433540463,
-0.2542507648,
0.1040125266,
0.2262749076,
-0.1480091214,
0.24919267,
0.174284339,
0.0440118089,
-0.0694731176,
0.1338016391,
-0.094050318,
-0.208958596,
0.0764913782,
-0.0042573288,
-0.2356912941,
-0.2695989013,
-0.1871653199,
-0.05037532,
-0.0101458505,
0.0359053425,
-0.1234340966,
-0.2105944455,
0.0055683851,
0.09310776,
0.0121270306,
0.3132480681,
-0.0001156001,
-0.2746174932,
0.3784100115,
0.0973196849,
-0.176219523,
-0.0817300975,
0.2085633725,
-0.6034760475,
0.1212055907,
-0.1240258962,
0.0174890831,
-0.0432458594,
0.0762170702,
-0.1090651825,
-0.0044661164,
0.1309252679,
0.1321335286,
-0.090446502,
-0.2628419399,
0.2112325132,
0.0767726898,
0.271699369,
-0.0006691739,
0.357282877,
0.4154172242,
-0.1923882812,
-0.2783371806,
0.0211374052,
-0.1434020847,
0.0957907215,
0.2736080885,
0.0057197027,
-0.1591183841,
-0.0868938565,
0.1164132431,
0.4194947779,
0.5542603731,
-0.1248919815,
-0.0682063699,
-0.4857757688,
0.0235807225,
-0.1223734617,
0.1659092009,
0.2761653662,
-0.1462366879,
-0.2896140218,
-0.4326579571,
-0.2402225435,
-0.0795600116,
0.1657621711,
0.0045834184,
0.0249822401,
-0.0131764878,
0.1476901472,
0.1474991888,
-0.0551851802,
-0.0441149026,
-0.2092200518,
0.2359207869,
0.4067614079,
-0.4368546307,
-0.1065014899,
0.074998036,
0.177108556,
0.1731022745,
0.067776978,
0.1592899561,
0.0109076053,
-0.1891325116,
0.026382871,
0.2703834772,
0.3900515735,
-0.1767292023,
-0.1684201956,
0.3515304625,
-0.1697862148,
-0.2296456844,
0.2596700191,
-0.1128105521,
-0.1672843248,
0.0136893168,
-0.4427073896,
-0.2214361429,
-0.3526124358,
-0.0619460978,
-0.034204971,
0.0437182114,
-0.2788407803,
0.1019066274,
0.2122911662,
-0.3088972867,
0.2901293337,
0.015991725,
-0.0993958116,
-0.2755248845,
0.0564936846,
-0.1495099068,
0.0431285352,
-0.1205478311,
0.2141245902,
0.5559556484,
-0.0202068035,
0.1875708103,
0.1515731364,
-0.2232282311,
-0.1161600277,
-0.2009127736,
-0.3427751064,
-0.0972758681,
-0.0436134338,
-0.0318154581,
0.168525368,
0.0070507657,
0.3376307487,
-0.3838863969,
0.1070855558,
-0.3090752363,
-0.0513509773,
0.071545586,
0.1486167312,
-0.427213788,
-0.3566761613,
-0.2842701077,
0.3984336555,
-0.1028044224,
-0.1434933543,
-0.2055092752,
-0.1970778406,
-0.2360455692,
-0.0038905926,
0.0610401407,
0.208019942,
-0.1028609797,
-0.0373451337,
-0.0420121476,
0.3275893927,
0.3008472621,
0.5783416033,
-0.1911015064,
0.2444196343,
-0.1414471567,
0.1041965932,
0.4420726597,
0.0621604323,
-0.4157562852,
0.5438101292,
-0.1831782758,
0.0356251895,
0.2243207693,
0.3146443069,
0.0846068785,
-0.0453320891,
0.3129040897,
0.6097599864,
0.2616626918,
0.2236880958,
-0.5192389488,
-0.0816204101,
0.2502227426,
0.4377365708,
-0.0662628561,
-0.1100109965,
0.0123594999,
0.0212784447,
0.4793026447,
0.1545647383,
-0.1717212796,
0.3233121634,
-0.2008063793,
-0.0290942602,
0.0442673564,
0.1464745402,
-0.0480414778,
0.2284378111,
0.1344216913,
0.0384672433,
0.3972011805,
0.1483384073,
-0.1765304953,
-0.1649582386,
0.0871810317,
0.1871475577,
0.0015194304,
-0.0436736457,
0.0738985017,
0.2142147124,
-0.1339360178,
0.3165629506,
-0.22237131,
-0.1686551273,
-0.32533288,
-0.0225289762,
0.3222879767,
-0.0935232788,
0.3534989357,
0.2709273696,
-0.1474176049,
0.0001547523,
-0.123433888,
-0.1029442996,
-0.1997460723,
0.2632876337,
0.1220242307,
0.1304643601,
0.0262937117,
-0.0328233689,
0.4114790261,
0.1348861754,
0.2995353937,
-0.3384072781,
-0.3134110868,
0.4126794934,
-0.1818645597,
0.2255382836,
0.1676183194,
-0.5121961832,
0.1029439494,
-0.1461996734,
0.1794182211,
0.361811161,
0.9970804453,
0.042389404,
0.6520145535,
0.3005906343,
-0.1573515832,
-0.0620916821,
0.4747376144,
-0.0766692087,
-0.1524769068,
0.4595929086,
-0.2282648385,
-0.316268146,
0.0290647577,
-0.0418973565,
0.3655610681,
0.2885786891,
0.0570739061,
0.0865551755,
-0.018212188,
-0.2234888971,
0.2061380297,
-0.1537455916,
0.1727262586,
0.2333604246,
0.3821171522,
0.0576561838,
-0.439982444,
-0.2300446033,
-0.0233047642,
0.3337500393,
-0.0937666893,
-0.0122365914,
0.0247635394,
-0.2615177035,
-0.1847617924,
0.0775961727,
-0.3184907138,
-0.2108864933,
-0.015795676,
-0.0252598561,
0.4601274133,
-0.1912714094,
-0.0180879645,
0.1107640266,
-0.1913945973,
-0.4996023774,
-0.2530305088,
-0.1412378848,
-0.3140840232,
0.0136329383,
0.2918313444,
0.2350367457,
0.3330127895,
0.1568285972,
-0.1925464123,
0.2865509689,
-0.0626535714,
-0.1530093998,
-0.0221393555,
-0.0233355183,
-0.1805531085,
0.110746786,
0.2681524158,
0.026947286,
0.0560400188,
-0.34793064,
0.0260534734,
0.14915663,
-0.0442706347,
-0.0498942062,
0.0474365056,
-0.2053921968,
-0.2814621627,
-0.2173213363,
0.5193731785,
0.1146608442,
0.1249854788,
0.2194507122,
-0.1814264953,
0.2895322144,
0.0622398295,
0.2473642528,
-0.0041090362,
-0.2301890254,
0.2904568017,
0.3158659637,
-0.2225533277,
-0.2283248752,
0.0255332142,
0.0695408434,
0.0832811967,
-0.7475149632,
0.4059769511,
-0.4893420041,
-0.0323971957,
-0.0399833322,
0.0885110348,
0.1408661008,
-0.2680482268,
-0.0466729477,
0.2261454761,
-0.1216789484,
-0.0299025103,
-0.0066039483,
0.2154011726,
-0.2178008258,
0.3159345388,
-0.1042945012,
0.0511521026,
0.1029466838,
-0.1928379536,
0.1229183674,
-0.0626876056,
0.0938578397,
-0.1282749176,
-0.2376915812,
0.2467366755,
-0.0836680233,
-0.174587965,
0.3359710276,
0.0408463664,
-0.4224252403,
-0.2895857096,
-0.1774412245,
-0.0099107996,
-0.0943665281,
0.167095229,
-0.2766081095,
-0.0619888492,
-0.0030026734,
-0.1154804975,
-0.222651571,
-0.73543185,
-0.1276738495,
0.1004893482,
0.1220176965,
-0.0587938651,
-0.1328060925,
0.006159883,
-0.763679862,
0.1280783713,
-0.1921784878,
0.4276964068,
0.0147737414,
-0.1615124345,
0.3043777049,
-0.1195622534,
0.3274204731,
0.2310961634,
0.2009234726,
0.1975379586,
-0.113994576,
-0.4052629471,
-0.0265560038,
-0.2160051912,
0.2019545585,
0.5574670434,
0.3640390337,
-0.1233931929,
-0.1083957851,
-0.0000924841,
0.3028020263,
-0.005460076,
-0.6474366188,
-0.2758659124,
-0.0769907832,
0.1311497837,
0.09139622,
-0.357698679,
0.145559907,
-0.2545611858,
0.0722505674,
-0.0261505935,
-0.0077565387,
0.2913789451,
0.0363019817,
-0.2095212638,
0.0593110137,
0.3716658056,
-0.1332943588,
0.3029575944,
-0.4588220716,
0.4044705629,
0.1889939606,
0.1596021056,
-0.0264295824,
0.0294620134,
0.4311203361,
0.2654612958,
0.0792784393,
0.0055526011,
-0.3309393823,
0.2348130345,
-0.2141077071,
0.0845483392,
0.1717278361,
0.0702351332,
-0.435438484,
-0.3807119429,
0.875483036,
0.1672885269,
0.0593724549,
0.3597489297,
-0.4834036827,
0.0001693815,
0.1790502369,
-0.3303281367,
0.7395703793,
-0.4816350341,
0.2712955773,
-0.1411928236,
0.1208736598,
0.3883972764,
-0.1068188399,
-0.0140717169,
-0.4078048468,
0.0012222156,
0.2903619409,
-0.0737053379,
-0.071214363,
0.2629813552,
-0.0949842855,
0.4188198447,
0.3380484879,
-0.361014992,
-0.1628673226,
0.7147490382,
-0.1446308643,
-0.1830555052,
-0.5921897888,
0.1191916987,
-0.3006483912,
0.4313512743,
0.0359370746,
0.1080772802,
-0.122697182,
0.0350914672,
-0.3221389055,
0.1057326719,
-0.0214598905,
-0.1039434746,
-0.0558500141,
-0.3602653146,
0.210426569,
0.1329292506,
-0.3163470626,
0.1919999272,
-0.2904359996,
0.1694648415,
-0.4294071496,
0.2215342373,
0.5690383911,
0.0400776304,
0.0920961872,
-0.1330711246,
-0.0164837688,
-0.1876689941,
-0.0030464493,
-0.0842529312,
-0.0839660466,
0.1287947595,
-0.0102345534,
0.0207703263,
-0.3285721242,
0.2691437304,
0.0174872205,
0.0442582667,
0.0795395002,
0.356028378,
0.0995684639,
0.3716543615,
0.0781683624,
-0.4419522583,
-0.0074975505,
0.5539150238,
0.0446979366,
-0.093468219,
0.3801755011,
0.2489359975,
-0.1149854362,
-0.1516567171,
0.0772767588,
-0.1090517342,
-0.1209497154,
0.0214856379,
0.0952616334,
0.1124398112,
0.2152584344,
0.0036676982,
-0.0396408327,
-0.1684332192,
0.0969595611,
-0.4598966241,
-0.2703389823,
0.018325571,
0.0163764209,
-0.0594286621,
0.0189463794,
-0.0193065405,
0.2558462918,
0.255302906,
-0.2078297436,
0.1938477755,
0.0379542857,
0.0008024294,
0.0161527433,
-0.2550886571,
0.0776677132,
0.1347041279,
0.1336496025,
-0.1695390642,
0.1315924227,
-0.254155159,
0.0428640991,
0.155125469,
-0.0507308319,
0.046961993,
0.0386367552,
-0.29087165,
-0.0510553867,
-0.1894156635,
-0.1260955036,
-0.0372355804,
-0.0687754527,
-0.0327707306,
-0.026629705,
-0.01835794,
-0.1332510412,
0.2710009813,
-0.4025322199,
0.1420171857,
0.1175245643,
-0.0147004332,
0.3519358635,
0.0105443597,
-0.5127435327,
-0.1042866856,
0.1264261156,
0.1025157347,
0.1936851591,
-0.167449221,
0.1844202727,
0.0059570707,
0.2789810598,
0.52806288,
-0.006293837,
-0.185885042,
0.0784662813,
0.0974324495,
-0.2273055166,
-0.0144935045,
0.3085355163,
0.2134362608,
-0.0792190582,
0.1308325678,
0.1391679198,
-0.0282358602,
-0.3540790081,
-0.3104559183,
0.4062986076,
-0.1541443318,
-0.2605237365,
0.1424598396,
0.1900192648,
-0.1254627109,
0.1808305532,
0.1629988104,
0.0297830105,
0.4247862101,
0.2490756214,
0.3453609347,
0.039037168,
0.4415316582,
-0.3563576937,
-0.2952813506,
-0.1471572518,
0.5082956553,
0.0578824244,
0.1391099095,
-0.1498194039,
0.2389548421,
-0.0552015305,
0.0499006137,
-0.216059953,
0.0952131823,
0.0089176614,
-0.0742412806,
0.1654218137,
0.0643648058,
-0.1956369877,
0.2332741618,
-0.3532126844,
0.1447058916,
-0.0955295265,
0.3290524483,
-0.0386079066,
-0.4128959179,
0.1281603873,
-0.0209210031,
0.2326918989,
-0.2117918432,
0.1328069866,
-0.2030375898,
0.0903228447,
0.0802236944,
0.193207413,
0.2651675344,
0.296299696,
-0.0642592236,
0.0547850169,
-0.1846185476,
0.0069710165,
-0.2256304473,
0.4160029888,
0.1715600193,
0.3349129856,
0.1330731362,
-0.0182282478,
-0.0888317823,
0.2212149352,
-0.1641309112,
-0.2552428246,
-0.1376208961,
0.2702533901,
0.1682337821,
-0.2660663128,
-0.60238415,
0.1053806022,
-0.317276597,
-0.1932384223,
0.1913500726,
-0.0232282039,
-0.0075535215,
-0.0448891521,
0.0491050854,
-0.3418809474,
0.6926577687,
0.4073377848,
0.4532287121,
-0.0886885673,
-0.2234825492,
-0.4926072657,
-0.1371571124,
-0.3056132793,
-0.1503495276,
0.1317795813,
0.2089388669,
0.0859814212,
0.0431546867,
-0.4199784696,
0.2226311266,
-0.1943283975,
-0.1410339773,
-0.3337700963,
-0.0309566539,
-0.3584558666,
0.0855704248,
-0.148354888,
-0.1733204424,
0.0769330263,
-0.2930165231,
-0.0804649591,
-0.1545775533,
0.1351482421,
-0.0031844005,
0.4502999783,
0.2374550998,
-0.0331194103,
0.4438130558,
0.0437536687,
-0.0190133825,
-0.3446156681,
0.1138084382,
-0.125445962,
0.2371192425,
0.0708208457,
-0.054981932,
-0.0471284613,
0.264251411,
0.2522794008,
0.1962286681,
-0.2932773232,
-0.0220111981,
-0.0116343759,
-0.4068408906,
0.0198661238,
0.2453897595,
-0.0170998797,
0.4612653852,
-0.1009520367,
-0.1236661002,
-0.3529674411,
0.0987663716,
0.2898670435,
-0.3197879791,
-0.0076201037,
-0.2301363498,
0.3354279101,
-0.0057783984,
0.3162285089,
-0.315834105,
-0.0337671414,
0.3471736908,
0.094502911,
0.0538212806,
0.4641375244,
0.012011718,
-0.1097198874,
0.0060084388,
0.2013750672,
-0.1574349999,
-0.1576908231,
-0.3613032103,
-0.131254077
] |
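A minimal sketch of how the orders of magnitude requested in the record above (file count, lines per file, line lengths) could be estimated, assuming plain UTF-8 text files; the paths and sample size are placeholders.

```python
# Hypothetical helper to estimate corpus statistics; paths are placeholders.
import glob
import random

files = glob.glob("xxx/*.txt", recursive=True)
sample = random.sample(files, k=min(5, len(files)))  # scan only a few files to keep it cheap

line_counts, line_lengths = [], []
for path in sample:
    n = 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            n += 1
            line_lengths.append(len(line.split()))  # rough whitespace-token count per line
    line_counts.append(n)

print(f"files: {len(files)}")
print(f"avg lines per sampled file: {sum(line_counts) / max(len(line_counts), 1):.0f}")
print(f"min/max line length (tokens): {min(line_lengths)}/{max(line_lengths)}")
```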
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | @lhoestq Yes, I understand that the first time requires more time. The concatenate_datasets seems to be a workaround, but I believe a multi-processing method should be integrated into load_dataset to make it easier and more efficient for users.
@thomwolf Sure, here are the statistics:
Number of lines: 4.2 Billion
Number of files: 6K
Number of tokens: 800 Billion
The number of lines is distributed equally across these 6k files.
The line length varies between 100 tokens and 40k tokens.
| I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ? | 79 | Very slow data loading on large dataset
I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ?
@lhoestq Yes, I understand that the first time requires more time. The concatenate_datasets seems to be a workaround, but I believe a multi-processing method should be integrated into load_dataset to make it easier and more efficient for users.
@thomwolf Sure, here are the statistics:
Number of lines: 4.2 Billion
Number of files: 6K
Number of tokens: 800 Billion
The number of lines is distributed equally across these 6k files.
The line length varies between 100 tokens and 40k tokens.
| [
-0.2211724818,
-0.0693991035,
-0.0759154856,
0.1695174575,
-0.0901311412,
0.1008214056,
0.2206685394,
0.3593855798,
0.2234859318,
-0.1874324679,
0.1198083162,
0.2741488516,
-0.0474218763,
0.2589619756,
0.1557990909,
0.1031198576,
-0.0261339471,
0.24119322,
0.0420605913,
-0.0941146463,
0.0570232049,
-0.0064292271,
-0.288662374,
-0.3357906044,
-0.2980341911,
0.0214097928,
-0.1646301448,
0.0981197506,
-0.0391981378,
-0.264470309,
-0.009863019,
0.2474714965,
0.0183742009,
0.3060205579,
-0.0001189172,
-0.2587306499,
0.3416222334,
0.066219829,
-0.2302350551,
-0.1102388874,
0.2941590846,
-0.6232700348,
0.1834748089,
-0.1261272877,
0.0808532909,
-0.035474699,
0.1360354722,
-0.2005749345,
0.0374362096,
-0.0441889502,
0.070642367,
-0.0996933132,
-0.2695117593,
0.2410738319,
-0.0642087013,
0.1584539711,
0.1085237488,
0.2832101583,
0.3556242585,
-0.2277130187,
-0.2626975179,
0.0034154803,
-0.1359219849,
0.1137018651,
0.2706818283,
0.0631934553,
-0.1407951564,
-0.1139548644,
0.0160540603,
0.3350260258,
0.5140843391,
-0.0896611661,
-0.0769432336,
-0.6029698849,
0.0666938573,
-0.2233531922,
0.171390608,
0.2411393672,
-0.1910333931,
-0.2759595513,
-0.3890080452,
-0.3037949204,
-0.066813089,
0.1294829994,
0.1065542847,
0.1073351651,
0.0603802763,
0.1835605204,
0.2886106968,
0.0518849753,
0.1139528006,
-0.2195519656,
0.2720117569,
0.4078283906,
-0.5243751407,
0.002806291,
0.0755732656,
0.12629053,
0.1804340482,
0.134829089,
0.1348885596,
0.0588198714,
-0.1292319596,
0.0144623816,
0.2798820734,
0.3531159163,
-0.1694984138,
-0.1542185396,
0.3105046451,
-0.2360601574,
-0.1550752819,
0.2192942798,
-0.1841178536,
-0.2138874829,
-0.0357900038,
-0.387501657,
-0.2708045542,
-0.2539887428,
-0.1084268317,
-0.1540878415,
-0.0759231374,
-0.2951632142,
0.1072273627,
0.1636064649,
-0.3497139513,
0.4387900531,
0.069481425,
-0.0635822117,
-0.3524892926,
0.079626672,
-0.1247898489,
-0.0223256815,
-0.1649294198,
0.2461103499,
0.522038579,
-0.0665995181,
0.0762936398,
0.1878409684,
-0.1300727576,
-0.1539221555,
-0.1207381934,
-0.2807799578,
-0.0596729666,
-0.050983984,
0.141392827,
0.2089808136,
-0.0016486611,
0.2444801927,
-0.3578418493,
0.1410296261,
-0.4562767744,
-0.1309576929,
0.0828592703,
0.1161714867,
-0.3526192307,
-0.3164810538,
-0.3232346177,
0.444293946,
-0.043052502,
-0.1600926518,
-0.3826981187,
-0.0633414015,
-0.2830698788,
-0.0609633923,
0.1561421007,
0.2751452327,
-0.110650599,
-0.0312858075,
-0.1239521205,
0.3189248145,
0.3326696157,
0.5696032643,
-0.2447539568,
0.3569864929,
-0.2544994354,
0.0757550895,
0.4739769101,
0.1069934964,
-0.333135128,
0.5554555655,
-0.2506341636,
0.0948170424,
0.2838945687,
0.3763917685,
0.005406633,
-0.0773132294,
0.3804530799,
0.6409077048,
0.099321343,
0.2818817496,
-0.4149834514,
-0.1698680222,
0.3565430939,
0.320394963,
-0.159736231,
-0.0079933628,
-0.0604927875,
0.0265478007,
0.5553175211,
0.0830681771,
-0.0824094266,
0.3210534751,
-0.2720301449,
0.0587018803,
-0.0290380083,
0.2288731486,
-0.1514304727,
0.2201866359,
0.1715944558,
0.0485992916,
0.3051659465,
0.0657509118,
-0.1193263829,
-0.2602834404,
0.014093928,
0.2311346829,
-0.0359793939,
0.1465756446,
0.0275786817,
0.0845096558,
-0.1976341009,
0.4954997003,
-0.1594906747,
-0.1767944098,
-0.4130461216,
-0.0086107627,
0.2972277105,
-0.0934250057,
0.3136304021,
0.2108668387,
-0.0928666592,
-0.0361032821,
-0.0675523579,
0.051184155,
-0.1797375083,
0.2204972357,
0.0708773807,
0.0440139994,
0.0122544896,
0.0982805789,
0.2722419202,
0.090982914,
0.2742017806,
-0.4083102942,
-0.2470960319,
0.4286513329,
-0.1855555475,
0.3100506067,
0.2059780806,
-0.4950375259,
0.0967754126,
-0.0787003413,
0.1401258111,
0.3078310788,
0.9347122908,
0.0532533452,
0.573190093,
0.2683218718,
-0.2199276686,
0.0297278203,
0.3893181682,
0.0419211537,
-0.2104264349,
0.4030592442,
-0.0934763253,
-0.3390427828,
-0.0685085952,
-0.0761766881,
0.4614569247,
0.2961539626,
0.0580204986,
0.0992353112,
-0.0641598925,
-0.2159319669,
0.1537288129,
-0.0942451209,
0.255558759,
0.2294764072,
0.3906421661,
0.0528666824,
-0.4248521328,
-0.2571045756,
0.1408697367,
0.2379992157,
-0.147553429,
0.0597532243,
-0.1099085063,
-0.2331698984,
-0.2371131778,
-0.0225571487,
-0.3098427951,
-0.2584303617,
-0.0043130862,
0.1100212485,
0.3845560849,
-0.1346056163,
0.0247181728,
0.1140386164,
-0.2102862298,
-0.5112169385,
-0.2711207867,
-0.0879053921,
-0.3018431962,
-0.0550501347,
0.3290279508,
0.2225958407,
0.3865031004,
0.1510089636,
-0.2263595015,
0.2597495317,
-0.1077080667,
-0.2157651484,
-0.0335097536,
-0.0349800438,
-0.1383124888,
0.0841219574,
0.2307424992,
-0.1095755771,
0.0561305657,
-0.2220049351,
0.0077632591,
0.1120326072,
-0.0254598912,
-0.0381969959,
-0.0416818559,
-0.2548494935,
-0.2413042039,
-0.2299584597,
0.4161981642,
0.0681595504,
0.1740724146,
0.1358340979,
-0.1965061873,
0.1943803728,
0.0443458222,
0.2354698777,
-0.0281360578,
-0.2152679563,
0.2673338652,
0.2650150955,
-0.2199539095,
-0.2671394944,
0.0153985098,
0.0858295187,
0.1488802284,
-0.5809307098,
0.3345521688,
-0.4182728231,
-0.0692414194,
-0.2077292651,
0.1764720082,
0.1968635917,
-0.2409965992,
-0.0182339922,
0.1923281848,
-0.1248922497,
0.0034073815,
-0.0392328575,
0.2761058509,
-0.1794133186,
0.3650553823,
-0.0341829881,
0.090649277,
0.1667856872,
-0.0816929489,
0.1674100161,
-0.1193682998,
0.1143756211,
-0.23528032,
-0.2032781541,
0.1629550457,
-0.1864679903,
-0.0897846147,
0.2568529844,
0.0274924338,
-0.4070491493,
-0.3549591899,
-0.0481116697,
-0.0825888887,
-0.085396111,
0.1855714023,
-0.3649889827,
-0.0593573712,
0.0400200635,
-0.1426536292,
-0.314499557,
-0.6197458506,
-0.2300229967,
0.0844620541,
0.1375824511,
-0.0740301013,
-0.2278306037,
0.0216149949,
-0.8412262797,
0.1526017487,
-0.141340524,
0.4096384943,
0.16379264,
-0.1827288717,
0.2759930193,
-0.1852126867,
0.4483211637,
0.3038880825,
0.0324013196,
0.2335908413,
-0.1467617303,
-0.3926109076,
-0.0477922671,
-0.1070291251,
0.2448952049,
0.5613287687,
0.3527039289,
-0.272077322,
-0.1386844665,
0.0024800096,
0.3695440888,
-0.012694262,
-0.6122844815,
-0.3650853932,
-0.1350599974,
-0.0601900555,
-0.0026263706,
-0.2886675,
0.2873372436,
-0.2353352904,
0.0492123067,
-0.0060794223,
-0.0807551891,
0.2711712718,
-0.0706981719,
-0.1623621285,
0.1187892705,
0.4574928582,
-0.0586016066,
0.2564568818,
-0.3785608709,
0.5095498562,
0.1829783767,
0.0380831026,
0.0153012909,
0.0238469578,
0.4698844254,
0.3147839606,
0.0485966876,
0.0663199276,
-0.2757833302,
0.2096985281,
-0.2540391088,
0.1294778734,
0.1517031491,
-0.0418042466,
-0.5193066001,
-0.5292095542,
0.777181685,
0.2270501703,
0.0676863641,
0.4543681443,
-0.5446686149,
-0.026205577,
0.1825096905,
-0.4340158105,
0.7889277935,
-0.4708958268,
0.3348893523,
-0.1298365593,
0.1203686148,
0.4411839247,
-0.1899067611,
0.13109231,
-0.3771680892,
0.0294764936,
0.2731671929,
-0.0956339389,
-0.1184746623,
0.3176464736,
-0.073803477,
0.4722571075,
0.25845474,
-0.3552273512,
-0.1696263552,
0.7586441636,
-0.1207197309,
-0.2305502146,
-0.5395560861,
0.0640119165,
-0.2327132672,
0.3239290118,
0.0217366293,
0.0979683474,
-0.0537596792,
-0.0609821826,
-0.2237213254,
0.1100684404,
-0.0275562964,
-0.0205849484,
-0.1457598805,
-0.2827556729,
0.1525994539,
0.1017947495,
-0.3295612037,
0.1563993543,
-0.2382218391,
0.1088366061,
-0.3978131413,
0.2081934959,
0.4564133584,
-0.1122379452,
0.1258046478,
-0.1088594869,
0.014994964,
-0.2972333133,
-0.0178751126,
-0.251060009,
-0.129437834,
0.2271602154,
0.1268540621,
0.0994769186,
-0.3163398504,
0.357756108,
0.0811466426,
-0.0535729453,
0.0244536139,
0.3702231646,
0.0564928278,
0.3644558489,
-0.0608206764,
-0.4419074059,
-0.0018404056,
0.5231866837,
0.1518365592,
-0.2070402056,
0.4211270809,
0.2474678755,
-0.0902625546,
-0.0967122242,
0.0415418148,
-0.0495921783,
-0.205309391,
0.0923493057,
0.1201531589,
0.1430459619,
0.231980741,
0.078392446,
-0.1750056744,
-0.1093823016,
0.162465319,
-0.423237741,
-0.2539901137,
-0.0248526707,
0.0481915213,
0.0019837408,
0.0486335903,
0.079755336,
0.1869951934,
0.377638489,
-0.1662531495,
0.1743820012,
0.0721224621,
0.1036488339,
-0.0061725117,
-0.2430628836,
0.0630406886,
0.2165744007,
0.1129324436,
-0.0198086221,
0.1745864898,
-0.2009207457,
0.002604261,
0.1835122108,
-0.0430873297,
0.0688632652,
0.0614924207,
-0.3432898223,
-0.1306153536,
-0.2249417305,
-0.0999412537,
0.0214992613,
-0.0673014671,
-0.1040213704,
0.0291912667,
-0.0268022455,
-0.0449197553,
0.294644624,
-0.3951109648,
0.1822444797,
0.080560118,
0.0715901107,
0.2153648734,
0.0801527053,
-0.4667015374,
-0.1507306993,
0.2663045526,
0.0661005378,
0.0761101022,
-0.1938565075,
0.0803809017,
-0.0789444596,
0.3015626967,
0.6326428652,
0.0356381647,
-0.1744020283,
0.1561798453,
0.0443390682,
-0.2226137072,
-0.0299927033,
0.2348786741,
0.1107999831,
-0.0944955051,
0.1726057529,
0.1578611284,
0.0348747969,
-0.3649466634,
-0.2931928933,
0.3783936501,
-0.1450098753,
-0.2155482918,
0.0737630352,
0.1551504433,
-0.1362433881,
0.2214230448,
0.1097291857,
-0.0541348197,
0.3969764113,
0.1784482002,
0.2801536918,
0.0857570097,
0.384337157,
-0.4132493436,
-0.2857426107,
-0.1096894071,
0.527492404,
0.0916659385,
0.1056271195,
-0.107756041,
0.306468904,
-0.0640577823,
0.0382282622,
-0.1252683699,
0.1632191539,
0.0184707399,
-0.1065422893,
0.203377381,
0.0467288941,
-0.212821275,
0.2236434519,
-0.2771656215,
0.0626527071,
-0.0550696105,
0.3192573488,
0.067442216,
-0.4905601442,
0.121072419,
0.0442173593,
0.2686517239,
-0.2866967916,
0.0787749216,
-0.2237660885,
0.0632781535,
0.1702564359,
0.1406056732,
0.3410822153,
0.34329018,
0.0281057023,
0.1186328158,
-0.2185479999,
0.0354688913,
-0.2167065889,
0.3741959333,
0.2538715303,
0.2906409502,
0.165825963,
-0.0742583945,
-0.0946065485,
0.1677907854,
-0.0720724761,
-0.1949002594,
-0.2488557696,
0.31402722,
0.1776566803,
-0.249556005,
-0.5308211446,
0.1827209294,
-0.3358587623,
-0.2405872792,
0.2575833499,
-0.0709406286,
-0.041386731,
-0.0808176696,
0.0254606418,
-0.2634347081,
0.6369584799,
0.4109858274,
0.4801577628,
-0.1210206002,
-0.2793285251,
-0.5621273518,
-0.1291813254,
-0.2751354277,
-0.0435370393,
0.0579453669,
0.3335484862,
0.0830456316,
0.1002025753,
-0.4502341151,
0.2044606805,
-0.1037473232,
-0.0798094273,
-0.3132748008,
-0.0057977359,
-0.2921863198,
0.0728383362,
-0.141101867,
-0.2275682092,
0.0896722525,
-0.2099713385,
-0.101742655,
-0.1241569296,
0.1359552741,
0.1137381643,
0.449231416,
0.2314136922,
0.0325364694,
0.5327632427,
0.0228705704,
-0.0580754802,
-0.4057280123,
0.1121146306,
-0.2275321037,
0.2010346949,
0.1649187654,
-0.0808060393,
-0.1234427989,
0.3457123935,
0.0963355303,
0.2587435544,
-0.1959905624,
-0.0336915217,
-0.0136604607,
-0.2710325122,
0.0340367928,
0.3747012317,
-0.0256209038,
0.505181253,
-0.0297330394,
-0.0742560849,
-0.3696880341,
0.0132707804,
0.2512351274,
-0.29221192,
-0.0986906067,
-0.2751004398,
0.3505185843,
0.005511919,
0.3568723202,
-0.3684666753,
-0.0565421954,
0.3812300265,
0.100053668,
-0.0715709701,
0.5711696744,
0.0044753738,
-0.1593970954,
-0.0558203943,
0.1501428336,
-0.0609392002,
-0.1890613735,
-0.298586607,
-0.2125806063
] |
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | @agemagician you can give a try at a multithreaded version if you want (currently on the #548).
To test it, you just need to copy the new `text` processing script, which is [here](https://github.com/huggingface/nlp/blob/07d92a82b7594498ff702f3cca55c074e2052257/datasets/text/text.py), somewhere on your drive and give its local path instead of `text` to `load_dataset`. E.g. in your example:
```python
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('./datasets/text.py', # path to where you've downloaded the multi-threaded text loading script
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
``` | I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ? | 76 | Very slow data loading on large dataset
I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ?
@agemagician you can give a try at a multithreaded version if you want (currently on the #548).
To test it, you just need to copy the new `text` processing script, which is [here](https://github.com/huggingface/nlp/blob/07d92a82b7594498ff702f3cca55c074e2052257/datasets/text/text.py), somewhere on your drive and give its local path instead of `text` to `load_dataset`. E.g. in your example:
```python
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('./datasets/text.py', # path to where you've downloaded the multi-threaded text loading script
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
``` | [
-0.2351229638,
-0.1589973271,
-0.1092257947,
0.1558214277,
-0.0522062443,
0.0935268253,
0.2101727873,
0.3628000021,
0.2417388856,
-0.1184582263,
0.0937152505,
0.2659689486,
-0.1745840311,
0.2608833015,
0.2182200849,
0.1295302957,
-0.0509271845,
0.1763874441,
0.0324815772,
-0.0707732812,
0.0224849507,
0.0642836317,
-0.2663990557,
-0.2938196361,
-0.1675430834,
-0.0311636291,
-0.0605308414,
0.1297038496,
-0.0731487274,
-0.2301614285,
-0.0335926302,
0.1957898885,
0.02794962,
0.2765223682,
-0.0001104621,
-0.2245027572,
0.3785299063,
0.0646179318,
-0.1945027113,
-0.108168602,
0.3321501017,
-0.5667903423,
0.1606935263,
-0.1560639292,
0.0011674836,
0.0118142106,
0.0906920806,
-0.0626588166,
-0.0046143681,
0.0926828235,
0.1748110801,
-0.0288206451,
-0.2254573405,
0.2356195748,
0.0194674283,
0.1631800681,
-0.035190437,
0.337980777,
0.4009986818,
-0.1895383894,
-0.3002910018,
0.1376228184,
-0.1286111325,
0.1814780533,
0.2808216214,
0.0355102345,
-0.177979365,
-0.1237100065,
0.0881626457,
0.3048613071,
0.436606735,
-0.0966572464,
-0.0096874423,
-0.4623737335,
0.0237907618,
-0.2291668653,
0.2377025485,
0.2097757757,
-0.2134125233,
-0.3197594881,
-0.435069561,
-0.2194340229,
-0.0508469045,
0.1706945449,
0.1299185455,
0.1563620269,
-0.0357266292,
0.0971488059,
0.3201566637,
-0.0055311811,
-0.0444112159,
-0.0709360465,
0.2302668691,
0.4172829986,
-0.4461004436,
-0.072912246,
0.1141119301,
0.2020265609,
0.0926702991,
0.0474265888,
0.1258239746,
0.0924157202,
-0.1682851315,
0.04492791,
0.2149931192,
0.3550409079,
-0.2170165777,
-0.2151006758,
0.3845050931,
-0.0908295065,
-0.1448481977,
0.2591443658,
-0.0805853456,
-0.1868590266,
-0.0264055878,
-0.3573793173,
-0.1944362372,
-0.3192494512,
-0.0887170583,
-0.1874367595,
0.0210011154,
-0.2330122292,
0.1086942479,
0.2244623899,
-0.3101557493,
0.2738786936,
0.0386035293,
-0.0659092665,
-0.3392502964,
0.0695980638,
-0.1426310241,
0.0815393254,
-0.1882120967,
0.182324186,
0.6098544598,
-0.0081199314,
0.1737591326,
0.1366222054,
-0.2034405172,
-0.1409622729,
-0.1883975118,
-0.279794693,
-0.1239787042,
-0.0474042073,
0.0543629155,
0.1805571765,
-0.0143887382,
0.269684881,
-0.3400959373,
0.1253843904,
-0.3222603798,
-0.1604732424,
0.0425278917,
0.1878407896,
-0.4041227102,
-0.4088692963,
-0.2737721205,
0.4251867831,
-0.0619104095,
-0.1103430837,
-0.258369565,
-0.0493569039,
-0.3161037564,
0.0233811997,
0.1262766421,
0.2047502697,
-0.0492466167,
-0.0198605284,
-0.1246586591,
0.3684649467,
0.2619369626,
0.5896154642,
-0.1822745502,
0.2504911125,
-0.1912333071,
0.0575081259,
0.4228338003,
0.0254766941,
-0.3447415233,
0.4530773759,
-0.2493124157,
0.035221532,
0.1712363511,
0.3352909684,
0.0411992706,
-0.0678874329,
0.2889813483,
0.6735242009,
0.1797510535,
0.2687717974,
-0.4803861976,
-0.191373378,
0.2381403446,
0.4988000095,
-0.1004821658,
-0.1164525673,
-0.0320143476,
0.059352085,
0.4168198407,
0.1106981933,
-0.0889547095,
0.3573487401,
-0.279951483,
0.0601352789,
0.0224748105,
0.0664098412,
-0.169840768,
0.2152332366,
0.1656544805,
0.1306633949,
0.3433745205,
0.1143691614,
-0.1496987939,
-0.3231547773,
0.0520345606,
0.1409708112,
0.0762308836,
0.0999660194,
0.0605357885,
0.2033242881,
-0.1631734371,
0.3446237147,
-0.1490084231,
-0.2035679519,
-0.3692208827,
-0.0732147768,
0.1707625836,
-0.160667792,
0.3976023197,
0.3120850921,
-0.1375889033,
-0.0085785016,
-0.1242142543,
0.0145771913,
-0.3173177242,
0.32436046,
0.0493603759,
0.1176920161,
0.0104944464,
0.0003514141,
0.35714975,
0.1619953066,
0.233458519,
-0.3475338817,
-0.3124430776,
0.3943001628,
-0.227800861,
0.2335009575,
0.1397651285,
-0.4423325956,
0.0909138843,
-0.1508067846,
0.1609528661,
0.2870856225,
0.864359796,
0.1086476296,
0.4909005761,
0.2559532523,
-0.2232833505,
0.0353814512,
0.5316964984,
-0.0115910396,
-0.1453195065,
0.4955512285,
-0.1591793299,
-0.3933447599,
-0.07640329,
-0.0008985549,
0.3892951012,
0.33653301,
0.0801729634,
0.0598103255,
-0.100460358,
-0.2265397757,
0.2111978829,
-0.1728011072,
0.22050488,
0.1607718319,
0.3817355335,
0.0727362335,
-0.5193352699,
-0.2708148658,
-0.0174346641,
0.3317101598,
-0.0837922841,
-0.0142661892,
-0.0677764267,
-0.3323652148,
-0.2030871063,
0.0228838325,
-0.3021141887,
-0.2011322379,
0.0522106178,
0.1582276225,
0.3566643596,
-0.1291053742,
0.0349884704,
0.1396400332,
-0.2011191696,
-0.4455120862,
-0.2782617211,
-0.1495082974,
-0.387642175,
0.0376854986,
0.2981482744,
0.2329721153,
0.3325831294,
0.1779064238,
-0.2270574421,
0.256395936,
-0.0381542891,
-0.2229506224,
-0.100167267,
-0.0277869795,
-0.190345794,
0.0827778727,
0.1761142612,
-0.1057820097,
0.0916428417,
-0.3298031688,
-0.0209476165,
0.1437875926,
0.0481246635,
0.0051463507,
-0.087323986,
-0.2857705951,
-0.2702841461,
-0.3121493459,
0.5135660172,
0.0740727186,
0.194620952,
0.2714244425,
-0.2012182176,
0.2871984541,
0.0897093117,
0.2429692,
-0.0052619614,
-0.1578310281,
0.3171803355,
0.2386639416,
-0.2387668937,
-0.2388913482,
0.0595538691,
0.0963344947,
-0.0034871921,
-0.6804847717,
0.2912914753,
-0.4513460398,
-0.0845334902,
-0.0776607245,
0.1385238916,
0.2351497412,
-0.1911543161,
-0.1271465123,
0.2458197474,
-0.1807978451,
0.0519635566,
0.0566412807,
0.1780401766,
-0.1369867921,
0.3521681428,
-0.0042591244,
0.0260340758,
0.0612691939,
-0.1209981889,
0.221927762,
-0.0383065417,
0.0854205489,
-0.1794352531,
-0.2658481598,
0.2797510028,
-0.1560488939,
-0.1178875417,
0.3768692017,
0.0356276408,
-0.4968028963,
-0.2630111575,
-0.1910328269,
-0.0802944824,
-0.0930125937,
0.2119566798,
-0.3439241052,
-0.1305856854,
-0.0440255366,
-0.1847308278,
-0.1962886453,
-0.7099230289,
-0.1090597138,
0.1553483903,
0.0283266623,
0.0216540582,
-0.1448138505,
0.056586273,
-0.7600152493,
0.0929536968,
-0.1794434935,
0.4511083364,
0.0435852259,
-0.1010142267,
0.3006531298,
-0.2252384275,
0.3230231404,
0.2441086769,
0.2112260461,
0.22973755,
-0.0903325751,
-0.3802278638,
-0.0638857633,
-0.2193335593,
0.1761721075,
0.5705080032,
0.3222825229,
-0.1027429998,
-0.106600441,
0.0648818016,
0.2419091016,
-0.0228409991,
-0.5537737608,
-0.3349890709,
-0.1996305287,
0.0151830241,
0.1195351556,
-0.2522741556,
0.1908491552,
-0.1686491966,
0.0070362091,
0.082800895,
-0.1175254732,
0.2057753354,
0.0423093252,
-0.2121246755,
0.1324385405,
0.3194183409,
-0.0765130818,
0.1944125742,
-0.467710793,
0.3661049008,
0.2831434011,
0.2096605897,
-0.0112775965,
0.0569319315,
0.3561213017,
0.2310210168,
0.1218744516,
0.0553157292,
-0.2382798493,
0.2259386331,
-0.2462033629,
0.0201346166,
0.1647596508,
-0.00669159,
-0.3034950197,
-0.3959757686,
0.8766255379,
0.1751069427,
0.052159518,
0.4086865187,
-0.4542880654,
-0.0554089546,
0.1856636405,
-0.2767316997,
0.7395021319,
-0.4225723743,
0.2307500243,
-0.1247344166,
0.0336237997,
0.3825364113,
-0.218972683,
0.0912724435,
-0.3892377615,
0.0464172326,
0.315510422,
0.004571788,
-0.1580929011,
0.2595570087,
-0.236660406,
0.4072611034,
0.4061505198,
-0.3258393407,
-0.1774653196,
0.7742530704,
-0.1733980626,
-0.2909516394,
-0.5964767933,
0.1502722949,
-0.3037244081,
0.4257582426,
0.0358875766,
0.0840642005,
0.0225061178,
-0.0011621267,
-0.3838971257,
0.1141906828,
-0.1107713729,
-0.071773544,
-0.1724253893,
-0.2370328307,
0.1862775981,
0.1481086016,
-0.2201445997,
0.2130775452,
-0.2657668293,
0.1285836995,
-0.3506118357,
0.2198165953,
0.514567256,
-0.0815348402,
0.1458852142,
-0.2349338084,
-0.0623746663,
-0.2554593682,
0.0540959872,
-0.2006591856,
-0.0338577777,
0.1091722623,
-0.0494349748,
0.1636490226,
-0.2356079668,
0.2690000534,
0.0268390402,
-0.0462337732,
0.1172100082,
0.3300507665,
0.0246530101,
0.3350315988,
-0.0102443229,
-0.4108899236,
-0.0247598663,
0.5550801754,
0.0905227214,
-0.1743536592,
0.3781974018,
0.2993065715,
-0.1026783809,
-0.1935217083,
-0.0058144331,
-0.1136119813,
-0.1300801337,
0.0919473544,
0.1440706402,
0.0014866292,
0.2302415967,
0.0934486687,
-0.0601607002,
-0.0756296441,
0.0322807878,
-0.4494123459,
-0.3595043421,
-0.0184065066,
-0.0575400293,
-0.0257058851,
0.0484155715,
0.0251543038,
0.318577826,
0.2946864069,
-0.269120574,
0.1614513099,
-0.074661918,
0.022746101,
0.0471830554,
-0.3493124247,
0.110475719,
0.2076110095,
0.1626277268,
-0.1240936965,
0.1149848998,
-0.2853879631,
0.0394420624,
0.106225878,
-0.058101207,
0.0430497825,
-0.0194173958,
-0.3257529438,
-0.0764430761,
-0.1878135949,
-0.0525648966,
-0.037562754,
-0.0514753796,
0.0023721233,
-0.0455617234,
-0.0308667086,
-0.0019328222,
0.1513372362,
-0.4418314099,
0.1661905348,
0.1071059555,
0.0150323901,
0.328692615,
0.0265072286,
-0.5275722742,
-0.0933597907,
0.0859324485,
0.1513929963,
0.0765262097,
-0.1655910611,
0.0297098905,
-0.072044678,
0.257823199,
0.4545101523,
0.064974919,
-0.255161643,
0.1589474082,
0.1518912017,
-0.2063766718,
0.0199878998,
0.2340765446,
0.0335200205,
0.0116145611,
0.1474230289,
0.1710616648,
0.0347310454,
-0.2572360933,
-0.2264292985,
0.3829541504,
-0.2253788114,
-0.2730337083,
-0.0023026755,
0.1371924728,
-0.1771079302,
0.2033014894,
0.0657121241,
0.0528955534,
0.4379096627,
0.2583203018,
0.2723622024,
0.12382631,
0.4825108349,
-0.3734049201,
-0.1923395842,
-0.138087064,
0.3719213307,
0.0837011486,
0.1760031879,
-0.0947281569,
0.3289037347,
-0.1408202648,
0.0054065809,
-0.2425810099,
0.1206111759,
-0.0500649735,
-0.1232206821,
0.1673990637,
0.0422043651,
-0.281521827,
0.2524551749,
-0.2612041235,
0.129765749,
0.0048758723,
0.317258805,
-0.0420137942,
-0.3678250909,
0.0365348905,
0.0090391142,
0.2566406727,
-0.2119983733,
0.2118878514,
-0.1680599451,
0.0850644931,
0.0380126126,
0.2695976198,
0.3048684001,
0.3242581785,
-0.0846754313,
0.0877341256,
-0.1773623377,
-0.0074822083,
-0.2701367438,
0.404450804,
0.1985896826,
0.3803495467,
0.1028905213,
0.0262565054,
-0.1636013687,
0.1692244112,
-0.1378524601,
-0.2667510509,
-0.1030697227,
0.2986754477,
0.2648006082,
-0.2539794445,
-0.570564568,
0.1493516713,
-0.34191221,
-0.2652080059,
0.1564535648,
-0.0367873609,
-0.0252591018,
-0.1077064276,
0.0758046135,
-0.2157667279,
0.7528707981,
0.3798870742,
0.3102046549,
-0.0567046963,
-0.2296940982,
-0.5407271385,
-0.1294490695,
-0.2332153022,
-0.1430045962,
0.0734250396,
0.3392793238,
0.0097256973,
0.0686182603,
-0.366165638,
0.2425914109,
-0.0832143053,
-0.0967298374,
-0.3036372662,
0.1143030226,
-0.2806532383,
0.0782234892,
-0.1154288501,
-0.2335685492,
0.0654677004,
-0.2150259316,
-0.0165010616,
-0.1261471808,
0.1125929952,
0.0025022849,
0.383072257,
0.2040145099,
-0.0181004293,
0.3712950647,
-0.0314273797,
-0.1291225404,
-0.4532831311,
0.1737649292,
-0.2454365492,
0.1342476457,
0.1467777342,
0.0351159796,
-0.0224410817,
0.266422987,
0.0916811228,
0.2971768975,
-0.2757700682,
0.0320414081,
0.0875486955,
-0.2770707607,
0.0480130315,
0.3402580321,
0.0337144211,
0.5124371052,
-0.0249809995,
-0.0983122066,
-0.5657388568,
-0.0551325716,
0.2666741014,
-0.2849324048,
-0.1079977602,
-0.2226087451,
0.3603273034,
0.0314404219,
0.3329225183,
-0.4231390357,
-0.0123159736,
0.2397254407,
0.1389219165,
-0.0445494093,
0.5026025176,
-0.0100920219,
-0.1497849375,
0.0342617929,
0.0675463229,
-0.0645565093,
-0.0972399563,
-0.3322934806,
-0.1884771287
] |
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | I have already generated the dataset, but now I tried to reload it and it is still very slow.
I also have installed your commit and it is slow, even after the dataset was already generated.
`pip install git+https://github.com/huggingface/nlp.git@07d92a82b7594498ff702f3cca55c074e2052257`
It uses only a single thread.
Did I miss something ? | I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ? | 50 | Very slow data loading on large dataset
I made a simple python script to check the NLP library speed, which loads 1.1 TB of textual data.
It has been 8 hours and still, it is on the loading steps.
It does work when the text dataset size is small about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```
train_files = glob.glob("xxx/*.txt",recursive=True)
random.shuffle(train_files)
print(train_files)
dataset = nlp.load_dataset('text',
data_files=train_files,
name="customDataset",
version="1.0.0",
cache_dir="xxx/nlp")
```
Is there something that I am missing ?
I have already generated the dataset, but now I tried to reload it and it is still very slow.
I also have installed your commit and it is slow, even after the dataset was already generated.
`pip install git+https://github.com/huggingface/nlp.git@07d92a82b7594498ff702f3cca55c074e2052257`
It uses only a single thread.
Did I miss something ? | [
-0.1991717666,
-0.1527014822,
-0.0914237127,
0.1850313246,
-0.0580271333,
0.0069910884,
0.0542508848,
0.3648397923,
0.2267757356,
-0.1864668727,
0.1637937427,
0.2714457512,
-0.098570101,
0.1658662558,
0.1985836029,
0.1851770729,
-0.0191422924,
0.2842984796,
0.1013759226,
-0.1887505203,
0.0956260115,
0.0487127453,
-0.1847580969,
-0.2295957804,
-0.1763941795,
0.0025340468,
-0.0277064927,
0.10095267,
-0.1621126384,
-0.2173818499,
0.0553881824,
0.1236787811,
-0.0306910463,
0.3531836867,
-0.0001217054,
-0.2242821902,
0.3527977765,
0.1074255332,
-0.1829124391,
-0.0307796374,
0.2109850645,
-0.6483903527,
0.1963074505,
-0.1187643409,
-0.0270371251,
0.0886711851,
0.0861746147,
-0.0851285905,
0.0995742157,
0.1071524769,
0.09661071,
-0.1017710716,
-0.2455631793,
0.2750655413,
0.0188058019,
0.3272773921,
-0.0368347578,
0.3568098843,
0.3645659089,
-0.1647836864,
-0.1937815845,
0.0544301718,
-0.1813872904,
0.176754564,
0.3036624789,
0.0144899338,
-0.0835128799,
-0.1341195107,
0.1963966191,
0.3170996308,
0.5264440775,
-0.1180864722,
-0.0595810115,
-0.4169435501,
0.041528549,
-0.1729480028,
0.1447461545,
0.2603042126,
-0.2250423133,
-0.2841114998,
-0.4277852774,
-0.41022259,
-0.0714328736,
0.2077245563,
0.1785084605,
0.0898233727,
-0.0192155149,
0.1839961112,
0.2658655643,
0.1081896648,
-0.1406802237,
-0.0842886344,
0.1884382665,
0.4555002153,
-0.3959885538,
-0.0676119775,
0.1481574774,
0.1502483934,
0.131848231,
0.082677573,
-0.0320090689,
0.0501999892,
-0.1457348168,
-0.0421878919,
0.2815546393,
0.388104707,
-0.2023485303,
-0.1266424656,
0.4440859556,
-0.2445269525,
-0.1741782129,
0.241317451,
-0.1810798496,
-0.1101008505,
0.0550282001,
-0.3759532869,
-0.1100719944,
-0.3834214211,
-0.1334658563,
-0.0778845698,
0.0038251877,
-0.2980486751,
0.0217746422,
0.2225459069,
-0.3536616564,
0.2660469711,
0.0385540724,
-0.1337033212,
-0.412512362,
0.0827439129,
-0.1261990964,
0.0016381778,
-0.1777823269,
0.246743083,
0.510446012,
-0.1568820775,
0.1790260971,
0.184855178,
-0.204681471,
-0.2262262106,
-0.2678072751,
-0.2743784785,
-0.2211676836,
0.0030920673,
-0.0159183964,
0.2825621367,
0.0864945874,
0.210014239,
-0.366464138,
0.1766587794,
-0.4176151156,
-0.1564799845,
-0.0340587869,
0.0695468858,
-0.4505045116,
-0.2972316444,
-0.2967458367,
0.3501702547,
-0.021043513,
-0.0672933161,
-0.2913910747,
-0.0671825707,
-0.2015487999,
-0.0603888258,
0.1400725245,
0.2322686017,
-0.0928429663,
-0.0590415113,
-0.1035437882,
0.2933354974,
0.3807815909,
0.6156038642,
-0.1922992468,
0.3422889113,
-0.1217294261,
-0.0452180207,
0.5736958981,
-0.0108834952,
-0.4533560574,
0.5298501849,
-0.2530939579,
0.0212490112,
0.1293160617,
0.3815021515,
0.0154523514,
-0.03429088,
0.2500490546,
0.5158033371,
0.1687351465,
0.200635016,
-0.4838426113,
-0.1462136954,
0.2273880988,
0.3783595562,
-0.1032707244,
0.0150908921,
0.0358742885,
0.0942553878,
0.4295170903,
0.169292599,
-0.1244560331,
0.3971878886,
-0.1047699302,
0.0351576768,
0.0449833907,
0.1743529737,
-0.160843268,
0.167067796,
0.1669220626,
0.1247470379,
0.4018217921,
0.1500276625,
-0.2643033564,
-0.1956247985,
0.0506968088,
0.146817565,
-0.04986424,
0.0479864106,
0.0912838131,
0.1791481227,
-0.1856813133,
0.3405305743,
-0.1098366082,
-0.2161189914,
-0.3428981602,
-0.0301034972,
0.2130297273,
-0.1742121279,
0.4311962724,
0.3232761621,
-0.1331784725,
-0.0647698343,
-0.0922234207,
-0.0369057618,
-0.2966367006,
0.2482909262,
0.1047017202,
0.1432349533,
-0.0030895453,
-0.0303133726,
0.3299259841,
0.023628518,
0.2452351153,
-0.4048092067,
-0.1249652356,
0.3379196525,
-0.2271559536,
0.2508985698,
0.0562171414,
-0.5008865595,
0.1578983665,
-0.1253254712,
0.1313191205,
0.3620716929,
0.9713239074,
0.0677561164,
0.5750045776,
0.2706083357,
-0.1945349872,
0.0007567555,
0.527556777,
-0.0308703631,
-0.1748846769,
0.5408585668,
-0.137752831,
-0.3145549893,
-0.0478421636,
-0.1603898406,
0.4037879705,
0.2320176065,
-0.0110660754,
0.0500108227,
-0.0768372938,
-0.2358859479,
0.1037638038,
-0.122410059,
0.300744772,
0.1260946542,
0.3335479498,
0.0369230546,
-0.4046321511,
-0.1581194401,
-0.0177755952,
0.5295584202,
-0.0765734762,
0.073088184,
-0.0248252712,
-0.3180571795,
-0.2187277675,
-0.0239543244,
-0.2930157185,
-0.1844210029,
-0.0439072512,
0.0747430176,
0.5305740833,
-0.1234556139,
0.1065891832,
0.1333360672,
-0.2151920795,
-0.4275612533,
-0.3017469645,
-0.1941063404,
-0.3608126044,
-0.0455352329,
0.3237593472,
0.2041280866,
0.3716341555,
0.1838138998,
-0.2097916901,
0.1934050769,
-0.0787156746,
-0.2579253912,
-0.1157395914,
0.0656481758,
-0.2042789161,
0.2544606328,
0.1804485172,
0.0209542476,
0.1526816338,
-0.3680245578,
-0.0320832133,
0.133816123,
-0.1131732911,
-0.0788569376,
-0.0251729824,
-0.1890714914,
-0.2600288689,
-0.1960692555,
0.5295094252,
0.1719644964,
0.2020898461,
0.2145919502,
-0.1746445,
0.2627964914,
0.1632846892,
0.1996219158,
-0.090522483,
-0.2076040059,
0.2065791488,
0.338142097,
-0.2271893471,
-0.1761687994,
0.1211568862,
0.1823152453,
-0.0823854655,
-0.7940249443,
0.2953014672,
-0.4839647114,
-0.0214369223,
-0.0950322896,
0.152950868,
0.2004409581,
-0.2212818712,
-0.0338634886,
0.2205285579,
-0.269723475,
-0.0121430717,
0.0616769753,
0.1756334603,
-0.162467882,
0.4543954134,
-0.0773450509,
0.1778041422,
0.2052346319,
-0.1453510672,
0.1505453438,
-0.0238700565,
0.0915133134,
-0.1570836604,
-0.2992314696,
0.260060817,
-0.1520545483,
-0.1015936434,
0.3296045065,
0.0612489991,
-0.4600825906,
-0.2959477603,
-0.1237132028,
-0.0436813906,
-0.0430812575,
0.1301202625,
-0.3121578097,
-0.1345962882,
0.0089059472,
-0.1670846343,
-0.2617810071,
-0.7116494179,
-0.1529485583,
0.139778614,
0.0989072621,
0.0223067105,
-0.2154082954,
-0.0797725245,
-0.8438500166,
0.0574548803,
-0.2956528068,
0.4102830887,
0.0209304839,
0.0098652765,
0.4633245468,
-0.2124313563,
0.3210785389,
0.3303011358,
0.0987617224,
0.1878582239,
-0.1014783978,
-0.4424764812,
-0.0420519523,
-0.140691027,
0.3415519893,
0.6328334212,
0.2685374022,
-0.2692621052,
-0.1277538687,
0.0687531531,
0.2400994599,
-0.1106043234,
-0.5722795129,
-0.4723234177,
-0.2107155323,
-0.0132699162,
0.1024188921,
-0.3304751515,
0.1581961364,
-0.1131991595,
0.0735713467,
0.0565331131,
-0.0732528493,
0.272544682,
0.0096540824,
-0.2714056075,
0.1710287035,
0.2738585174,
-0.0345492661,
0.1695297211,
-0.4581919312,
0.4365378618,
0.2314760536,
0.2155779451,
0.0007070936,
0.0947540104,
0.4200230241,
0.2714298069,
0.1474986374,
0.066440694,
-0.1856098771,
0.2110602856,
-0.331625998,
0.0767131001,
0.1759398878,
-0.0090969093,
-0.4444129467,
-0.4641417861,
0.8058823943,
0.1324351579,
0.0663397908,
0.4682040811,
-0.3776952326,
0.0978347957,
0.0659834445,
-0.2738633156,
0.8955383301,
-0.4456985593,
0.1810888052,
-0.0647237971,
0.0101053789,
0.4141215682,
-0.148463428,
-0.0020362195,
-0.43854931,
0.0917850733,
0.2670579255,
-0.0993267149,
-0.0731004998,
0.1387939751,
-0.1598207355,
0.4082348943,
0.3856762648,
-0.3247514069,
-0.1209255308,
0.8059946299,
-0.0009168033,
-0.1387108564,
-0.5118179321,
0.0789531618,
-0.3942386508,
0.5236883759,
-0.0254258923,
0.0788045675,
-0.107371822,
0.0096577629,
-0.3741384149,
0.1029962227,
-0.0440151915,
-0.0804705471,
-0.0706520006,
-0.3025925756,
0.1593976468,
0.0480738953,
-0.1583144367,
0.1963737607,
-0.2845976353,
0.1651973575,
-0.3597009778,
0.229522109,
0.511716783,
0.0027040094,
0.0890643671,
-0.2288411856,
-0.0836589187,
-0.1240885481,
-0.00907759,
-0.3102368116,
-0.0539713353,
0.1654185951,
0.0732494593,
0.1553657353,
-0.2900009751,
0.3394243121,
0.0487428755,
-0.0190634429,
0.0211601146,
0.4879477918,
0.0691232309,
0.3877799809,
0.0445621684,
-0.4567279816,
-0.0220169052,
0.473484993,
0.1988560855,
-0.1774842739,
0.4827732742,
0.2412370741,
-0.1307264566,
-0.1122943088,
0.0090098232,
-0.1025875658,
-0.2773763537,
-0.0096221883,
0.1534975469,
0.0580187589,
0.2951737046,
0.1091603786,
-0.120433107,
-0.061847765,
0.0590345562,
-0.3825007677,
-0.3001972437,
-0.0500293672,
-0.0319830514,
-0.0089847064,
0.0168448761,
0.0097813793,
0.2448108494,
0.4505027831,
-0.1600067317,
0.1078562513,
-0.0533719584,
0.0784185603,
0.1320560724,
-0.2580535412,
0.0839032829,
0.1609607786,
0.0528895855,
-0.1663526148,
0.0882006437,
-0.1834826469,
0.1023487002,
0.1997906268,
-0.0797453672,
0.0218735337,
0.0000603497,
-0.2671489418,
-0.0195671022,
-0.1199632436,
-0.1016683653,
-0.0600433648,
-0.0506259426,
-0.003646642,
0.0608454198,
-0.0245202631,
0.0461867452,
0.1974098533,
-0.4216268659,
0.269349426,
0.1194001734,
0.0213583391,
0.4252481461,
0.0188261494,
-0.6070495248,
-0.123491779,
0.2083403766,
0.1969068199,
0.1300874203,
-0.2446460873,
0.1431638896,
-0.02189596,
0.2003122121,
0.4992821217,
0.0707610846,
-0.2289081216,
0.1936348826,
0.0427842885,
-0.185431838,
-0.0000551641,
0.3791793287,
0.1669327915,
-0.1386949569,
0.1918567419,
0.1223264039,
0.0853969902,
-0.3114672899,
-0.2537025511,
0.4503019154,
-0.2387216687,
-0.2689576745,
-0.0144662149,
0.1872868538,
-0.1875884682,
0.1477090418,
0.0865148529,
-0.0531235412,
0.3869541883,
0.1852476448,
0.2312437445,
0.1526881903,
0.4042870998,
-0.3817748725,
-0.1503843069,
-0.1685388237,
0.4807797968,
0.0616285428,
0.1368675977,
-0.0440533757,
0.2922137082,
-0.0140512493,
-0.022286566,
-0.2037828714,
0.1579190642,
-0.0241179802,
-0.0707796589,
0.1406329572,
0.1290699542,
-0.1040129215,
0.2433005869,
-0.2980499566,
0.1007593349,
-0.0613964163,
0.3115347028,
-0.0987395942,
-0.4200947881,
0.0597671084,
-0.050638739,
0.2242968827,
-0.1350818425,
0.2209186256,
-0.2688322961,
0.0531477332,
-0.0043942393,
0.2688493133,
0.2770282328,
0.4238792658,
0.0000268668,
0.104369998,
-0.0800284743,
-0.0206313655,
-0.1851528883,
0.4359557033,
0.1467039585,
0.3372706473,
-0.0445447266,
-0.0802006796,
-0.0651486963,
0.1318267286,
-0.1469192058,
-0.2350209504,
-0.1072513014,
0.3373670578,
0.1562325954,
-0.3151373267,
-0.5273731351,
0.1796393692,
-0.1902775019,
-0.1569945812,
0.087498337,
-0.0946275964,
-0.0411556512,
-0.1593942195,
0.0105489194,
-0.3326227367,
0.7124176621,
0.3719669878,
0.4264705777,
-0.145303607,
-0.2639221251,
-0.5644289851,
-0.0651562214,
-0.3483982682,
-0.1616449952,
0.0900788158,
0.3021502197,
0.0062262528,
0.0611701198,
-0.3685866594,
0.0544764921,
-0.0754302293,
-0.0999558717,
-0.2846450508,
0.0190536678,
-0.3252052963,
0.0777172148,
-0.1635838449,
-0.1889845878,
0.1147747934,
-0.2919635177,
-0.1257846802,
-0.1764402688,
0.0499907732,
0.0072850734,
0.5267720222,
0.2606076896,
0.0163788851,
0.3779639602,
-0.0012550876,
-0.1727335006,
-0.3434292674,
0.0821247399,
-0.1888544708,
0.2389382422,
0.0655280203,
0.0062900931,
-0.1153210029,
0.2998865545,
0.0443258546,
0.2522704005,
-0.27022174,
0.0204275735,
0.0721704066,
-0.2926416397,
0.0333511382,
0.3275590539,
0.0152922049,
0.3754198253,
-0.0113638435,
-0.0416370109,
-0.5524223447,
0.1342729181,
0.3410865963,
-0.2705781758,
-0.0809869394,
-0.1680970192,
0.3409722447,
0.0285034515,
0.2374149859,
-0.3947708309,
-0.0690631717,
0.3056739569,
0.1608584374,
-0.0737949312,
0.5093422532,
-0.0380031131,
-0.1087642908,
0.0135921985,
0.1959218234,
-0.1505628228,
-0.1879007816,
-0.2074456662,
-0.205204919
] |
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | As mentioned in #548, each time you call `load_dataset` with `data_files=`, the files are hashed to get the cache directory name. Hashing can be too slow with 1 TB of data. I feel like we should have a faster way of getting a hash that identifies the input data files. | I made a simple Python script to check the loading speed of the `nlp` library; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small, about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```python
import glob
import random

import nlp  # Hugging Face `nlp` library (now published as `datasets`)

# Collect the raw text files and shuffle them before loading.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing? | 49 | Very slow data loading on large dataset
I made a simple Python script to check the loading speed of the `nlp` library; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small, about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```python
import glob
import random

import nlp  # Hugging Face `nlp` library (now published as `datasets`)

# Collect the raw text files and shuffle them before loading.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing?
As mentioned in #548, each time you call `load_dataset` with `data_files=`, the files are hashed to get the cache directory name. Hashing can be too slow with 1 TB of data. I feel like we should have a faster way of getting a hash that identifies the input data files. | [
-0.1172168553,
-0.0360580087,
-0.0959251076,
0.3137767017,
0.0166611802,
0.1080763638,
0.1881460696,
0.562651515,
0.3573101759,
-0.0834074914,
0.0841230899,
0.0788890794,
-0.275165379,
0.1463638544,
0.303257376,
0.152251929,
0.0406836905,
0.2266362607,
0.0538095236,
-0.1176901013,
0.0696329921,
-0.0204688106,
-0.0299105793,
-0.296864748,
-0.3017107844,
0.030994555,
-0.0536713526,
0.0801733881,
-0.0924632028,
-0.2358046025,
-0.0621912591,
0.1432611495,
0.0518746302,
0.2695996463,
-0.0001230621,
-0.1986708045,
0.3916835785,
-0.010774821,
-0.2250650078,
-0.1616008282,
0.0600102693,
-0.7300630808,
0.1203290299,
-0.1630463898,
0.0337216929,
-0.0094954092,
0.2076352984,
-0.2160122395,
-0.2010137141,
-0.0743043423,
0.0629802048,
-0.2378573269,
-0.3024275899,
0.4499880672,
0.3159752488,
0.136291787,
-0.0163583383,
0.379699409,
0.2599540949,
-0.2834987044,
-0.3445347548,
0.1477045566,
-0.0814256519,
0.0042502806,
0.3230666816,
0.2134280354,
-0.1169726998,
0.0059479829,
-0.0009595603,
0.3054632545,
0.5086731911,
-0.2065641731,
-0.2373919934,
-0.5030114651,
-0.0786999017,
-0.1927807182,
0.2358421683,
0.1443260014,
-0.2236378491,
-0.2345632017,
-0.4230128229,
-0.3044025004,
-0.0414097756,
0.1779717058,
0.2704319954,
-0.1131950021,
0.1272665709,
0.0756367147,
0.3568821847,
0.0150676072,
0.1043338329,
-0.1023491472,
0.3182213306,
0.3189168274,
-0.3557968736,
-0.0773671195,
0.0799769312,
0.340067029,
0.0997813493,
0.2149658054,
0.2113414556,
0.1314076632,
-0.1452825665,
0.0403847396,
0.1335645169,
0.4111125469,
-0.1875095963,
-0.2380566746,
0.3621020019,
-0.3224427104,
-0.337805748,
0.2185865045,
-0.1644864231,
-0.2672838569,
0.2383484393,
-0.2126471549,
-0.302836746,
-0.2935048938,
0.0493218899,
-0.0200051274,
-0.0900316238,
-0.2638336718,
0.2086574435,
0.2353862226,
-0.2940383554,
0.2273738682,
-0.0614146516,
-0.1680903137,
-0.3403707147,
0.115308851,
-0.1505287141,
0.1385419369,
-0.1539923102,
0.1784457862,
0.629375577,
-0.1461411119,
0.1557848901,
-0.1494505554,
-0.0369183831,
-0.0314945281,
-0.1652009785,
-0.2768215835,
0.0819733962,
-0.047023505,
-0.1023335457,
0.113205269,
0.0330058597,
0.2430368662,
-0.4355453253,
0.036255002,
-0.3797472715,
-0.1476531923,
0.2258040756,
0.0793962181,
-0.4669114947,
-0.2244959474,
-0.2262621075,
0.3716503978,
0.0188339576,
-0.2251744568,
-0.3301510811,
0.0527155958,
-0.2251860201,
-0.0896676332,
0.0870075524,
0.2484912574,
-0.0564144291,
-0.0262246579,
0.0206191055,
0.2286194116,
0.1919907629,
0.454819262,
-0.1003706753,
0.3429882228,
-0.269058615,
0.0256049633,
0.5262585282,
-0.0462650433,
-0.5724009275,
0.4323176146,
-0.2308466136,
0.0318440199,
0.3767979145,
0.5546632409,
0.0004537255,
-0.1959046274,
0.3050214052,
0.5151876211,
0.0486500487,
0.2658168077,
-0.4664852023,
-0.1919651777,
0.3387051225,
0.369599402,
-0.0656164438,
-0.0110719651,
0.1641365141,
0.1056738794,
0.3326954246,
0.1877185702,
-0.0577580556,
0.3255423307,
-0.1506642997,
-0.0861675143,
0.0373498052,
0.1127424389,
-0.2608893514,
0.1496978402,
0.1009581238,
0.0266807545,
0.026029041,
0.0920090973,
-0.0673763752,
-0.2983822227,
0.100933373,
0.3866146207,
-0.0716903135,
0.0765952021,
0.1257009357,
0.0621482655,
-0.1450595111,
0.4783866704,
-0.0661611184,
-0.1453858316,
-0.3675589561,
-0.0822643265,
0.2519704998,
-0.1136754453,
0.2339880615,
0.305456996,
-0.1155246794,
-0.1129524708,
-0.0965615362,
-0.0622178502,
-0.1939639747,
0.1687041223,
0.1451288164,
0.3294304907,
0.0461908281,
0.0611431822,
0.3061033785,
0.1154014915,
0.2048128545,
-0.4098869562,
-0.2786921859,
0.4903715253,
-0.2560999393,
0.1333203614,
0.1243943125,
-0.4574449956,
0.0698685423,
-0.1131151244,
0.1526698768,
0.1416290849,
0.7772764564,
-0.0267723482,
0.599798739,
0.3419164717,
0.0416852012,
-0.0370969698,
0.6666423082,
-0.0325511396,
-0.2055318952,
0.3865838051,
-0.0312759988,
-0.4225258231,
-0.0910394341,
0.001656644,
0.4693946838,
0.2598972321,
0.146708563,
0.0442967787,
-0.0494516976,
-0.2745338082,
0.156100139,
-0.1106316745,
0.0547724739,
0.1620085239,
0.3263848126,
-0.0729022622,
-0.4668150246,
-0.2270129919,
0.1389245093,
0.3100941777,
-0.1783636063,
0.0604968071,
-0.0893568248,
-0.3108633161,
-0.181017369,
-0.0770314187,
-0.2673084736,
-0.287168175,
0.1371498257,
0.1534990519,
0.3889503479,
-0.0441605598,
-0.1306313127,
0.2002461851,
-0.1349124312,
-0.6252068281,
-0.2875653505,
-0.199758023,
-0.2606436014,
-0.0950763226,
0.3254853785,
0.0230574273,
0.2869834006,
0.1275071204,
-0.0611163527,
0.0269518234,
-0.1847217977,
-0.1780192256,
-0.0560668856,
0.0458908677,
-0.1653113961,
0.1088286787,
0.1011013687,
-0.0643064156,
-0.1143612266,
-0.2909547687,
-0.1170978695,
0.1513293982,
0.056345243,
0.0040034689,
-0.0683839694,
-0.2516078949,
-0.3148126006,
-0.15047732,
0.3833205104,
0.108794421,
0.2968519032,
0.2592847347,
-0.2340591252,
0.1469754428,
-0.0350158885,
0.2759184539,
-0.0761278123,
-0.5129767656,
0.350300163,
0.2940908372,
-0.1150033176,
-0.2911066115,
-0.0380454212,
0.0169115905,
0.3138878047,
-0.6159598231,
0.1260199994,
-0.2694465518,
0.0532074124,
0.0712787509,
0.1264046729,
0.1990046054,
-0.1272868663,
0.003335122,
0.0518577322,
-0.0789892301,
0.1004399508,
-0.039975591,
0.4138160646,
0.0149865225,
0.383811146,
0.1389644593,
0.3002883792,
-0.012875095,
-0.2340248972,
0.3244724274,
-0.0722000748,
0.0982687473,
-0.1152263582,
-0.2440122962,
0.1621455997,
-0.0476999134,
-0.0663537011,
0.3179939985,
0.089809902,
-0.3988581896,
-0.3091432154,
-0.0631489828,
-0.1957660615,
-0.136297971,
0.1651678681,
-0.6871646643,
-0.1743243039,
0.0545406789,
-0.0697959065,
-0.3932174444,
-0.7218484879,
-0.020765394,
0.1602680683,
0.1696250439,
0.0409315228,
-0.1226968765,
0.0295515284,
-0.692135036,
0.1490306258,
-0.0126849823,
0.4575086534,
0.0520590991,
-0.2797317207,
0.2788510919,
-0.0648147017,
0.5138842463,
0.1857991666,
0.2223898768,
-0.0494023003,
-0.1781800091,
-0.1109014079,
0.021345146,
-0.0597059131,
0.378093183,
0.3873071373,
0.3428021669,
-0.3467059731,
-0.0881022364,
-0.0436139256,
0.2210693061,
-0.0473364815,
-0.4539403319,
-0.2696040869,
-0.0743694156,
0.1142492145,
0.101649642,
-0.2978038788,
0.1837016046,
-0.0865075588,
0.1280924678,
-0.1037399173,
-0.0666289106,
0.4675744772,
0.0323399454,
-0.1322667301,
0.0966259837,
0.3669078648,
-0.0064786393,
0.3931189477,
-0.4296891093,
0.3949213624,
0.0239032544,
0.0105796605,
-0.0990381986,
0.1186459213,
0.4053673744,
0.3409671187,
0.0405176654,
0.0353956968,
-0.1811347455,
0.0318176746,
-0.2229655087,
0.1625330001,
-0.0378647745,
-0.0219848752,
-0.5102128983,
-0.5090433359,
0.8361481428,
0.2132325172,
-0.0450973883,
0.4602643847,
-0.4012108743,
-0.114593558,
0.2288124859,
-0.4090598822,
0.949239254,
-0.5258204341,
0.2740207911,
-0.1382446289,
0.1987669319,
0.3029141724,
-0.2896645069,
0.1577134132,
-0.4653742313,
-0.2272248864,
0.2176132202,
-0.0880729258,
-0.1694071293,
0.4042202234,
-0.0187939107,
0.5223018527,
0.3355802,
-0.1276985556,
-0.119142741,
0.7030900717,
-0.0966557935,
-0.2274335027,
-0.5007802844,
0.0651401281,
-0.2991482615,
0.4341278076,
0.0483541563,
0.0430691428,
-0.1179756299,
-0.0800275579,
-0.119171828,
0.0320825651,
-0.0518310107,
0.1584092677,
-0.1272393167,
-0.2793522477,
-0.2007529289,
0.0972551033,
-0.1604654491,
0.1582181305,
-0.2683205009,
0.1277571172,
-0.190947473,
0.3028255701,
0.4667538404,
-0.0477818362,
0.1865968853,
-0.029219728,
0.0592970848,
-0.2993098497,
-0.0307526756,
-0.5082599521,
-0.1132694855,
0.1326717287,
-0.0908724815,
0.20935978,
-0.3355439007,
0.0894006118,
-0.1166322753,
-0.0095313415,
0.0105148619,
0.4713150859,
0.011035081,
0.3946826458,
0.1198589504,
-0.4577184618,
0.0139831696,
0.6548169255,
0.1030773222,
-0.3002205193,
0.3670871854,
-0.003452599,
-0.2541690469,
-0.0705166012,
0.0114985257,
-0.1161023676,
-0.0931540504,
0.0365681462,
0.1098339707,
0.0453584194,
0.177218169,
-0.1044460982,
-0.0608139858,
-0.0454835445,
0.0847872496,
-0.4854912758,
-0.3651287854,
0.1256479323,
-0.1291413754,
0.1117488742,
0.0809476823,
-0.1771703959,
0.0301035792,
0.3462207615,
-0.1482208967,
0.1582174599,
-0.0119689703,
0.1362666041,
0.0018275902,
-0.2031129599,
0.1816447377,
0.2041499019,
0.0785747468,
-0.1187302619,
0.0751339197,
-0.1347318143,
0.0523109138,
0.2363777459,
-0.0419002548,
0.1236181706,
-0.1118105575,
-0.2583376169,
-0.0259362422,
-0.2587970197,
-0.0806112438,
0.1525800824,
0.0079999585,
0.0056516938,
0.123237431,
-0.1154654473,
0.0744764358,
0.1694043875,
-0.4375491142,
0.2435422242,
0.0464202873,
0.0212837309,
0.1374114007,
0.0515323952,
-0.4352944195,
-0.0022800826,
0.2835550606,
0.0239233896,
0.1941171736,
-0.0768844783,
0.0947945714,
0.1372134984,
0.2040150166,
0.398796618,
0.0571498647,
-0.217107445,
0.297922641,
-0.0044384971,
-0.1637785882,
-0.10872145,
0.3308370709,
0.053217385,
-0.0702321008,
0.2231259048,
0.170467332,
0.093731001,
-0.329156816,
-0.3950853944,
0.4794485271,
-0.2557061613,
-0.2780346572,
0.101134181,
0.1931037605,
-0.1679012626,
0.3428746462,
0.0200516544,
-0.0306279883,
0.5752462745,
0.0383023135,
0.3500930667,
0.0070866831,
0.4169413149,
-0.5145716071,
-0.2326480299,
-0.1306803077,
0.5559143424,
0.0862724036,
0.1915239245,
-0.0950586051,
0.3922819495,
0.0793459862,
0.2054961622,
-0.1484100819,
0.0668455958,
0.0455489308,
-0.1972439736,
0.2582724392,
0.016517356,
-0.2255830616,
0.307974577,
-0.2795415819,
0.0819223076,
0.067757763,
0.4223353565,
0.1127078235,
-0.2670770884,
0.1762163937,
0.0826563388,
0.2910773158,
-0.2383850217,
0.1545838565,
-0.3150225282,
0.111170657,
0.2030688673,
-0.0098550394,
0.1738351136,
0.3405081034,
-0.0770234466,
-0.0338247046,
-0.1383886486,
-0.0386208147,
-0.3196222186,
0.5069380403,
0.1659613848,
0.3182133734,
0.0938623995,
-0.0991066471,
-0.0015629083,
0.2202218324,
0.0395088568,
-0.183870241,
-0.0821899921,
0.3029426336,
0.1810437441,
-0.2337033153,
-0.5124334097,
0.1375774145,
-0.299197942,
-0.4470987916,
0.2444851995,
0.016389858,
0.1501728147,
-0.0656901002,
0.0049451739,
-0.2424593866,
0.6127954125,
0.4382688403,
0.3431379199,
-0.2574044466,
-0.053863287,
-0.5327313542,
-0.159091413,
-0.1744762808,
0.0389421582,
-0.1136804149,
0.2780539393,
0.0476780199,
0.099218443,
-0.4922328591,
0.1934952438,
-0.0417267904,
-0.0505760051,
-0.3259425759,
-0.0463674553,
-0.2659068108,
0.0837055668,
-0.111372292,
-0.3223602176,
0.0128787402,
-0.1545254588,
-0.0974244699,
-0.1571797132,
0.0748673975,
0.0284899548,
0.4787157178,
0.3693392575,
-0.0176858623,
0.4180803001,
0.0488859192,
-0.0095493384,
-0.4613139927,
0.0245149471,
-0.1619865298,
0.2181370407,
0.0159045625,
0.0216300264,
-0.0940039605,
0.3084727526,
0.1127116382,
0.3075916767,
-0.319009006,
0.0016716719,
0.0568783283,
-0.3979631662,
-0.0713030845,
0.4347880781,
-0.0173201226,
0.4235860407,
-0.0750869364,
0.0365404785,
-0.4108154774,
0.0092511848,
0.2968646288,
-0.3253461719,
-0.010993056,
-0.1295345873,
0.3830999434,
-0.0136335753,
0.2570004463,
-0.3716463447,
-0.0795824826,
0.3040832877,
-0.0451909453,
-0.0746103898,
0.4503563643,
0.0829837918,
0.0080554932,
-0.0372339003,
0.0042992644,
0.0258768648,
-0.0560946353,
-0.3571117818,
-0.1623251587
] |
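The comment in the row above proposes a faster way to fingerprint the input data files than hashing their full contents. Below is a minimal sketch of one such approach using only the standard library: it hashes each file's path, size, and modification time instead of the data itself. The helper name `quick_fingerprint` and the choice of metadata are assumptions for illustration, not the hashing scheme the library actually uses.

```python
import hashlib
import os

def quick_fingerprint(paths):
    """Fingerprint a list of data files without reading their contents."""
    h = hashlib.sha256()
    for path in sorted(paths):
        stat = os.stat(path)
        # Hash only cheap metadata: path, size, and modification time.
        h.update(path.encode("utf-8"))
        h.update(str(stat.st_size).encode("utf-8"))
        h.update(str(int(stat.st_mtime)).encode("utf-8"))
    return h.hexdigest()
```

The trade-off is that a file rewritten in place with an unchanged size and timestamp would not change the fingerprint, which is why hashing the file contents remains the safer (but much slower) default.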
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | I believe this is a really important feature; otherwise, we will still have the problem of slow loading even if the data cache generation is fast. | I made a simple Python script to check the loading speed of the `nlp` library; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small, about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```python
import glob
import random

import nlp  # Hugging Face `nlp` library (now published as `datasets`)

# Collect the raw text files and shuffle them before loading.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing? | 29 | Very slow data loading on large dataset
I made a simple Python script to check the loading speed of the `nlp` library; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small, about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```python
import glob
import random

import nlp  # Hugging Face `nlp` library (now published as `datasets`)

# Collect the raw text files and shuffle them before loading.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing?
I believe this is a really important feature; otherwise, we will still have the problem of slow loading even if the data cache generation is fast. | [
-0.1708837748,
-0.1193050072,
-0.1088860929,
0.2128691077,
-0.0477885753,
0.0982000232,
0.1322444081,
0.4315572381,
0.3260000646,
-0.1955101341,
0.155040279,
0.2180577815,
-0.1044813842,
0.1303240955,
0.245982945,
0.0549581759,
0.0207398906,
0.2710025012,
0.0549229831,
-0.121584937,
0.0354799181,
-0.0033767112,
-0.2422819436,
-0.2905794084,
-0.2007864118,
0.0008391328,
-0.0093750581,
-0.0193936974,
-0.1012578756,
-0.2493917346,
-0.0168477818,
0.0840887874,
0.0404026061,
0.1795385629,
-0.0001215391,
-0.2880674303,
0.4791034162,
0.1190049946,
-0.2993824482,
-0.0663854331,
0.330889225,
-0.6440520287,
0.1856693923,
-0.1526858658,
-0.0242592506,
0.0225021467,
0.1224436909,
-0.1704469025,
-0.0260150433,
0.0512876399,
0.0980151147,
-0.0761278644,
-0.2565415502,
0.2432713509,
0.1302579343,
0.2178976238,
-0.0313269943,
0.2933129072,
0.4627693295,
-0.203504473,
-0.2461478561,
0.0658882707,
-0.1829820722,
0.0979476422,
0.2545048892,
0.0872410759,
-0.1545422971,
-0.0985599384,
0.0784102529,
0.3017696738,
0.6385812759,
-0.1194593906,
-0.0844114572,
-0.5116708279,
0.1060033739,
-0.2073560357,
0.1937393546,
0.2269616723,
-0.2527630329,
-0.23120749,
-0.3389866054,
-0.3271774948,
-0.0391764194,
0.2343060672,
0.1697994322,
0.0866004154,
-0.0181808472,
0.1299438924,
0.1896475554,
0.1051436365,
0.0536866784,
-0.1221143603,
0.2563978732,
0.3801944256,
-0.3880558014,
-0.0825360641,
0.0302408524,
0.2858529389,
0.1697466671,
0.1421571821,
0.12059623,
-0.0010590833,
-0.1231783777,
0.0225767102,
0.2663189769,
0.3897804022,
-0.1854934692,
-0.1530640125,
0.4266659617,
-0.1951058954,
-0.1227802932,
0.2137536705,
-0.1022173762,
-0.0929642096,
0.1184924096,
-0.360244602,
-0.2023614049,
-0.3070412874,
-0.0508142859,
-0.0440403484,
0.0419511348,
-0.2672596872,
0.0199075826,
0.1974841654,
-0.378993094,
0.2880629301,
0.0326743498,
-0.0822568387,
-0.3433534205,
0.0434472226,
-0.132945925,
0.0192975104,
-0.1289639771,
0.2376375645,
0.5631649494,
-0.0453245267,
0.1705596298,
0.0663461536,
-0.1947801113,
-0.0836632326,
-0.1964969337,
-0.3833451867,
-0.0816074759,
-0.0069684889,
-0.0666888803,
0.2241401076,
0.0425944403,
0.3508837223,
-0.4392553568,
0.180621922,
-0.3687040806,
-0.1353670657,
0.0462308601,
0.1133789644,
-0.4325674772,
-0.3168878555,
-0.2426805496,
0.3940674961,
-0.0099961832,
-0.2400590479,
-0.1738304049,
-0.0357681513,
-0.2878267467,
-0.0154791996,
0.0938033313,
0.2874273658,
-0.0541601777,
-0.0180748403,
-0.0591333881,
0.2659283578,
0.3198642135,
0.4956217408,
-0.1973637193,
0.2581643462,
-0.2065196335,
-0.0021843463,
0.5175064802,
0.0865022093,
-0.4842510819,
0.5351499915,
-0.2245028913,
0.0454427227,
0.278603822,
0.4554196596,
0.0155419298,
-0.0804585516,
0.226241678,
0.6157298684,
0.1381958127,
0.2069052756,
-0.4678856432,
-0.1348219216,
0.3957820535,
0.4141777754,
-0.068976745,
0.0771700665,
0.0468458273,
0.126720801,
0.4656169116,
0.1988312304,
-0.0619081371,
0.2839721441,
-0.2368705124,
0.0469791777,
0.0830768198,
0.1733318418,
-0.2105590999,
0.1638223529,
0.1554176062,
0.0450229347,
0.3792930245,
0.1307998002,
-0.1590067297,
-0.2118353546,
0.1023089737,
0.2358946502,
-0.027560873,
-0.055442486,
0.0593646169,
0.2304656208,
-0.2147922516,
0.44035393,
-0.1765073538,
-0.2186828107,
-0.3693211675,
-0.0035077706,
0.2707009614,
-0.0495322011,
0.2977484465,
0.3167924881,
-0.1896386892,
0.0506396815,
-0.1310395002,
-0.0754614472,
-0.1972572207,
0.1816371083,
-0.0243194234,
0.0862565339,
0.0342188999,
0.001927875,
0.3978551328,
0.198873803,
0.2990909815,
-0.4056384265,
-0.2126012743,
0.3579398692,
-0.1928495616,
0.2532293797,
0.1134450138,
-0.4938903153,
0.1087488458,
-0.1259760708,
0.1698672175,
0.2656655312,
0.9038988948,
0.0348598361,
0.6662998199,
0.2545959651,
-0.1685733646,
0.0134952441,
0.4577054381,
-0.0228780769,
-0.0987980515,
0.4839092493,
-0.06151741,
-0.3843646944,
0.0158466715,
-0.1110041514,
0.4429907799,
0.2537377477,
0.088620171,
0.0565744787,
-0.0263063014,
-0.2527050376,
0.1614327133,
-0.1258852631,
0.2274751961,
0.2149233222,
0.3197238445,
0.1339853704,
-0.4487345517,
-0.2647604644,
-0.0168763623,
0.3032931089,
-0.079426758,
-0.0173538104,
-0.0245068781,
-0.3835328519,
-0.1481561065,
0.0173824094,
-0.2176358849,
-0.2085160613,
0.0413725451,
0.0394825526,
0.4439233541,
-0.1601713896,
-0.0068604723,
0.1897864342,
-0.2549464107,
-0.5058602691,
-0.3831741214,
-0.1712241471,
-0.3200806975,
-0.0412952825,
0.2373733222,
0.1959141195,
0.3190256655,
0.1532439888,
-0.173855722,
0.1706150472,
-0.1240505874,
-0.191141516,
-0.0086412337,
-0.0059723258,
-0.1851427704,
0.1018376276,
0.2547338605,
-0.0169272497,
-0.0134267621,
-0.3910280168,
-0.0591800138,
0.1095029563,
-0.0506861769,
-0.0129670873,
-0.0616960898,
-0.2215944678,
-0.3127225935,
-0.2671325207,
0.4247217476,
0.0532285869,
0.1396273226,
0.1574801058,
-0.2318087816,
0.2191642523,
0.0809981301,
0.2858387232,
-0.0404095389,
-0.2382850349,
0.2692322731,
0.2617773712,
-0.1944067627,
-0.2885739207,
0.0042180344,
0.109098345,
0.0836813599,
-0.7654354572,
0.3025309145,
-0.4086547196,
-0.0389933735,
-0.1383833587,
0.1694160104,
0.2052452564,
-0.2655717731,
-0.006838277,
0.2257290483,
-0.1746085584,
-0.0823598653,
-0.0131538631,
0.2187668383,
-0.1661927998,
0.3839708269,
-0.0843011513,
0.0593582243,
0.0653735772,
-0.0612385236,
0.2289561033,
-0.0226928685,
0.1704554856,
-0.2278117239,
-0.2003524154,
0.2423759252,
-0.147745505,
-0.1537100226,
0.2911826372,
0.0405593663,
-0.5236683488,
-0.2941626608,
-0.1583214551,
-0.0664447248,
-0.0404028147,
0.1588534713,
-0.3776308596,
-0.1067282408,
0.0405201167,
-0.1254661381,
-0.2737347484,
-0.7370407581,
-0.114079535,
0.1241321862,
0.1936887503,
0.0266794115,
-0.1326934844,
-0.0473711416,
-0.7960600853,
0.0488158129,
-0.1576574594,
0.4344092906,
0.0749644935,
-0.1837303936,
0.3018524647,
-0.0890743732,
0.3479285538,
0.291341126,
0.177875787,
0.1246939674,
-0.1579188257,
-0.3158030212,
-0.0321157686,
-0.0669330731,
0.2634797692,
0.4881893694,
0.3126024008,
-0.2156244516,
-0.1253402829,
0.0418461785,
0.3029633164,
-0.0100267529,
-0.7628207207,
-0.3536113203,
-0.0647105277,
0.0765331537,
0.0546411201,
-0.2676482201,
0.148243621,
-0.1760106832,
-0.0063911006,
-0.0748460963,
-0.0502959602,
0.2330686748,
-0.0276346765,
-0.2499347925,
0.0962874293,
0.3787206411,
-0.0933474451,
0.2527825236,
-0.5406364202,
0.4283749759,
0.2015637904,
0.1906181723,
0.0158230178,
0.075872153,
0.4232341945,
0.2755138278,
0.0652987957,
0.0898812711,
-0.2765950561,
0.1825898886,
-0.2460806519,
0.0451996587,
0.0425411612,
-0.05362463,
-0.4035273492,
-0.4539191723,
0.9484815001,
0.1471526921,
0.0550568253,
0.3936357498,
-0.4269385338,
0.0688716322,
0.1636858881,
-0.3242273033,
0.793556571,
-0.4691548944,
0.3224420547,
-0.1204103008,
0.1067989394,
0.5137324929,
-0.1806550324,
0.0926416367,
-0.410043329,
0.0047127232,
0.2545188963,
-0.0279028565,
-0.1416480988,
0.2805438042,
-0.1056548879,
0.4864405394,
0.3472345471,
-0.3024031818,
-0.1157681644,
0.6819108725,
-0.1476732939,
-0.2155459374,
-0.5637031794,
0.0641233698,
-0.2749717832,
0.4820743799,
0.0241857618,
0.1429628879,
-0.0827579722,
-0.0242400393,
-0.2062790096,
0.1470733583,
-0.0777716413,
-0.0442314819,
-0.0703180581,
-0.3486213684,
0.1795980483,
0.1700845361,
-0.2536794245,
0.2336210907,
-0.2937817276,
0.0982649773,
-0.3420512676,
0.2043761313,
0.4629024267,
0.0172062255,
0.1355713904,
-0.2207511365,
0.0362211764,
-0.18737638,
-0.1038952246,
-0.2375707775,
-0.0763848275,
0.1820120811,
0.0744049102,
0.1673855484,
-0.3001608253,
0.2901706696,
0.0506366417,
-0.0320310406,
0.033294145,
0.442712307,
0.0551372543,
0.3508200347,
0.0021835486,
-0.4148608446,
-0.0583072044,
0.5324238539,
0.1278884411,
-0.199359566,
0.4141848087,
0.1323915273,
-0.1521715969,
-0.1286332309,
-0.0661163777,
-0.0891607553,
-0.1918782592,
0.0175550207,
0.0948009714,
0.0830993056,
0.1859889328,
-0.0450850055,
-0.0991523266,
-0.1269468516,
0.1224897578,
-0.4464527965,
-0.354043901,
-0.0039196238,
-0.1062921137,
-0.0885117948,
0.0354365781,
-0.0504938774,
0.2558650076,
0.3383281231,
-0.1661080867,
0.1253556311,
0.061026752,
0.0751440227,
0.0404820889,
-0.2260235846,
0.0345961601,
0.2679466605,
0.0642320365,
-0.1063543707,
0.154424578,
-0.1853527725,
0.015696764,
0.1970502287,
-0.0105964523,
0.1639120281,
0.011531312,
-0.2834030986,
-0.0786419213,
-0.2003030479,
-0.091579929,
-0.0367644243,
-0.0581975207,
-0.02550596,
0.0380094871,
0.0103582907,
-0.0642761737,
0.3120420575,
-0.3961381912,
0.1906912923,
0.0510311574,
0.0342136323,
0.3081757128,
0.104348734,
-0.5492599607,
-0.1659026742,
0.1145076156,
0.0709212348,
0.186665833,
-0.1495614648,
0.1903333962,
0.0521306656,
0.1587468535,
0.5096777678,
0.0740132183,
-0.252969861,
0.2039008588,
0.0371940881,
-0.2015985698,
-0.0274778809,
0.2153614163,
0.1706752777,
-0.1063664407,
0.1950930655,
0.2131562233,
0.0300225616,
-0.2574486434,
-0.3286288381,
0.4000744224,
-0.2123186886,
-0.2883194089,
0.0732744187,
0.2181685418,
-0.1305532455,
0.2378869951,
0.1110630631,
0.0133484192,
0.3842388391,
0.2560364008,
0.3813842237,
0.0927957967,
0.4703731835,
-0.400837183,
-0.2251327336,
-0.1482992172,
0.5131302476,
0.0866047591,
0.1347491145,
-0.1501099318,
0.245480448,
-0.0429762416,
0.1373547167,
-0.1762051582,
-0.0240673777,
-0.0081065986,
-0.0769456476,
0.2742761374,
0.092918098,
-0.1402641237,
0.1897151917,
-0.3039066195,
0.0690492317,
-0.0252105519,
0.3115074933,
-0.0612560026,
-0.420152843,
0.1340336055,
0.063037619,
0.2313793451,
-0.2374030203,
0.0885191634,
-0.380161196,
0.0390043221,
0.0805716068,
0.1637851,
0.2026796639,
0.391623795,
-0.0344125554,
0.0159600675,
-0.1590323597,
-0.0349933356,
-0.2816368341,
0.4697638154,
0.3143093288,
0.3289675117,
0.0662653446,
-0.0791720971,
-0.0623050854,
0.2926388383,
-0.0918341875,
-0.2164209783,
-0.1209675819,
0.2844501734,
0.2031005621,
-0.3142254353,
-0.5752782226,
0.1504357457,
-0.2764352262,
-0.1777212918,
0.159537524,
-0.0912328511,
-0.0382675529,
-0.0871382877,
0.0184831098,
-0.31988886,
0.6624279022,
0.4286261499,
0.4453489482,
-0.1340379715,
-0.2282948792,
-0.5174899101,
-0.0722062066,
-0.2385066599,
-0.0622383654,
0.135601908,
0.2646825016,
-0.0000901967,
0.1179214269,
-0.4758052528,
0.1493464708,
-0.1230767742,
-0.1929817796,
-0.3171252906,
-0.0792830884,
-0.2710913122,
0.1241759658,
-0.1513264924,
-0.2715727091,
0.0279223323,
-0.2592737675,
-0.1142923459,
-0.193705678,
0.1246504486,
-0.0440998375,
0.5869351029,
0.3407253325,
-0.0067028683,
0.3802831769,
0.0303097442,
-0.0634887069,
-0.4289751053,
0.0660753772,
-0.1503824145,
0.186299935,
0.1481111199,
-0.0619395226,
-0.0912792087,
0.3369535208,
0.1819932461,
0.2498592287,
-0.2586730719,
0.0191847607,
0.0775856823,
-0.3091538548,
-0.0328679457,
0.3581902087,
0.0256291144,
0.3765019178,
-0.0346233994,
-0.1105372012,
-0.4577967525,
0.0815936774,
0.250017643,
-0.2777262926,
-0.0528307036,
-0.2340159565,
0.3957238495,
0.062895,
0.2202778757,
-0.3804605901,
-0.0504569039,
0.3263396323,
0.1096776724,
0.0260347575,
0.5247814655,
-0.0093300734,
-0.1259950548,
-0.0275591314,
0.2195759267,
-0.1400581449,
-0.0614504702,
-0.3122715354,
-0.2382737994
] |
https://github.com/huggingface/datasets/issues/546 | Very slow data loading on large dataset | Hmm, ok, then maybe it's the hashing step indeed.
Let's see if we can improve this as well.
(You will very likely have to regenerate your dataset if we change this part of the lib, though, since I expect modifications to this part of the lib to result in new hashes.) | I made a simple Python script to check the loading speed of the `nlp` library; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small, about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```python
import glob
import random

import nlp  # Hugging Face `nlp` library (now published as `datasets`)

# Collect the raw text files and shuffle them before loading.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing? | 51 | Very slow data loading on large dataset
I made a simple Python script to check the loading speed of the `nlp` library; it loads 1.1 TB of textual data.
It has been 8 hours and it is still on the loading step.
It does work when the text dataset is small, about 1 GB, but it doesn't scale.
It also uses a single thread during the data loading step.
```python
import glob
import random

import nlp  # Hugging Face `nlp` library (now published as `datasets`)

# Collect the raw text files and shuffle them before loading.
train_files = glob.glob("xxx/*.txt", recursive=True)
random.shuffle(train_files)
print(train_files)

dataset = nlp.load_dataset('text',
                           data_files=train_files,
                           name="customDataset",
                           version="1.0.0",
                           cache_dir="xxx/nlp")
```
Is there something that I am missing?
Hmm, ok, then maybe it's the hashing step indeed.
Let's see if we can improve this as well.
(You will very likely have to regenerate your dataset if we change this part of the lib, though, since I expect modifications to this part of the lib to result in new hashes.) | [
-0.1601838171,
-0.1073321849,
-0.1088036448,
0.2356132716,
0.0101085212,
0.0855074227,
0.096739009,
0.417462945,
0.240922451,
-0.2387265712,
0.1788457632,
0.2212671936,
-0.1573328972,
0.2012870014,
0.2440308928,
0.0954288766,
-0.0814830512,
0.2566320598,
-0.0297539569,
-0.1509747356,
0.0977720991,
-0.0070640091,
-0.2352126092,
-0.3177329898,
-0.1379802972,
-0.0537618399,
-0.0029597357,
0.118151173,
-0.1523027271,
-0.2356421202,
-0.0744614825,
0.0946117118,
-0.033706408,
0.1301775426,
-0.000117367,
-0.2543480098,
0.4368431568,
0.1030025259,
-0.1783036292,
-0.0722184628,
0.2452518493,
-0.6293507814,
0.156346187,
-0.1135753021,
0.010013774,
0.0221236348,
0.030623382,
-0.0951650366,
0.0151712969,
0.0258702748,
0.1248129457,
-0.1576096565,
-0.2547903061,
0.3403289914,
0.1736138761,
0.1329303086,
-0.03484248,
0.4113651812,
0.3614936173,
-0.1394691765,
-0.244242996,
0.0927666277,
-0.1439821422,
0.0500797182,
0.2993364036,
0.0624084584,
-0.0894315615,
-0.0703955963,
0.0655163825,
0.3119707704,
0.5350314975,
-0.1155159622,
-0.0738630891,
-0.3971730471,
0.0449682996,
-0.1798521727,
0.2664927542,
0.1910125464,
-0.1697355807,
-0.2910932899,
-0.3915232122,
-0.3189975619,
-0.0052018613,
0.2116541862,
0.2382813543,
0.0413415655,
0.0166871995,
0.1217722893,
0.3245378733,
-0.0493632145,
-0.0690266043,
-0.0392776839,
0.1879110634,
0.382935971,
-0.3432563543,
-0.0498865657,
0.0962747037,
0.2656584084,
0.0248608813,
0.1510503888,
0.1117144674,
0.1768586934,
-0.1563509107,
0.0063164625,
0.20690763,
0.2833957672,
-0.0738159642,
-0.2024074793,
0.3430437446,
-0.1749478132,
-0.1971483827,
0.2683000565,
-0.1556073427,
-0.2186596245,
-0.0255657546,
-0.3425050974,
-0.292080313,
-0.3515900373,
0.0043010414,
-0.0825538337,
-0.058710821,
-0.2975077033,
0.1371727139,
0.2337218672,
-0.4289889336,
0.1917136014,
-0.0391213037,
-0.2029282004,
-0.3366801441,
0.098421596,
-0.1477579921,
0.0849541649,
-0.2443003953,
0.2111697942,
0.5522872806,
-0.1529752761,
0.2215204239,
0.1284428388,
-0.1704707295,
-0.0670166537,
-0.2042764276,
-0.2548727095,
-0.0597634912,
-0.0222389325,
-0.0546883196,
0.2033080161,
-0.0341567844,
0.3675208092,
-0.4234897196,
0.1777502298,
-0.350512594,
-0.1370329857,
0.0521857068,
0.1265483052,
-0.4527390897,
-0.2849043906,
-0.1414554119,
0.3504906893,
-0.0039894916,
-0.2377238274,
-0.2531364858,
-0.0554262251,
-0.1817229688,
-0.0687222853,
0.1630558968,
0.211037159,
-0.1411569864,
-0.0157528594,
-0.0991911516,
0.3512481153,
0.3494646251,
0.5109769106,
-0.2431334853,
0.224211663,
-0.1391342878,
0.0753153712,
0.5104478002,
0.1495015621,
-0.48148036,
0.533419311,
-0.186439842,
0.0663993508,
0.1667494774,
0.4503006041,
-0.0154087245,
-0.068640545,
0.1876240373,
0.657384336,
0.186505869,
0.205564037,
-0.5292838812,
-0.1271953136,
0.3831863105,
0.3882477283,
-0.1265486777,
-0.0448078476,
0.0644435659,
0.2115143985,
0.4432806671,
0.1463390589,
-0.0802838802,
0.3024357557,
-0.1130207628,
-0.0086916909,
0.0369792879,
0.1256116927,
-0.0781992525,
0.1612612307,
0.1250466853,
0.1854241937,
0.2704237103,
0.1531008482,
-0.13746351,
-0.3148971498,
0.0726236254,
0.1942131817,
-0.0297197327,
-0.0767293423,
-0.0417795181,
0.1080716923,
-0.1495257169,
0.2906082273,
-0.2226115912,
-0.1651923656,
-0.4035985172,
-0.0779209733,
0.2123476565,
-0.1582202464,
0.3385807276,
0.328904748,
-0.1488136202,
-0.0629238039,
-0.1064999402,
-0.0856107697,
-0.3094916344,
0.220008567,
0.0904116407,
0.1272015721,
0.0426448435,
-0.0571575724,
0.4239549637,
0.1180785075,
0.1862311959,
-0.4027867913,
-0.2185849696,
0.4604167938,
-0.2816231251,
0.1958812177,
0.0863028392,
-0.4638468027,
0.0657614544,
-0.1463586241,
0.1708153486,
0.2818573713,
0.8769373298,
0.0479908176,
0.609596014,
0.3669525981,
-0.0704370141,
0.0417423807,
0.5290091038,
0.0008990094,
-0.2038211524,
0.5212618709,
-0.0849339068,
-0.4068972468,
-0.0294886865,
-0.0923201591,
0.3518697619,
0.2258282304,
0.0709660798,
0.1058856249,
-0.0567709543,
-0.2607475519,
0.1747760028,
-0.204048723,
0.200416252,
0.1718320101,
0.2445514351,
0.1183080748,
-0.4640567303,
-0.1943508089,
0.0368422493,
0.4481580555,
-0.1012152135,
-0.0404051952,
-0.1280094534,
-0.2470642328,
-0.2019759268,
-0.0580908433,
-0.3640303016,
-0.2398006618,
-0.0023722518,
0.0400865413,
0.3839772344,
-0.1228240505,
0.0325510949,
0.0749538317,
-0.1560161412,
-0.5239127874,
-0.27545771,
-0.1642967314,
-0.3267018795,
-0.0128084943,
0.3502833247,
0.2617179155,
0.2895051539,
0.1520952582,
-0.2226362824,
0.2321970612,
-0.1983886808,
-0.2286718339,
-0.0677989572,
0.0007591993,
-0.1222772002,
0.1360550225,
0.1543964893,
-0.0482218377,
0.079474315,
-0.3189549744,
-0.0575905293,
0.1524786055,
-0.0264737047,
0.0079265088,
-0.0531881452,
-0.336625725,
-0.1873837411,
-0.2386596948,
0.5235025287,
0.0800587088,
0.240028441,
0.2175988853,
-0.1314043701,
0.238930136,
0.0015133247,
0.2637148499,
0.0267374329,
-0.3181769848,
0.3042293787,
0.3015607595,
-0.1725574136,
-0.1673832536,
0.0746444613,
0.1149823964,
0.1365851611,
-0.7411954999,
0.2916589379,
-0.3721138835,
-0.0532524064,
-0.0403816514,
0.1372642666,
0.1612441689,
-0.1977793425,
-0.0433081761,
0.2443806231,
-0.2286247015,
0.0600733981,
-0.0032307087,
0.2971611023,
-0.1220103949,
0.4379103482,
-0.0273350589,
0.1570204794,
0.1014794558,
-0.1560006738,
0.1715013236,
0.0441329554,
0.0634029657,
-0.1760017574,
-0.2444693744,
0.2238002568,
-0.0720517188,
-0.0700966716,
0.346693784,
0.1279875636,
-0.4035212994,
-0.2299664617,
-0.1928105056,
-0.1343053877,
-0.0957185328,
0.2891785502,
-0.573512733,
-0.1390633434,
-0.0276081562,
-0.1219026372,
-0.2293454707,
-0.7107152939,
-0.0334072337,
0.1148638576,
0.1338198036,
0.035589695,
-0.1537472606,
0.0088324323,
-0.6964544058,
0.0708510056,
-0.1736066639,
0.4671595395,
0.0477878749,
-0.1481667459,
0.3559192121,
-0.1329544634,
0.3270042241,
0.2354598194,
0.1489394903,
0.1138511747,
-0.1929854751,
-0.2955777645,
-0.0183351487,
-0.1081783399,
0.2827037573,
0.5091446638,
0.2968721986,
-0.2459940314,
-0.1142120734,
0.0738170594,
0.2307706177,
-0.0188425928,
-0.5913385749,
-0.299145788,
-0.1378331333,
0.0352211371,
0.041716598,
-0.4061603844,
0.190084219,
-0.1016410291,
0.0375569835,
0.0361320153,
-0.0489721,
0.3762349188,
0.0298864134,
-0.2006280124,
0.1272472143,
0.4202657342,
-0.1220817417,
0.2732322216,
-0.5304991007,
0.4065214097,
0.2383785397,
0.2210580856,
-0.1334272325,
0.0646390393,
0.4113093615,
0.2798767686,
0.0700516999,
0.1080987006,
-0.2257952094,
0.1607619524,
-0.1429776847,
0.1101179719,
0.1430761814,
0.0137173757,
-0.4057681262,
-0.4632482827,
0.8867707849,
0.127292797,
0.0569166541,
0.448053956,
-0.4334321618,
0.0480899811,
0.1586893201,
-0.2592349648,
0.8431885839,
-0.4389933646,
0.1994745135,
-0.1670081019,
0.1202711761,
0.3556025326,
-0.1527600735,
0.023037225,
-0.5007896423,
0.0173591077,
0.2982679605,
-0.0431022905,
-0.0816728622,
0.2605400681,
-0.087430656,
0.3969624937,
0.3509871662,
-0.2249670476,
-0.0968428254,
0.726824224,
-0.1192473471,
-0.2146690339,
-0.5672606826,
0.1038514674,
-0.3063290119,
0.4123210907,
0.0262243003,
-0.0132768489,
-0.0762002319,
0.0199680403,
-0.2648591995,
0.155310601,
0.0441247635,
-0.0842339247,
-0.0550619476,
-0.2805173397,
0.1886935681,
0.0249194652,
-0.2194395661,
0.2818991244,
-0.2780000567,
0.1446069479,
-0.4240194261,
0.162023142,
0.5813891888,
0.0342738442,
0.1667532921,
-0.1988770366,
-0.0346665978,
-0.2172164619,
-0.0073399469,
-0.3404228687,
-0.0938075483,
0.1256665587,
0.0258527398,
0.1218632609,
-0.3384748697,
0.2759361267,
0.0006479397,
-0.0423351265,
0.0588211752,
0.4667868316,
-0.0157505125,
0.4426683187,
0.0792734921,
-0.3779085875,
-0.0581216514,
0.53860116,
0.1215892881,
-0.1944125295,
0.3733448684,
0.2447125018,
-0.1593824327,
-0.1300232708,
-0.0689209849,
-0.1411023736,
-0.107848309,
0.0062504821,
0.0818996057,
-0.0025037974,
0.2305916846,
-0.0075667109,
-0.0698208809,
-0.0405269749,
0.0405031294,
-0.4410701692,
-0.3163053989,
0.0238049999,
-0.034541294,
0.0053884066,
-0.1201700121,
0.0403419025,
0.2122024596,
0.3924252391,
-0.2014087588,
0.1785957217,
0.045634903,
0.1470191479,
0.0360876061,
-0.2837755084,
0.0253577977,
0.198389858,
0.0944820121,
-0.0428135172,
0.1105439961,
-0.197113961,
0.0569721088,
0.1666143239,
-0.0283109285,
0.0568087399,
-0.0241399594,
-0.2199424803,
-0.1199151129,
-0.1984606683,
-0.1657455266,
-0.0402070917,
-0.0369199216,
-0.0040501058,
0.08126086,
0.0007807184,
-0.0361268967,
0.2397198379,
-0.4830043018,
0.1856576204,
0.0966673046,
-0.0207793061,
0.2858250737,
0.0304302424,
-0.4567317367,
-0.1013531089,
0.123787269,
0.105180271,
0.219739005,
-0.1102242321,
0.1212706864,
0.0106820911,
0.3315261602,
0.523565352,
0.1338112354,
-0.2428765297,
0.2041600943,
0.0784934089,
-0.2715135217,
-0.0141106397,
0.2915894985,
0.1058766991,
-0.1336044371,
0.1146292537,
0.1202778816,
0.0585415214,
-0.3464826345,
-0.3014537394,
0.3380941451,
-0.2441133857,
-0.2979714572,
0.0643427446,
0.0163055956,
-0.2179079354,
0.290007323,
0.077881664,
-0.0167018734,
0.4894798994,
0.1676049381,
0.342122525,
0.0653250664,
0.5332940817,
-0.4446782172,
-0.2065306455,
-0.1768014431,
0.5822595358,
0.2081187218,
0.0829705447,
-0.0738841891,
0.3838200569,
0.027776204,
-0.007232707,
-0.1711735725,
0.1092460155,
-0.0463983007,
-0.1083680019,
0.1905538738,
0.0903562605,
-0.2495438755,
0.2571472824,
-0.2787745893,
0.0264758468,
0.1017141938,
0.3117563426,
-0.0405871421,
-0.4505378902,
0.1253462285,
-0.0486574881,
0.2256066799,
-0.2179589868,
0.1314531565,
-0.2777086794,
0.0332336053,
0.0723270029,
0.1033720374,
0.1764123142,
0.3884590566,
-0.0186528638,
0.0564334393,
-0.1125285625,
0.0122590438,
-0.2258787006,
0.4924060702,
0.2195759267,
0.454197377,
0.040157143,
-0.0623214841,
-0.1163429469,
0.1922285408,
-0.1800903529,
-0.1461564898,
-0.0361530036,
0.3346708417,
0.2543666065,
-0.3017177582,
-0.5345796347,
0.1077116951,
-0.2916919589,
-0.4246153831,
0.1893159896,
-0.0572940148,
-0.0502089821,
-0.1277998537,
0.0379531197,
-0.295509398,
0.7509640455,
0.425160706,
0.3682782948,
-0.1710945964,
-0.199972719,
-0.5557055473,
-0.0755977929,
-0.1810999513,
-0.0966885984,
0.0442205556,
0.3024515808,
0.0710491836,
0.0783823058,
-0.4742081761,
0.233353287,
-0.066232726,
-0.1416825056,
-0.2677324414,
0.0943395346,
-0.3391098976,
0.0613313913,
-0.1751962304,
-0.1982196569,
0.1026211679,
-0.2987849712,
-0.0779154673,
-0.1858753562,
-0.0222113729,
0.0004868656,
0.4985758066,
0.3433731496,
-0.034530744,
0.4027953446,
0.0035729036,
-0.0944422856,
-0.4025662541,
0.0502538309,
-0.1398016363,
0.2345334291,
0.0715539455,
-0.004217945,
0.0414006189,
0.4242456257,
0.0843151063,
0.2722783387,
-0.3205939531,
0.0311021358,
-0.0058251992,
-0.3514072299,
0.011023581,
0.3562556207,
-0.038477201,
0.4106631577,
-0.0319842882,
-0.1470575929,
-0.531272769,
0.0306692794,
0.3248063624,
-0.3233672678,
-0.0022172779,
-0.1551622003,
0.3223171532,
0.0688253567,
0.2539072335,
-0.4685245752,
-0.0492622107,
0.2416203916,
0.0816298574,
-0.0064781904,
0.5046189427,
-0.0128519014,
-0.093412593,
-0.0103992596,
0.0301524121,
-0.0991647094,
-0.1319020987,
-0.3541837931,
-0.1793479621
] |
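The last comment in the row above notes that changing how the library hashes the input files would produce new hashes and force the dataset cache to be rebuilt. The toy sketch below illustrates why: if the cache directory name is derived from a fingerprint of the files, a different hashing scheme yields a different directory, so a previously built cache is simply never looked up again. The directory layout and the `scheme` argument are assumptions for illustration, not the library's real cache logic.

```python
import hashlib
from pathlib import Path

def cache_dir_for(files, base="xxx/nlp", scheme="v1"):
    """Resolve a hypothetical cache directory keyed on a file fingerprint."""
    # `scheme` stands in for the version of the hashing logic in the library.
    h = hashlib.sha256(scheme.encode("utf-8"))
    for name in sorted(files):
        h.update(name.encode("utf-8"))
    return Path(base) / "text" / h.hexdigest()[:16]

old_dir = cache_dir_for(["a.txt", "b.txt"], scheme="v1")
new_dir = cache_dir_for(["a.txt", "b.txt"], scheme="v2")
assert old_dir != new_dir  # same data files, different hashing scheme -> cache rebuilt
```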